WorldWideScience

Sample records for specifically computer development

  1. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn

    2009-01-01

    An increased interest in the notion of place has evolved in interaction design based on the proliferation of wireless infrastructures, developments in digital media, and a ‘spatial turn’ in computing. In this article, place-specific computing is suggested as a genre of interaction design that addresses the shaping of interactions among people, place-specific resources and global socio-technical networks, mediated by digital technology, and influenced by the structuring conditions of place. The theoretical grounding for place-specific computing is located in the meeting between conceptions of place in human geography and recent research in interaction design focusing on embodied interaction. Central themes in this grounding revolve around place and its relation to embodiment and practice, as well as the social, cultural and material aspects conditioning the enactment of place. Selected...

  2. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn’ in computing provide the base for place-specific computing as a suggested new genre of interaction design. In the REcult project, place-specific computing is explored through design-oriented research. This article reports six pilot studies in which design students designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments for place-specific computing as a genre of interaction design are described. A total of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented, indicating potentials, possibilities and problems as directions for future...

  3. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  4. Planning is not sufficient - Reliable computers need good requirements specifications

    International Nuclear Information System (INIS)

    Matras, J.R.

    1992-01-01

    Computer system reliability is the assurance that a computer system will perform its functions when required to do so. To ensure such reliability, it is important to plan the activities needed for computer system development. These development activities, in turn, require a Computer Quality Assurance Plan (CQAP) that provides the following: a Configuration Management Plan, a Verification and Validation (V and V) Plan, documentation requirements, a defined life cycle, review requirements, and organizational responsibilities. These items are necessary for system reliability; ultimately, however, they are not enough. Development of a reliable system is dependent on the requirements specification. This paper discusses how to use existing industry standards to develop a CQAP. In particular, the paper emphasizes the importance of the requirements specification and of methods for establishing reliability goals. The paper also describes how the revision of ANSI/IEEE-ANS-7-4.3.2, Application Criteria for Digital Computer Systems of Nuclear Power Generating Stations, has addressed these issues.

  5. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    Directory of Open Access Journals (Sweden)

    Pirouz Nourian

    2018-03-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms, as well as many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential and applicability in urban planning and urban data analytics. The review is based not only on technical factors, such as the capabilities of the programming languages, but also on the ease of developing and sharing complex data processing workflows. The arena of web-based computing platforms is currently under rapid development and is too volatile to be predictable; therefore, in this article we focus on the specification of the requirements and potentials from an urban planning point of view rather than speculating about the fate of computing platforms or programming languages. The article presents a list of promising computing technologies, a technical specification of the essential data models and operators for geo-spatial data processing, and mathematical models for an ideal urban computing platform.
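
    As a minimal illustration of the kind of geo-spatial operators such a platform must expose, the sketch below uses Python with the shapely library; the geometries and the walking-range scenario are invented for illustration, not taken from the article.

        from shapely.geometry import Point, box

        # Two essential geo-spatial operators for urban analytics:
        # buffering (service areas) and overlay (intersection of zones).
        service_area = Point(2.0, 3.0).buffer(0.5)   # hypothetical walking range
        district = box(0.0, 0.0, 4.0, 4.0)           # hypothetical planning zone
        served = district.intersection(service_area)
        print(served.area / district.area)           # share of the zone covered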

  6. Computational biomechanics for medicine fundamental science and patient-specific applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2014-01-01

    One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This latest installment comprises nine of the latest developments in both fundamental science and patient-specific applications, from researchers in Australia, New Zealand, USA, UK, France, Ireland, and China. Some of the interesting topics discussed are: cellular mechanics; tumor growth and modeling; medical image analysis; and both patient-specific fluid dynamics and solid mechanics simulations.

  7. Specifics of computer discourse translation from English into Russian

    African Journals Online (AJOL)

    Specifics of computer discourse translation from English into Russian. ... of further development of science and technology in Russia and abroad and it inevitably ... The article may be useful for IT teachers when preparing teaching aids and ...

  8. The specification of Stampi, a message passing library for distributed parallel computing

    International Nuclear Information System (INIS)

    Imamura, Toshiyuki; Takemiya, Hiroshi; Koide, Hiroshi

    2000-03-01

    At CCSE, the Center for Promotion of Computational Science and Engineering, a new message passing library for heterogeneous and distributed parallel computing has been developed; it is called Stampi. Stampi enables communication between any combination of parallel computers as well as workstations. Currently, a Stampi system is constructed from the Stampi library and Stampi/Java. It provides functions to connect a Stampi application not only with those on COMPACS, the COMplex Parallel Computer System, but also with applets which work on WWW browsers. This report summarizes the specifications of Stampi and details the development of its system. (author)

  9. A climatological model for risk computations incorporating site-specific dry deposition influences

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.

    1991-07-01

    A gradient-flux dry deposition module was developed for use in a climatological atmospheric transport model, the Multimedia Environmental Pollutant Assessment System (MEPAS). The atmospheric pathway model computes long-term average contaminant air concentration and surface deposition patterns surrounding a potential release site, incorporating location-specific dry deposition influences. Gradient-flux formulations are used to incorporate site and regional data in the dry deposition module for this atmospheric sector-average climatological model. Application of these formulations provides an effective means of accounting for local surface roughness in deposition computations. Linkage to a risk computation module resulted in a need for separate regional and specific surface deposition computations. 13 refs., 4 figs., 2 tabs
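
    For orientation, gradient-flux dry deposition schemes are commonly built on the resistance analogy; a generic form is sketched below (the exact MEPAS parameterization is not reproduced here, and the neutral-stability aerodynamic resistance shown is an assumption):

        \[ v_d = \frac{1}{r_a + r_b + r_c}, \qquad r_a \approx \frac{1}{\kappa u_*} \ln\frac{z}{z_0} \]

    Here r_a is the aerodynamic resistance, the term through which local surface roughness z_0 enters, r_b is the quasi-laminar boundary-layer resistance, r_c the surface resistance, κ the von Kármán constant and u_* the friction velocity; the deposition flux then follows as F = -v_d C(z).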

  10. Design and development of semantic web-based system for computer science domain-specific information retrieval

    Directory of Open Access Journals (Sweden)

    Ritika Bansal

    2016-09-01

    In a semantic web-based system, the concept of ontology is used to search results by the contextual meaning of the input query instead of keyword matching. From the research literature, there seems to be a need for a tool which can provide an easy interface for complex queries in natural language and retrieve domain-specific information from the ontology. This research paper proposes an IRSCSD system (Information retrieval system for computer science domain) as a solution. This system offers advanced querying and browsing of structured data with search results automatically aggregated and rendered directly in a consistent user-interface, thus reducing the manual effort of users. The main objective of this research is therefore the design and development of a semantic web-based system integrating ontology towards domain-specific retrieval support. The methodology followed is piecemeal research involving the following stages. The first stage involves designing the framework for the semantic web-based system. The second stage builds the prototype for the framework using the Protégé tool. The third stage deals with natural language query conversion into the SPARQL query language using the Python-based QUEPY framework. The fourth stage involves firing the converted SPARQL queries at the ontology through Apache's Jena API to fetch the results. Lastly, the prototype has been evaluated in order to ensure its efficiency and usability. Thus, this research paper throws light on framework development for a semantic web-based system that assists in efficient retrieval of domain-specific information, natural language query interpretation into a semantic web language, and creation of a domain-specific ontology and its mapping with related ontology. This research paper also provides approaches and metrics for ontology evaluation on the prototype ontology developed, to study the performance based on accessibility of required domain-related information.
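
    To make the query pipeline concrete, the sketch below shows the shape of a SPARQL query being fired from Python; the ontology file, IRIs, class and property names are hypothetical stand-ins for the IRSCSD ontology, and rdflib is used here in place of the Jena API called by the prototype.

        from rdflib import Graph

        g = Graph()
        g.parse("cs_domain.owl", format="xml")  # hypothetical CS-domain ontology

        # Roughly what a natural-language question such as
        # "Which languages support functional programming?" could compile to.
        q = """
        PREFIX cs: <http://example.org/cs#>
        SELECT ?language WHERE {
            ?language a cs:ProgrammingLanguage ;
                      cs:supportsParadigm cs:FunctionalProgramming .
        }
        """
        for row in g.query(q):
            print(row.language)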

  11. Optoelectronic Computer Architecture Development for Image Reconstruction

    National Research Council Canada - National Science Library

    Forber, Richard

    1996-01-01

    ... Specifically, we collaborated with UCSD and ERIM on the development of an optically augmented electronic computer for high speed inverse transform calculations to enable real time image reconstruction...

  12. Computer-aided System of Semantic Text Analysis of a Technical Specification

    OpenAIRE

    Zaboleeva-Zotova, Alla; Orlova, Yulia

    2008-01-01

    The given work is devoted to the development of a computer-aided system for semantic text analysis of a technical specification. The purpose of this work is to increase the efficiency of software engineering through automation of the semantic text analysis of a technical specification. The paper proposes and investigates a model for analysis of the text of a technical project, together with an attribute grammar of a technical specification intended for the formalization of limited Ru...

  13. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    Science.gov (United States)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  14. One Head Start Classroom's Experience: Computers and Young Children's Development.

    Science.gov (United States)

    Fischer, Melissa Anne; Gillespie, Catherine Wilson

    2003-01-01

    Contends that early childhood educators need to understand how exposure to computers and constructive computer programs affects the development of children. Specifically examines: (1) research on children's technology experiences; (2) determining best practices; and (3) addressing educators' concerns about computers replacing other developmentally…

  15. Computational network design from functional specifications

    KAUST Repository

    Peng, Chi Han; Yang, Yong Liang; Bao, Fan; Fink, Daniel; Yan, Dongming; Wonka, Peter; Mitra, Niloy J.

    2016-01-01

    ...of people in a workspace. Designing such networks from scratch is challenging as even local network changes can have large global effects. We investigate how to computationally create networks starting from only high-level functional specifications...

  16. Developing a Distributed Computing Architecture at Arizona State University.

    Science.gov (United States)

    Armann, Neil; And Others

    1994-01-01

    Development of Arizona State University's computing architecture, designed to ensure that all new distributed computing pieces will work together, is described. Aspects discussed include the business rationale, the general architectural approach, characteristics and objectives of the architecture, specific services, and impact on the university…

  17. Modeling of requirement specification for safety critical real time computer system using formal mathematical specifications

    International Nuclear Information System (INIS)

    Sankar, Bindu; Sasidhar Rao, B.; Ilango Sambasivam, S.; Swaminathan, P.

    2002-01-01

    Full text: Real time computer systems are increasingly used for safety critical supervision and control of nuclear reactors. Typical application areas are supervision of the reactor core against coolant flow blockage, supervision of clad hot spots, supervision of undesirable power excursions, power control, and control logic for fuel handling systems. The most frequent cause of fault in safety critical real time computer systems is traced to fuzziness in the requirement specification. To ensure the specified safety, it is necessary to model the requirement specification of safety critical real time computer systems using formal mathematical methods. Modeling eliminates the fuzziness in the requirement specification and also helps to prepare the verification and validation schemes. Test data can be easily designed from the model of the requirement specification. Z and B are popular languages used for modeling the requirement specification. A typical safety critical real time computer system for supervising the reactor core of the prototype fast breeder reactor (PFBR) against flow blockage is taken as a case study. Modeling techniques and the actual model are explained in detail. The advantages of modeling for ensuring safety are summarized.
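
    To give a flavour of what such a formal model pins down, the predicate below states a flow-blockage trip condition in LaTeX-style logic; the threshold F_min and the confirmation time Δ are illustrative values, not PFBR parameters:

        \[ \mathit{trip}(t_0 + \Delta) \iff \forall\, t \in [t_0,\, t_0 + \Delta]:\ \mathit{flow}(t) < F_{\min} \]

    Writing the requirement this way removes the fuzziness of phrases like "sustained low flow", and test data can be derived directly from the predicate.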

  18. Computer Support of Semantic Text Analysis of a Technical Specification on Designing Software

    OpenAIRE

    Zaboleeva-Zotova, Alla; Orlova, Yulia

    2009-01-01

    The given work is devoted to the development of a computer-aided system for semantic text analysis of a technical specification. The purpose of this work is to increase the efficiency of software engineering through automation of the semantic text analysis of a technical specification. The paper proposes and investigates a technique for text analysis of a technical specification, together with an expanded fuzzy attribute grammar of a technical specification, intended for formaliza...

  19. Subject-specific computer simulation model for determining elbow loading in one-handed tennis backhand groundstrokes.

    Science.gov (United States)

    King, Mark A; Glynn, Jonathan A; Mitchell, Sean R

    2011-11-01

    A subject-specific angle-driven computer model of a tennis player, combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts, was developed to determine the effect of ball-racket impacts on loading at the elbow for one-handed backhand groundstrokes. Matching subject-specific computer simulations of a typical topspin/slice one-handed backhand groundstroke performed by an elite tennis player were carried out; the root mean square difference between performance and matching simulations of elbow loading for the topspin and slice one-handed backhand groundstrokes is relatively small. In this study, the relatively small differences in elbow loading may be due to comparable angle-time histories at the wrist and elbow joints, with the major kinematic differences occurring at the shoulder. Using a subject-specific angle-driven computer model combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts allows peak internal loading, net impulse, and shock due to ball-racket impact to be calculated, which would not otherwise be possible without impractical invasive techniques. This study provides a basis for further investigation of the factors that may increase elbow loading during tennis strokes.

  20. Development of a computationally-designed polymeric adsorbent specific for mycotoxin patulin.

    Science.gov (United States)

    Piletska, Elena V; Pink, Demi; Karim, Kal; Piletsky, Sergey A

    2017-12-04

    Patulin is a toxic compound which is found predominantly in apples affected by mould rot. Since apples and apple-containing products are a popular food for the elderly, children and babies, the monitoring of the toxin is crucial. This paper describes the development of a computationally-designed polymeric adsorbent for the solid-phase extraction of patulin, which provides an effective clean-up of food samples and allows the detection and accurate quantification of patulin levels present in apple juice using conventional chromatography methods. The developed bespoke polymer demonstrates quantitative binding towards the patulin present in undiluted apple juice. The polymer is inexpensive and easy to mass-produce. The contributing factor to the function of the adsorbent is a combination of acidic and basic functional monomers producing a zwitterionic complex in solution that forms stronger binding complexes with the patulin molecule. The protocols described in this paper provide a blueprint for the development of polymeric adsorbents for other toxins or different food matrices.

  1. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services it introduces will slowly replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper attempts to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses of large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  2. Cognitive training in Parkinson disease: cognition-specific vs nonspecific computer training.

    Science.gov (United States)

    Zimmermann, Ronan; Gschwandtner, Ute; Benz, Nina; Hatz, Florian; Schindler, Christian; Taub, Ethan; Fuhr, Peter

    2014-04-08

    In this study, we compared a cognition-specific computer-based cognitive training program with a motion-controlled computer sports game that is not cognition-specific for their ability to enhance cognitive performance in various cognitive domains in patients with Parkinson disease (PD). Patients with PD were trained with either a computer program designed to enhance cognition (CogniPlus, 19 patients) or a computer sports game with motion-capturing controllers (Nintendo Wii, 20 patients). The effect of training in 5 cognitive domains was measured by neuropsychological testing at baseline and after training. Group differences over all variables were assessed with multivariate analysis of variance, and group differences in single variables were assessed with 95% confidence intervals of mean difference. The groups were similar regarding age, sex, and educational level. Patients with PD who were trained with Wii for 4 weeks performed better in attention (95% confidence interval: -1.49 to -0.11) than patients trained with CogniPlus. In our study, patients with PD derived at least the same degree of cognitive benefit from non-cognition-specific training involving movement as from cognition-specific computerized training. For patients with PD, game consoles may be a less expensive and more entertaining alternative to computer programs specifically designed for cognitive training. This study provides Class III evidence that, in patients with PD, cognition-specific computer-based training is not superior to a motion-controlled computer game in improving cognitive performance.

  3. New developments in the CREAM Computing Element

    International Nuclear Information System (INIS)

    Andreetto, Paolo; Bertocco, Sara; Dorigo, Alvise; Capannini, Fabio; Cecchi, Marco; Zangrando, Luigi

    2012-01-01

    The EU-funded project EMI aims at providing unified, standardized, easy to install software for distributed computing infrastructures. CREAM is one of the middleware products in the EMI middleware distribution: it implements a Grid job management service which allows the submission, management and monitoring of computational jobs to local resource management systems. In this paper we discuss some new features being implemented in the CREAM Computing Element. The implementation of the EMI Execution Service (EMI-ES) specification (an agreement in the EMI consortium on interfaces and protocols to be used in order to enable computational job submission and management across technologies) is one of the new functions being implemented. New developments also focus on the High Availability (HA) area, to improve performance, scalability, availability and fault tolerance.

  4. Computed Tomography Technology: Development and Applications for Defence

    International Nuclear Information System (INIS)

    Baheti, G. L.; Saxena, Nisheet; Tripathi, D. K.; Songara, K. C.; Meghwal, L. R.; Meena, V. L.

    2008-01-01

    Computed Tomography (CT) has revolutionized the field of Non-Destructive Testing and Evaluation (NDT and E). Tomography for industrial applications warrants design and development of customized solutions catering to specific visualization requirements. The present paper highlights Tomography Technology Solutions implemented at Defence Laboratory, Jodhpur (DLJ). Details of the technological developments carried out and their utilization for various Defence applications are covered.

  5. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Science.gov (United States)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
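
    A sketch of what such an embedded specification looks like is given below; the names follow the publicly documented oommfc/discretisedfield packages associated with this work, but exact signatures vary between versions, so treat this as an assumption-laden illustration rather than the paper's verbatim interface.

        import oommfc as oc
        import discretisedfield as df

        # Thin-film geometry and discretisation of micromagnetic standard problem 4.
        mesh = oc.Mesh(p1=(0, 0, 0), p2=(500e-9, 125e-9, 3e-9),
                       cell=(5e-9, 5e-9, 3e-9))

        system = oc.System(name="stdprob4")
        # Energy terms are composed with `+` -- the heart of the embedded DSL.
        system.hamiltonian = oc.Exchange(A=1.3e-11) + oc.Demag()
        system.m = df.Field(mesh, value=(1, 0.25, 0.1), norm=8e5)

        md = oc.MinDriver()  # delegates execution to the OOMMF backend
        md.drive(system)     # writes the OOMMF input, runs it, reads results back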

  6. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies for the implementation of micro-computer based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work aimed to assess a suitable automated tools environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and kW/ft, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to new technological trends of ''Software Factories''. (author)

  7. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Directory of Open Access Journals (Sweden)

    Marijan Beg

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.

  8. Modeling the Development of Goal-Specificity in Mirror Neurons.

    Science.gov (United States)

    Thill, Serge; Svensson, Henrik; Ziemke, Tom

    2011-12-01

    Neurophysiological studies have shown that parietal mirror neurons encode not only actions but also the goal of these actions. Although some mirror neurons will fire whenever a certain action is perceived (goal-independently), most will only fire if the motion is perceived as part of an action with a specific goal. This result is important for the action-understanding hypothesis as it provides a potential neurological basis for such a cognitive ability. It is also relevant for the design of artificial cognitive systems, in particular robotic systems that rely on computational models of the mirror system in their interaction with other agents. Yet, to date, no computational model has explicitly addressed the mechanisms that give rise to both goal-specific and goal-independent parietal mirror neurons. In the present paper, we present a computational model based on a self-organizing map, which receives artificial inputs representing information about both the observed or executed actions and the context in which they were executed. We show that the map develops a biologically plausible organization in which goal-specific mirror neurons emerge. We further show that the fundamental cause for both the appearance and the number of goal-specific neurons can be found in geometric relationships between the different inputs to the map. The results are important to the action-understanding hypothesis as they provide a mechanism for the emergence of goal-specific parietal mirror neurons and lead to a number of predictions: (1) Learning of new goals may mostly reassign existing goal-specific neurons rather than recruit new ones; (2) input differences between executed and observed actions can explain observed corresponding differences in the number of goal-specific neurons; and (3) the percentage of goal-specific neurons may differ between motion primitives.
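
    The core mechanism, a self-organizing map trained on combined action/context vectors, can be sketched in a few lines of Python with numpy; the input dimensionality, map size and decay schedules below are arbitrary choices, not the paper's parameters.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy inputs: each row loosely stands for an observed action together
        # with the context in which it was executed.
        inputs = rng.random((200, 6))

        # 10x10 map: one position and one weight vector per unit.
        grid = np.array([(i, j) for i in range(10) for j in range(10)], dtype=float)
        weights = rng.random((100, 6))

        def train(inputs, weights, grid, epochs=20, lr0=0.5, sigma0=3.0):
            t, t_max = 0, epochs * len(inputs)
            for _ in range(epochs):
                for x in inputs:
                    lr = lr0 * (1 - t / t_max)              # decaying learning rate
                    sigma = sigma0 * (1 - t / t_max) + 0.5  # shrinking neighbourhood
                    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
                    d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)  # map-space distances
                    h = np.exp(-d2 / (2 * sigma ** 2))          # neighbourhood kernel
                    weights += lr * h[:, None] * (x - weights)  # pull units towards input
                    t += 1
            return weights

        weights = train(inputs, weights, grid)

    After training, units specializing in particular action/context combinations play the role of the goal-specific neurons described, while units responding across contexts correspond to goal-independent ones.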

  9. Computational network design from functional specifications

    KAUST Repository

    Peng, Chi Han

    2016-07-11

    Connectivity and layout of underlying networks largely determine agent behavior and usage in many environments. For example, transportation networks determine the flow of traffic in a neighborhood, whereas building floorplans determine the flow of people in a workspace. Designing such networks from scratch is challenging as even local network changes can have large global effects. We investigate how to computationally create networks starting from only high-level functional specifications. Such specifications can be in the form of network density, travel time versus network length, traffic type, destination location, etc. We propose an integer programming-based approach that guarantees that the resultant networks are valid by fulfilling all the specified hard constraints and that they score favorably in terms of the objective function. We evaluate our algorithm in two different design settings, street layout and floorplans to demonstrate that diverse networks can emerge purely from high-level functional specifications.
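
    A toy version of the integer-programming formulation can be written with Python's PuLP package; the junction coordinates, the density figure and the incidence constraint below are invented stand-ins for the paper's richer functional specifications.

        from itertools import combinations
        from math import dist
        from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

        # Hypothetical junction sites; candidate streets are all pairwise links.
        nodes = [(0, 0), (1, 0), (1, 1), (0, 1), (2, 1)]
        edges = list(combinations(range(len(nodes)), 2))
        length = {e: dist(nodes[e[0]], nodes[e[1]]) for e in edges}

        prob = LpProblem("street_layout", LpMinimize)
        x = {e: LpVariable(f"x_{e[0]}_{e[1]}", cat=LpBinary) for e in edges}

        # Objective: minimise total built street length.
        prob += lpSum(length[e] * x[e] for e in edges)

        # Hard constraints standing in for functional specifications:
        # every junction is incident to at least one chosen street ...
        for n in range(len(nodes)):
            prob += lpSum(x[e] for e in edges if n in e) >= 1
        # ... and the network meets a minimum density (here: >= 5 streets).
        prob += lpSum(x.values()) >= 5

        prob.solve()
        print([e for e in edges if x[e].value() == 1])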

  10. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    Science.gov (United States)

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, which encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  11. Essential Means for Urban Computing : Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    NARCIS (Netherlands)

    Nourian, P.; Martinez-Ortiz, Carlos; Arroyo Ohori, G.A.K.

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages,

  12. Embedded Volttron specification - benchmarking small footprint compute device for Volttron

    Energy Technology Data Exchange (ETDEWEB)

    Sanyal, Jibonananda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Woodworth, Ken [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nutaro, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kuruganti, Teja [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-08-17

    An embedded system is a small footprint computing unit that typically serves a specific purpose closely associated with measurements and control of hardware devices. These units are designed for reasonable durability and operations in a wide range of operating conditions. Some embedded systems support real-time operations and can demonstrate high levels of reliability. Many have failsafe mechanisms built to handle graceful shutdown of the device in exception conditions. The available memory, processing power, and network connectivity of these devices are limited due to the nature of their specific-purpose design and intended application. Industry practice is to carefully design the software for the available hardware capability to suit desired deployment needs. Volttron is an open source agent development and deployment platform designed to enable researchers to interact with devices and appliances without having to write drivers themselves. Hosting Volttron on small footprint embeddable devices enables its demonstration for embedded use. This report details the steps required and the experience in setting up and running Volttron applications on three small footprint devices: the Intel Next Unit of Computing (NUC), the Raspberry Pi 2, and the BeagleBone Black. In addition, the report also details preliminary investigation of the execution performance of Volttron on these devices.

  13. Improving developer productivity with C++ embedded domain specific languages

    Science.gov (United States)

    Kozacik, Stephen; Chao, Evenie; Paolini, Aaron; Bonnett, James; Kelmelis, Eric

    2017-05-01

    Domain-specific languages are a useful tool for productivity allowing domain experts to program using familiar concepts and vocabulary while benefiting from performance choices made by computing experts. Embedding the domain specific language into an existing language allows easy interoperability with non-domain-specific code and use of standard compilers and build systems. In C++, this is enabled through the template and preprocessor features. C++ embedded domain specific languages (EDSLs) allow the user to write simple, safe, performant, domain specific code that has access to all the low-level functionality that C and C++ offer as well as the diverse set of libraries available in the C/C++ ecosystem. In this paper, we will discuss several tools available for building EDSLs in C++ and show examples of projects successfully leveraging EDSLs. Modern C++ has added many useful new features to the language which we have leveraged to further extend the capability of EDSLs. At EM Photonics, we have used EDSLs to allow developers to transparently benefit from using high performance computing (HPC) hardware. We will show ways EDSLs combine with existing technologies and EM Photonics high performance tools and libraries to produce clean, short, high performance code in ways that were not previously possible.
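
    The embedding idea carries over to any host language with operator overloading; since the examples elsewhere in this overview use Python, the sketch below builds a deferred expression tree the way C++ EDSLs do with templates. All class names are invented for illustration.

        # A deferred-evaluation expression tree via operator overloading: the
        # same embedding trick C++ EDSLs achieve with expression templates.
        class Expr:
            def __add__(self, other): return Add(self, other)
            def __mul__(self, other): return Mul(self, other)

        class Var(Expr):
            def __init__(self, name): self.name = name
            def eval(self, env): return env[self.name]

        class Add(Expr):
            def __init__(self, a, b): self.a, self.b = a, b
            def eval(self, env): return self.a.eval(env) + self.b.eval(env)

        class Mul(Expr):
            def __init__(self, a, b): self.a, self.b = a, b
            def eval(self, env): return self.a.eval(env) * self.b.eval(env)

        # Users write ordinary-looking arithmetic; the library sees the whole
        # tree and can optimise or offload it before evaluation.
        x, y = Var("x"), Var("y")
        expr = x * y + x
        print(expr.eval({"x": 3, "y": 4}))  # 15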

  14. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The benefits expected are discussed by a number of recognized, domain-specific experts, with a common theme being computing as solution enabler. This book is the first document providing such a representative up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  15. A Computational Framework to Optimize Subject-Specific Hemodialysis Blood Flow Rate to Prevent Intimal Hyperplasia

    Science.gov (United States)

    Mahmoudzadeh, Javid; Wlodarczyk, Marta; Cassel, Kevin

    2017-11-01

    Development of excessive intimal hyperplasia (IH) in the cephalic vein of renal failure patients who receive chronic hemodialysis treatment results in vascular access failure and multiple treatment complications. Specifically, cephalic arch stenosis (CAS) is known to exacerbate hypertensive blood pressure, thrombosis, and subsequent cardiovascular incidents that would necessitate costly interventional procedures with low success rates. It has been hypothesized that excessive blood flow rate post access maturation, which strongly violates venous homeostasis, is the main hemodynamic factor that orchestrates the onset and development of CAS. In this article, a computational framework based on a strong coupling of computational fluid dynamics (CFD) and shape optimization is proposed that aims to identify the effective blood flow rate on a patient-specific basis that avoids the onset of CAS while providing the adequate blood flow rate required to facilitate hemodialysis. This effective flow rate can be achieved through implementation of Miller's surgical banding method after the maturation of the arteriovenous fistula and is rooted in the relaxation of wall stresses back to a homeostatic target value. The results indicate that this optimized hemodialysis blood flow rate is, in fact, a subject-specific value that can be assessed post vascular access maturation and prior to the initiation of chronic hemodialysis treatment as a mitigative action against CAS-related access failure. This computational technology can be employed for individualized dialysis treatment.

  16. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  17. Portable Computer Technology (PCT) Research and Development Program Phase 2

    Science.gov (United States)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focused on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low power Pentium processor, a high resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives. The focus was on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  18. Cloud Computing and Agile Organization Development

    Directory of Open Access Journals (Sweden)

    Bogdan GHILIC-MICU

    2014-01-01

    In the 3rd millennium economy, defined by globalization and continuous reduction of natural resources, the economic organization becomes the main actor in the phenomenon of transformation and adaptation to new conditions. Even more, the economic environment, which is closely related to the social environment, undergoes complex metamorphoses, especially in the management area. In this dynamic and complex social and environmental context, the economic organization must possess the ability to adapt, becoming a flexible and agile answer to new market opportunities. Considering the spectacular evolution of information and communications technology, one of the solutions to ensure organization agility is cloud computing. Just like the development of any science requires adaptation to theories and instruments specific to other fields, a cloud computing paradigm for the agile organization must appeal to models from management, cybernetics, mathematics, structuralism and information theory (or information systems theory).

  19. Computationally Developed Sham Stimulation Protocol for Multichannel Desynchronizing Stimulation

    Directory of Open Access Journals (Sweden)

    Magteld Zeitler

    2018-05-01

    A characteristic pattern of abnormal brain activity is abnormally strong neuronal synchronization, as found in several brain disorders, such as tinnitus, Parkinson's disease, and epilepsy. As observed in several diseases, different therapeutic interventions may induce a placebo effect that may be strong and hinder reliable clinical evaluations. Hence, to distinguish between specific, neuromodulation-induced effects and unspecific, placebo effects, it is important to mimic the therapeutic procedure as precisely as possible, thereby providing controls that actually lack specific effects. Coordinated Reset (CR) stimulation has been developed to specifically counteract abnormally strong synchronization by desynchronization. CR is a spatio-temporally patterned multichannel stimulation which reduces the extent of coincident neuronal activity and aims at an anti-kindling, i.e., an unlearning of both synaptic connectivity and neuronal synchrony. Apart from acute desynchronizing effects, CR may cause sustained, long-lasting desynchronizing effects, as already demonstrated in pre-clinical and clinical proof of concept studies. In this computational study, we set out to computationally develop a sham stimulation protocol for multichannel desynchronizing stimulation. To this end, we compare acute effects and long-lasting effects of six different spatio-temporally patterned stimulation protocols, including three variants of CR, using a no-stimulation condition as additional control. This is to provide an inventory of different stimulation algorithms with similar fundamental stimulation parameters (e.g., mean stimulation rates) but qualitatively different acute and/or long-lasting effects. Stimulation protocols sharing basic parameters, but nevertheless inducing completely different or even no acute effects and/or after-effects, might serve as controls to validate the specific effects of particular desynchronizing protocols such as CR. In particular, based on

  20. The Effect of Inlet Waveforms on Computational Hemodynamics of Patient-Specific Intracranial Aneurysms

    OpenAIRE

    Xiang, J.; Siddiqui, A.H.; Meng, H.

    2014-01-01

    Due to the lack of patient-specific inlet flow waveform measurements, most computational fluid dynamics (CFD) simulations of intracranial aneurysms usually employ waveforms that are not patient-specific as inlet boundary conditions for the computational model. The current study examined how this assumption affects the predicted hemodynamics in patient-specific aneurysm geometries. We examined wall shear stress (WSS) and oscillatory shear index (OSI), the two most widely studied hemodynamic qu...
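
    For reference, the two quantities examined have standard definitions; with τ_w the instantaneous wall shear stress vector over a cardiac cycle of period T, the oscillatory shear index is

        \[ \mathrm{OSI} = \frac{1}{2}\left(1 - \frac{\left|\int_0^T \boldsymbol{\tau}_w\,dt\right|}{\int_0^T \left|\boldsymbol{\tau}_w\right|\,dt}\right) \]

    so OSI ranges from 0 (unidirectional shear) to 0.5 (purely oscillatory shear), which is why it is sensitive to the shape of the inlet waveform.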

  1. Dimensionally Specific Capture of Attention: Implications for Saliency Computation

    Directory of Open Access Journals (Sweden)

    Katherine E. Burnett

    2018-02-01

    Observers automatically orient to a sudden change in the environment. This is demonstrated experimentally using exogenous cues, which prioritize the analysis of subsequent targets appearing nearby. This effect has been attributed to the computation of saliency, obtained by combining feature-specific signals, which then feed back to drive attention to the salient location. An alternative possibility is that cueing directly affects target-evoked sensory responses in a feed-forward manner. We examined the effects of luminance and equiluminant color cues in a dual task paradigm, which required both a motion and a color discrimination. Equiluminant color cues improved color discrimination more than luminance cues, but luminance cues improved motion discrimination more than equiluminant color cues. This suggests that the effects of exogenous cues are dimensionally specific and may not depend entirely on the computation of a dimension-general saliency signal.

  2. Computer-Aided Sensor Development Focused on Security Issues.

    Science.gov (United States)

    Bialas, Andrzej

    2016-05-26

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.

  3. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing brings parallel computing into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
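
    As a concrete example of the message-passing model discussed, the snippet below uses mpi4py, the standard Python binding for MPI; the payload and tag are arbitrary.

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        # Point-to-point message passing: rank 0 sends, rank 1 receives.
        if rank == 0:
            comm.send({"payload": [1, 2, 3]}, dest=1, tag=11)
        elif rank == 1:
            data = comm.recv(source=0, tag=11)
            print(f"rank 1 received {data}")

    Launched with, e.g., mpiexec -n 2 python demo.py. OpenMP, by contrast, parallelizes within one shared-memory process, and MapReduce hides such communication entirely behind the map and reduce phases.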

  4. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    Science.gov (United States)

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure remains a challenge in the clinical setting for each individual case. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and able to predict global airflow quantities, as well as local tissue aeration and strains, for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing us to simulate EIT images based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model; thus they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results with the patient measurements. First promising results

  5. Developments of the general computer network of NIPNE-HH

    International Nuclear Information System (INIS)

    Mirica, M.; Constantinescu, S.; Danet, A.

    1997-01-01

    Since 1991 the general computer network of NIPNE-HH has been developed and connected to RNCN (the Romanian National Computer Network for research and development); it offers the Romanian physics research community an efficient and cost-effective infrastructure to communicate and collaborate with fellow researchers abroad, and to collect and exchange the most up-to-date information in their research area. RNCN is targeted on the following main objectives: setting up a technical and organizational infrastructure meant to provide national and international electronic services for the Romanian scientific research community; providing a rapid and competitive tool for the exchange of information in the framework of the Research and Development (R-D) community; using the scientific and technical data bases available in the country and offered by the national networks of other countries through international networks; and providing support for information and for scientific and technical co-operation. RNCN has two international links: to EBONE via ACONET (64 kbps) and to EuropaNET via Hungarnet (64 kbps). The guiding principle in designing the general computer network of NIPNE-HH, as part of RNCN, was to implement an open system based on OSI standards, taking into account the following criteria: development of a flexible solution, according to OSI specifications; reliable gateway solutions with the existing network already in use, allowing access to worldwide networks; use of the TCP/IP transport protocol for each Local Area Network (LAN) and for the connection to RNCN; and integration of heterogeneous software and hardware platforms (DOS, Windows, UNIX, VMS, Linux, etc.) through specific interfaces. The major objectives achieved in developing the general computer network of NIPNE-HH are: linking all the existing and newly installed computer equipment and providing adequate connectivity. LANs from departments

  6. Computer-Aided Sensor Development Focused on Security Issues

    Directory of Open Access Journals (Sweden)

    Andrzej Bialas

    2016-05-01

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.

  7. A Domain-Specific Programming Language for Secure Multiparty Computation

    DEFF Research Database (Denmark)

    Nielsen, Janus Dam; Schwartzbach, Michael Ignatieff

    2007-01-01

    We present a domain-specific programming language for Secure Multiparty Computation (SMC). Information is a resource of vital importance and considerable economic value to individuals, public administration, and private companies. This means that the confidentiality of information is crucial... computations are performed on secret values and results are only revealed according to specific protocols. We identify the key linguistic concepts of SMC and bridge the gap between high-level security requirements and low-level cryptographic operations constituting an SMC platform, thus improving the efficiency and security of SMC...
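
    A minimal sketch of the kind of primitive such a language compiles down to, additive secret sharing over a public prime modulus, is shown in Python below; the three-party setting and the modulus are arbitrary choices illustrating the concept, not the paper's platform.

        import secrets

        P = 2**61 - 1  # public prime modulus

        def share(value, n=3):
            """Split `value` into n additive shares modulo P."""
            shares = [secrets.randbelow(P) for _ in range(n - 1)]
            shares.append((value - sum(shares)) % P)
            return shares

        def reveal(shares):
            return sum(shares) % P

        # Each party holds one share of each input; adding shares pointwise
        # yields shares of the sum, so no party ever sees the inputs.
        a, b = share(20), share(22)
        c = [(x + y) % P for x, y in zip(a, b)]
        print(reveal(c))  # 42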

  8. Assessment of CT dose to the fetus and pregnant female patient using patient-specific computational models

    DEFF Research Database (Denmark)

    Xie, Tianwu; Poletti, Pierre-Alexandre; Platon, Alexandra

    2018-01-01

    of pregnant patients and the embedded foetus, we developed a methodology for construction of patient-specific voxel-based computational phantoms based on existing standardised hybrid computational pregnant female phantoms. We estimated the maternal absorbed dose and foetal organ dose for 30 pregnant patients...... for assessment of the radiation risks to pregnant patients and the foetus from various CT scanning protocols, thus guiding the decision-making process. KEY POINTS: • In CT examinations, the absorbed dose is non-uniformly distributed within foetal organs. • This work reports, for the first time, estimates...

  9. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...
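
    A minimal sketch of the interval arithmetic on which such rigorous computations rest is shown below; a real implementation would additionally control floating-point rounding (outward rounding), which this illustration omits.

        class Interval:
            def __init__(self, lo, hi):
                self.lo, self.hi = lo, hi

            def __add__(self, other):
                return Interval(self.lo + other.lo, self.hi + other.hi)

            def __mul__(self, other):
                ps = [self.lo * other.lo, self.lo * other.hi,
                      self.hi * other.lo, self.hi * other.hi]
                return Interval(min(ps), max(ps))

            def __repr__(self):
                return f"[{self.lo}, {self.hi}]"

        # Enclose the image of f(x) = x*(1 - x) over x in [0.3, 0.4]: every true
        # value of f on that interval is guaranteed to lie inside the result.
        x = Interval(0.3, 0.4)
        one_minus_x = Interval(1 - 0.4, 1 - 0.3)
        print(x * one_minus_x)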

  10. Computational learning on specificity-determining residue-nucleotide interactions

    KAUST Repository

    Wong, Ka-Chun; Li, Yue; Peng, Chengbin; Moses, Alan M.; Zhang, Zhaolei

    2015-01-01

    The protein–DNA interactions between transcription factors and transcription factor binding sites are essential activities in gene regulation. To decipher the binding codes, it is a long-standing challenge to understand the binding mechanism across different transcription factor DNA binding families. Past computational learning studies usually focus on learning and predicting the DNA binding residues on protein side. Taking into account both sides (protein and DNA), we propose and describe a computational study for learning the specificity-determining residue-nucleotide interactions of different known DNA-binding domain families. The proposed learning models are compared to state-of-the-art models comprehensively, demonstrating its competitive learning performance. In addition, we describe and propose two applications which demonstrate how the learnt models can provide meaningful insights into protein–DNA interactions across different DNA binding families.

  11. Computational learning on specificity-determining residue-nucleotide interactions

    KAUST Repository

    Wong, Ka-Chun

    2015-11-02

    The protein–DNA interactions between transcription factors and transcription factor binding sites are essential activities in gene regulation. To decipher the binding codes, it is a long-standing challenge to understand the binding mechanism across different transcription factor DNA binding families. Past computational learning studies usually focus on learning and predicting the DNA binding residues on protein side. Taking into account both sides (protein and DNA), we propose and describe a computational study for learning the specificity-determining residue-nucleotide interactions of different known DNA-binding domain families. The proposed learning models are compared to state-of-the-art models comprehensively, demonstrating its competitive learning performance. In addition, we describe and propose two applications which demonstrate how the learnt models can provide meaningful insights into protein–DNA interactions across different DNA binding families.

  12. An Overview of Recent Developments in Cognitive Diagnostic Computer Adaptive Assessments

    Directory of Open Access Journals (Sweden)

    Alan Huebner

    2010-01-01

    Full Text Available Cognitive diagnostic modeling has become an exciting new field of psychometric research. These models aim to diagnose examinees' mastery status of a group of discretely defined skills, or attributes, thereby providing them with detailed information regarding their specific strengths and weaknesses. Combining cognitive diagnosis with computer adaptive assessments has emerged as an important part of this new field. This article aims to provide practitioners and researchers with an introduction to and overview of recent developments in cognitive diagnostic computer adaptive assessments.

  13. Computational models of music perception and cognition II: Domain-specific music processing

    Science.gov (United States)

    Purwins, Hendrik; Grachten, Maarten; Herrera, Perfecto; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    In Part I [Purwins H, Herrera P, Grachten M, Hazan A, Marxer R, Serra X. Computational models of music perception and cognition I: The perceptual and cognitive processing chain. Physics of Life Reviews 2008, in press, doi:10.1016/j.plrev.2008.03.004], we addressed the study of cognitive processes that underlie auditory perception of music, and their neural correlates. The aim of the present paper is to summarize empirical findings from music cognition research that are relevant to three prominent music theoretic domains: rhythm, melody, and tonality. Attention is paid to how cognitive processes like category formation, stimulus grouping, and expectation can account for the music theoretic key concepts in these domains, such as beat, meter, voice, consonance. We give an overview of computational models that have been proposed in the literature for a variety of music processing tasks related to rhythm, melody, and tonality. Although the present state-of-the-art in computational modeling of music cognition definitely provides valuable resources for testing specific hypotheses and theories, we observe the need for models that integrate the various aspects of music perception and cognition into a single framework. Such models should be able to account for aspects that until now have only rarely been addressed in computational models of music cognition, like the active nature of perception and the development of cognitive capacities from infancy to adulthood.

  14. Evidence-based guidelines for the wise use of computers by children: physical development guidelines.

    Science.gov (United States)

    Straker, L; Maslen, B; Burgess-Limerick, R; Johnson, P; Dennerlein, J

    2010-04-01

    Computer use by children is common and there is concern over the potential impact of this exposure on child physical development. Recently principles for child-specific evidence-based guidelines for wise use of computers have been published and these included one concerning the facilitation of appropriate physical development. This paper reviews the evidence and presents detailed guidelines for this principle. The guidelines include encouraging a mix of sedentary and whole body movement tasks, encouraging reasonable postures during computing tasks through workstation, chair, desk, display and input device selection and adjustment and special issues regarding notebook computer use and carriage, computing skills and responding to discomfort. The evidence limitations highlight opportunities for future research. The guidelines themselves can inform parents and teachers, equipment designers and suppliers and form the basis of content for teaching children the wise use of computers. STATEMENT OF RELEVANCE: Many children use computers and computer-use habits formed in childhood may track into adulthood. Therefore child-computer interaction needs to be carefully managed. These guidelines inform those responsible for children to assist in the wise use of computers.

  15. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    Science.gov (United States)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  16. Development of a solar-powered residential air conditioner: System optimization preliminary specification

    Science.gov (United States)

    Rousseau, J.; Hwang, K. C.

    1975-01-01

    Investigations aimed at the optimization of a baseline Rankine cycle solar powered air conditioner and the development of a preliminary system specification were conducted. Efforts encompassed the following: (1) investigations of the use of recuperators/regenerators to enhance the performance of the baseline system, (2) development of an off-design computer program for system performance prediction, (3) optimization of the turbocompressor design to cover a broad range of conditions and permit operation at low heat source water temperatures, (4) generation of parametric data describing system performance (COP and capacity), (5) development and evaluation of candidate system augmentation concepts and selection of the optimum approach, (6) generation of auxiliary power requirement data, (7) development of a complete solar collector-thermal storage-air conditioner computer program, (8) evaluation of the baseline Rankine air conditioner over a five day period simulating the NASA solar house operation, and (9) evaluation of the air conditioner as a heat pump.

  17. Development of a system of computer codes for severe accident analyses and its applications

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Soon Hong; Cheon, Moon Heon; Cho, Nam jin; No, Hui Cheon; Chang, Hyeon Seop; Moon, Sang Kee; Park, Seok Jeong; Chung, Jee Hwan [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1991-12-15

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in Nuclear Power Plants. This system of codes is necessary to conduct individual plant examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, and can extract the plant-specific vulnerabilities for severe accidents and, at the same time, ideas for enhancing overall accident resistance. The scope and contents of this study are as follows: development of a system of computer codes for severe accident analyses, and development of a severe accident management strategy.

  18. Development of a system of computer codes for severe accident analyses and its applications

    International Nuclear Information System (INIS)

    Chang, Soon Hong; Cheon, Moon Heon; Cho, Nam jin; No, Hui Cheon; Chang, Hyeon Seop; Moon, Sang Kee; Park, Seok Jeong; Chung, Jee Hwan

    1991-12-01

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in Nuclear Power Plants. This system of codes is necessary to conduct individual plant examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, and can extract the plant-specific vulnerabilities for severe accidents and, at the same time, ideas for enhancing overall accident resistance. The scope and contents of this study are as follows: development of a system of computer codes for severe accident analyses, and development of a severe accident management strategy

  19. Impact of Computer Aided Learning on Children with Specific Learning Disabilities

    OpenAIRE

    The Spastic Society Of Karnataka , Bangalore

    2004-01-01

    Study conducted by The Spastics Society of Karnataka on behalf of Azim Premji Foundation to assess the effectiveness of computers in enhancing learning for children with specific learning disabilities. Azim Premji Foundation is not liable for any direct or indirect loss or damage whatsoever arising from the use or access of any information, interpretation and conclusions that may be printed in this report.; Study to assess the effectiveness of computers in enhancing learning for children with...

  20. Computational identification of strain-, species- and genus-specific proteins

    Directory of Open Access Journals (Sweden)

    Thiagarajan Rathi

    2005-11-01

    Full Text Available Abstract Background The identification of unique proteins at different taxonomic levels has both scientific and practical value. Strain-, species- and genus-specific proteins can provide insight into the criteria that define an organism and its relationship with close relatives. Such proteins can also serve as taxon-specific diagnostic targets. Description A pipeline using a combination of computational and manual analyses of BLAST results was developed to identify strain-, species-, and genus-specific proteins and to catalog the closest sequenced relative for each protein in a proteome. Proteins encoded by a given strain are preliminarily considered to be unique if BLAST, using a comprehensive protein database, fails to retrieve (with an e-value better than 0.001) any protein not encoded by the query strain, species or genus (for strain-, species- and genus-specific proteins, respectively), or if BLAST, using the best hit as the query (reverse BLAST), does not retrieve the initial query protein. Results are manually inspected for homology if the initial query is retrieved in the reverse BLAST but is not the best hit. Sequences unlikely to retrieve homologs using the default BLOSUM62 matrix (usually short sequences) are re-tested using the PAM30 matrix, thereby increasing the number of retrieved homologs and increasing the stringency of the search for unique proteins. The above protocol was used to examine several food- and water-borne pathogens. We find that the reverse BLAST step filters out about 22% of proteins with homologs that would otherwise be considered unique at the genus and species levels. Analysis of the annotations of unique proteins reveals that many are remnants of prophage proteins, or may be involved in virulence. The data generated from this study can be accessed and further evaluated from the CUPID (Core and Unique Protein Identification) system web site (updated semi-annually at http://pir.georgetown.edu/cupid). Conclusion CUPID
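
    The uniqueness test described above can be condensed into Python; the hit records and the reverse_blast callback below are assumptions standing in for parsed BLAST output, not CUPID's actual code.

        from collections import namedtuple

        Hit = namedtuple("Hit", "subject evalue")  # hypothetical parsed BLAST hit
        E_CUTOFF = 0.001

        def is_unique(query, hits, taxon_of, query_taxon, reverse_blast):
            # A protein is preliminarily unique if no hit below the e-value
            # cutoff lies outside the query taxon, or if the best outside hit
            # fails to retrieve the query in a reverse BLAST.
            outside = [h for h in hits
                       if h.evalue < E_CUTOFF and taxon_of(h.subject) != query_taxon]
            if not outside:
                return True                 # no homolog outside the taxon
            best = min(outside, key=lambda h: h.evalue)
            back = [h.subject for h in reverse_blast(best.subject)]
            if query not in back:
                return True                 # reverse BLAST does not recover query
            # Cases where the query is recovered but is not the best hit go to
            # manual inspection in the pipeline; treat them as not unique here.
            return False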

  1. A novel patient-specific model to compute coronary fractional flow reserve.

    Science.gov (United States)

    Kwon, Soon-Sung; Chung, Eui-Chul; Park, Jin-Seo; Kim, Gook-Tae; Kim, Jun-Woo; Kim, Keun-Hong; Shin, Eun-Seok; Shim, Eun Bo

    2014-09-01

    The fractional flow reserve (FFR) is a widely used clinical index to evaluate the functional severity of coronary stenosis. A computer simulation method based on patients' computed tomography (CT) data is a plausible non-invasive approach for computing the FFR. This method can provide a detailed solution for the stenosed coronary hemodynamics by coupling computational fluid dynamics (CFD) with the lumped parameter model (LPM) of the cardiovascular system. In this work, we have implemented a simple computational method to compute the FFR. As this method uses only coronary arteries for the CFD model and includes only the LPM of the coronary vascular system, it provides simpler boundary conditions for the coronary geometry and is computationally more efficient than existing approaches. To test the efficacy of this method, we simulated a three-dimensional straight vessel using CFD coupled with the LPM. The computed results were compared with those of the LPM. To validate this method in terms of clinically realistic geometry, a patient-specific model of stenosed coronary arteries was constructed from CT images, and the computed FFR was compared with clinically measured results. We evaluated the effect of a model aorta on the computed FFR and compared this with a model without the aorta. Computationally, the model without the aorta was more efficient than that with the aorta, reducing the CPU time required for computing a cardiac cycle to 43.4%. Copyright © 2014. Published by Elsevier Ltd.
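
    For orientation, the FFR itself is the cycle-averaged ratio of distal coronary pressure to aortic pressure. The sketch below evaluates that definition on synthetic waveforms; in the paper the pressures come from the coupled CFD-LPM solution, not from assumed curves.

        import numpy as np

        t = np.linspace(0, 1, 1000)                  # one cardiac cycle (s)
        p_aorta = 93 + 12 * np.sin(2 * np.pi * t)    # mmHg, illustrative waveform
        p_distal = 74 + 10 * np.sin(2 * np.pi * t)   # mmHg, downstream of stenosis

        ffr = np.trapz(p_distal, t) / np.trapz(p_aorta, t)  # cycle-averaged ratio
        print(f"FFR = {ffr:.2f}")  # values below ~0.80 suggest significant stenosis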

  2. Finding-specific display presets for computed radiography soft-copy reading.

    Science.gov (United States)

    Andriole, K P; Gould, R G; Webb, W R

    1999-05-01

    Much work has been done to optimize the display of cross-sectional modality imaging examinations for soft-copy reading (i.e., window/level tissue presets, and format presentations such as tile and stack modes, four-on-one, nine-on-one, etc). Less attention has been paid to the display of digital forms of the conventional projection x-ray. The purpose of this study is to assess the utility of providing presets for computed radiography (CR) soft-copy display, based not on the window/level settings, but on processing applied to the image optimized for visualization of specific findings, pathologies, etc (i.e., pneumothorax, tumor, tube location). It is felt that digital display of CR images based on finding-specific processing presets has the potential to: speed reading of digital projection x-ray examinations on soft copy; improve diagnostic efficacy; standardize display across examination type, clinical scenario, important key findings, and significant negatives; facilitate image comparison; and improve confidence in and acceptance of soft-copy reading. Clinical chest images are acquired using an Agfa-Gevaert (Mortsel, Belgium) ADC 70 CR scanner and Fuji (Stamford, CT) 9000 and AC2 CR scanners. Those demonstrating pertinent findings are transferred over the clinical picture archiving and communications system (PACS) network to a research image processing station (Agfa PS5000), where the optimal image-processing settings per finding, pathologic category, etc, are developed in conjunction with a thoracic radiologist, by manipulating the multiscale image contrast amplification (Agfa MUSICA) algorithm parameters. Soft-copy display of images processed with finding-specific settings are compared with the standard default image presentation for 50 cases of each category. Comparison is scored using a 5-point scale with the positive scale denoting the standard presentation is preferred over the finding-specific processing, the negative scale denoting the finding-specific

  3. High-resolution subject-specific mitral valve imaging and modeling: experimental and computational methods.

    Science.gov (United States)

    Toma, Milan; Bloodworth, Charles H; Einstein, Daniel R; Pierce, Eric L; Cochran, Richard P; Yoganathan, Ajit P; Kunzelman, Karyn S

    2016-12-01

    The diversity of mitral valve (MV) geometries and multitude of surgical options for correction of MV diseases necessitates the use of computational modeling. Numerical simulations of the MV would allow surgeons and engineers to evaluate repairs, devices, procedures, and concepts before performing them and before moving on to more costly testing modalities. Constructing, tuning, and validating these models rely upon extensive in vitro characterization of valve structure, function, and response to change due to diseases. Micro-computed tomography (µCT) allows for unmatched spatial resolution for soft tissue imaging. However, it is still technically challenging to obtain an accurate geometry of the diastolic MV. We discuss here the development of a novel technique for treating MV specimens with glutaraldehyde fixative in order to minimize geometric distortions in preparation for µCT scanning. The technique provides a resulting MV geometry which is significantly more detailed in chordal structure, accurate in leaflet shape, and closer to its physiological diastolic geometry. In this paper, computational fluid-structure interaction (FSI) simulations are used to show the importance of more detailed subject-specific MV geometry with 3D chordal structure to simulate a proper closure validated against µCT images of the closed valve. Two computational models, before and after use of the aforementioned technique, are used to simulate closure of the MV.

  4. Latest developments for a computer aided thermohydraulic network

    International Nuclear Information System (INIS)

    Alemberti, A.; Graziosi, G.; Mini, G.; Susco, M.

    1999-01-01

    Thermohydraulic networks are 1-D systems characterized by a small number of basic components (pumps, valves, heat exchangers, etc) connected by pipes and limited spatially by a defined number of boundary conditions (tanks, atmosphere, etc). The network system is simulated by the well known computer program RELAP5/mod3. Information concerning the network geometry, component behaviour, and initial and boundary conditions is usually supplied to the RELAP5 code in an ASCII input file by means of 'input cards'. CATNET (Computer Aided Thermalhydraulic NETwork) is a graphical user interface that, under specific user guidelines which completely define its range of applicability, permits a very high level of standardization and simplification of the RELAP5/mod3 input deck development process as well as of the output processing. The characteristics of the components (pipes, valves, pumps, etc) defining the network system can be entered through CATNET. The CATNET interface provides special functions to compute form losses in the most typical bending and branching configurations. When the input of all system components is ready, CATNET is able to generate the RELAP5/mod3 input file. Finally, by means of CATNET, the RELAP5/mod3 code can be run and its output results can be transformed into an intuitive display form. The paper presents an example of application of the CATNET interface as well as the latest developments, which greatly simplified the work of the users and reduced the possibility of input errors. (authors)
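
    The last step, emitting RELAP5-style 'input cards' from an in-memory component description, can be sketched as follows; the card layout mimics the general RELAP5 pipe-component scheme but is illustrative, not a complete or verified input deck.

        def pipe_cards(ccc, name, n_vols, flow_area, length):
            # Render a simplified pipe component as RELAP5-style input cards.
            cards = [
                f"{ccc}0000  {name}  pipe",                 # component name and type
                f"{ccc}0001  {n_vols}",                     # number of volumes
                f"{ccc}0101  {flow_area}  {n_vols}",        # flow area, all volumes
                f"{ccc}0301  {length / n_vols}  {n_vols}",  # length per volume
            ]
            return "\n".join(cards)

        print(pipe_cards(110, "hotleg", 5, 0.43, 7.5))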

  5. Development of personnel exposure management system with personal computer

    International Nuclear Information System (INIS)

    Yamato, Ichiro; Yamamoto, Toshiki

    1992-01-01

    In nuclear power plants, large scale personnel exposure management systems have been developed and established by utilities. Though common in their basis, the implementations are plant-specific. Contractors must control their workers' exposures by their own methods and systems. To comply with the utilities' parental systems, contractors' systems tend to differ by plant, which makes it difficult for contractors to design a standard system common to all relevant plants. Under such circumstances, however, we have developed a system which is applicable to various customer utilities with minimal variations, using personal computers with database management and data communication software, at relatively low cost. We hope that this system will develop into the standard model for all Japanese contractors' personnel exposure management systems. (author)

  6. Computer code development plan for SMART design

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H.

    1999-03-01

    In accordance with the localization plan for nuclear reactor design driven since the mid-1980s, various computer codes have been transferred into the Korean nuclear industry through the technical transfer program from the worldwide major pressurized water reactor suppliers or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design work. As a result, design-related technologies have been satisfactorily accumulated. However, activities to develop native codes to substitute some important computer codes, whose usage is limited by the original technique owners, have been carried out rather poorly. Thus, it is most preferentially required to secure native techniques for the computer code package and analysis methodology in order to establish the capability required for the independent design of our own model of reactor. Moreover, differently from the large capacity loop-type commercial reactors, the SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, helical steam generator, passive residual heat removal system, etc. Considering those peculiar design characteristics of SMART, part of the design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those computer codes are not directly applicable to the design of an integral reactor such as SMART. Thus, they should be modified to deal with the peculiar design characteristics of SMART. In addition to the modification efforts, various codes should be developed in several design areas. Furthermore, modified or newly developed codes should be verified for reliability through benchmarking or tests for the target design. Thus, it is necessary to proceed the design according to the

  7. Computer code development plan for SMART design

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H

    1999-03-01

    In accordance with the localization plan for nuclear reactor design driven since the mid-1980s, various computer codes have been transferred into the Korean nuclear industry through the technical transfer program from the worldwide major pressurized water reactor suppliers or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design work. As a result, design-related technologies have been satisfactorily accumulated. However, activities to develop native codes to substitute some important computer codes, whose usage is limited by the original technique owners, have been carried out rather poorly. Thus, it is most preferentially required to secure native techniques for the computer code package and analysis methodology in order to establish the capability required for the independent design of our own model of reactor. Moreover, differently from the large capacity loop-type commercial reactors, the SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, helical steam generator, passive residual heat removal system, etc. Considering those peculiar design characteristics of SMART, part of the design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those computer codes are not directly applicable to the design of an integral reactor such as SMART. Thus, they should be modified to deal with the peculiar design characteristics of SMART. In addition to the modification efforts, various codes should be developed in several design areas. Furthermore, modified or newly developed codes should be verified for reliability through benchmarking or tests for the target design. Thus, it is necessary to proceed the design according to the

  8. Computer science teacher professional development in the United States: a review of studies published between 2004 and 2014

    Science.gov (United States)

    Menekse, Muhsin

    2015-10-01

    While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer science courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher professional development. In this study, the main goal was to systematically review the studies regarding computer science professional development to understand the scope, context, and effectiveness of these programs in the past decade (2004-2014). Based on 21 journal articles and conference proceedings, this study explored: (1) type of professional development organization and source of funding, (2) professional development structure and participants, (3) goal of professional development and type of evaluation used, (4) specific computer science concepts and training tools used, and (5) their effectiveness in improving teacher practice and student learning.

  9. The Effects of Computer Graphic Organizers on the Persuasive Writing of Hispanic Middle School Students with Specific Learning Disabilities

    Science.gov (United States)

    Unzueta, Caridad H.; Barbetta, Patricia M.

    2012-01-01

    A multiple baseline design investigated the effects of computer graphic organizers on the persuasive composition writing skills of four Hispanic students with specific learning disabilities. Participants reviewed the elements of persuasive writing and then developed compositions using a word processing program. Baseline planning was done with a…

  10. Assessment of CT dose to the fetus and pregnant female patient using patient-specific computational models

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Tianwu; Poletti, Pierre-Alexandre; Platon, Alexandra; Becker, Christoph D. [Geneva University Hospital, Department of Medical Imaging and Information Sciences, Geneva (Switzerland); Zaidi, Habib [Geneva University Hospital, Department of Medical Imaging and Information Sciences, Geneva (Switzerland); Geneva University, Geneva Neuroscience Center, Geneva (Switzerland); University Medical Center Groningen, Department of Nuclear Medicine and Molecular Imaging, University of Groningen, Groningen (Netherlands); University of Southern Denmark, Department of Nuclear Medicine, Odense (Denmark); Geneva University Hospital, Division of Nuclear Medicine and Molecular Imaging, Geneva (Switzerland)

    2018-03-15

    This work provides detailed estimates of the foetal dose from diagnostic CT imaging of pregnant patients to enable the assessment of the diagnostic benefits considering the associated radiation risks. To produce realistic biological and physical representations of pregnant patients and the embedded foetus, we developed a methodology for construction of patient-specific voxel-based computational phantoms based on existing standardised hybrid computational pregnant female phantoms. We estimated the maternal absorbed dose and foetal organ dose for 30 pregnant patients referred to the emergency unit of Geneva University Hospital for abdominal CT scans. The effective dose to the mother varied from 1.1 mSv to 2.0 mSv with an average of 1.6 mSv, while commercial dose-tracking software reported an average effective dose of 1.9 mSv (range 1.7-2.3 mSv). The foetal dose normalised to CTDIvol varies between 0.85 and 1.63 with an average of 1.17. The methodology for construction of personalised computational models can be exploited to estimate the patient-specific radiation dose from CT imaging procedures. Likewise, the dosimetric data can be used for assessment of the radiation risks to pregnant patients and the foetus from various CT scanning protocols, thus guiding the decision-making process. (orig.)
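
    The normalized coefficients reported above support a back-of-envelope estimate: multiplying a scanner-reported CTDIvol by the foetal-dose-to-CTDIvol ratio yields an approximate foetal dose. The CTDIvol value below is an assumed example, not a figure from the study.

        ctdi_vol = 8.0   # mGy, assumed scanner-reported value for the protocol
        ratios = {"low": 0.85, "average": 1.17, "high": 1.63}  # cohort range above

        for label, r in ratios.items():
            print(f"{label:>7}: foetal dose ~ {r * ctdi_vol:.1f} mGy")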

  11. Design and development of a diversified real time computer for future FBRs

    International Nuclear Information System (INIS)

    Sujith, K.R.; Bhattacharyya, Anindya; Behera, R.P.; Murali, N.

    2014-01-01

    The current safety related computer system of the Prototype Fast Breeder Reactor (PFBR) under construction in Kalpakkam consists of two redundant Versa Module Europa (VME) bus based Real Time Computer systems with a Switch Over Logic Circuit (SOLC). Since both VME systems are identical, the dual redundant system is prone to common cause failure (CCF). The probability of CCF can be reduced by adopting diversity. Design diversity has long been used to protect redundant systems against common-mode failures. The conventional notion of diversity relies on 'independent' generation of 'different' implementations. This paper discusses the design and development of a diversified Real Time Computer which will replace one of the computer systems in the dual redundant architecture. Compact PCI (cPCI) bus systems are widely used in safety critical applications such as avionics, railways and defence, and use diverse electrical signaling and logical specifications; cPCI was hence chosen for development of the diversified system. Towards the initial development, a CPU card based on an ARM-9 processor, a 16-channel Relay Output (RO) card and a 30-channel Analog Input (AI) card were developed. All the cards mentioned support hot-swap and geographic addressing capability. In order to mitigate the component obsolescence problem, the 32-bit PCI target controller and associated glue logic for the slave I/O cards were indigenously developed using VHDL. U-Boot was selected as the boot loader and ARM Linux 2.6 as the preliminary operating system for the CPU card. Board specific initialization code for the CPU card was written in ARM assembly language and serial port initialization was written in C language. The boot loader along with the Linux 2.6 kernel and a jffs2 file system was flashed into the CPU card. Test applications written in C language were used to test the various peripherals of the CPU card. Device drivers for the AI and RO cards were developed as Linux kernel modules and an application library was also

  12. COMPUTATIONAL MODELS FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar

    2011-01-01

    Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...

  13. A robust computational solution for automated quantification of a specific binding ratio based on [123I]FP-CIT SPECT images

    International Nuclear Information System (INIS)

    Oliveira, F. P. M.; Tavares, J. M. R. S.; Borges, Faria D.; Campos, Costa D.

    2014-01-01

    The purpose of the current paper is to present a computational solution to accurately quantify the specific to non-specific uptake ratio in [123I]FP-CIT single photon emission computed tomography (SPECT) images and simultaneously measure the spatial dimensions of the basal ganglia, also known as basal nuclei. A statistical analysis based on a reference dataset selected by the user is also automatically performed. The quantification of the specific to non-specific uptake ratio here is based on regions of interest defined after the registration of the image under study with a template image. The computational solution was tested on a dataset of 38 [123I]FP-CIT SPECT images: 28 images were from patients with Parkinson's disease and the remainder from normal patients, and the results of the automated quantification were compared to the ones obtained by three well-known semi-automated quantification methods. The results revealed a high correlation coefficient between the developed automated method and the three semi-automated methods used for comparison (r ≥ 0.975). The solution also showed good robustness against different positions of the patient, as an almost perfect agreement between the specific to non-specific uptake ratios was found (ICC = 1.000). The mean processing time was around 6 seconds per study using a common notebook PC. The solution developed can be useful for clinicians to evaluate [123I]FP-CIT SPECT images due to its accuracy, robustness and speed. Also, the comparison between case studies and the follow-up of patients can be done more accurately and proficiently since the intra- and inter-observer variability of the semi-automated calculation does not exist in automated solutions. The dimensions of the basal ganglia and their automatic comparison with the values of the population selected as reference are also important for professionals in this area.
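
    The quantity being automated is simple to state: the specific binding ratio is (mean striatal counts - mean reference counts) / mean reference counts, computed over ROIs obtained from the template registration. The sketch below evaluates it on synthetic data; the masks stand in for the template-derived regions.

        import numpy as np

        def specific_binding_ratio(image, striatal_mask, reference_mask):
            # SBR = (C_striatum - C_reference) / C_reference on a SPECT volume.
            c_str = image[striatal_mask].mean()   # specific + non-specific uptake
            c_ref = image[reference_mask].mean()  # non-specific uptake only
            return (c_str - c_ref) / c_ref

        img = np.random.poisson(20, size=(64, 64, 64)).astype(float)
        striatum = np.zeros(img.shape, dtype=bool); striatum[30:34, 24:28, 30:34] = True
        reference = np.zeros(img.shape, dtype=bool); reference[10:20, 40:50, 10:20] = True
        img[striatum] *= 3                        # simulate specific uptake
        print(f"SBR = {specific_binding_ratio(img, striatum, reference):.2f}")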

  14. Design and manufacturing of patient-specific orthodontic appliances by computer-aided engineering techniques.

    Science.gov (United States)

    Barone, Sandro; Neri, Paolo; Paoli, Alessandro; Razionale, Armando Viviano

    2018-01-01

    Orthodontic treatments are usually performed using fixed brackets or removable oral appliances, which are traditionally made from alginate impressions and wax registrations. Among removable devices, eruption guidance appliances are used for early orthodontic treatments in order to intercept and prevent malocclusion problems. Commercially available eruption guidance appliances, however, are symmetric devices produced in a few standard sizes. For this reason, they are not able to meet all the specific patient's needs, since actual dental anatomies present various geometries and asymmetric conditions. In this article, a computer-aided design-based methodology for the design and manufacturing of a patient-specific eruption guidance appliance is presented. The proposed approach is based on the digitalization of several steps of the overall process: from the digital reconstruction of patients' anatomies to the manufacturing of customized appliances. A finite element model has been developed to evaluate the temporomandibular joint disk stress level caused by using symmetric eruption guidance appliances with different teeth misalignment conditions. The developed model can then be used to guide the design of a patient-specific appliance with the aim of reducing patient discomfort. For this purpose, two different customization levels are proposed in order to address both arch and single tooth misalignment issues. A low-cost manufacturing process, based on an additive manufacturing technique, is finally presented and discussed.

  15. Implementing and developing cloud computing applications

    CERN Document Server

    Sarna, David E Y

    2010-01-01

    From small start-ups to major corporations, companies of all sizes have embraced cloud computing for the scalability, reliability, and cost benefits it can provide. It has even been said that cloud computing may have a greater effect on our lives than the PC and dot-com revolutions combined.Filled with comparative charts and decision trees, Implementing and Developing Cloud Computing Applications explains exactly what it takes to build robust and highly scalable cloud computing applications in any organization. Covering the major commercial offerings available, it provides authoritative guidan

  16. Development of an organ-specific insert phantom generated using a 3D printer for investigations of cardiac computed tomography protocols.

    Science.gov (United States)

    Abdullah, Kamarul A; McEntee, Mark F; Reed, Warren; Kench, Peter L

    2018-04-30

    An ideal organ-specific insert phantom should be able to simulate the anatomical features with appropriate appearances in the resultant computed tomography (CT) images. This study investigated a 3D printing technology to develop a novel and cost-effective cardiac insert phantom derived from volumetric CT image datasets of anthropomorphic chest phantom. Cardiac insert volumes were segmented from CT image datasets, derived from an anthropomorphic chest phantom of Lungman N-01 (Kyoto Kagaku, Japan). These segmented datasets were converted to a virtual 3D-isosurface of heart-shaped shell, while two other removable inserts were included using computer-aided design (CAD) software program. This newly designed cardiac insert phantom was later printed by using a fused deposition modelling (FDM) process via a Creatbot DM Plus 3D printer. Then, several selected filling materials, such as contrast media, oil, water and jelly, were loaded into designated spaces in the 3D-printed phantom. The 3D-printed cardiac insert phantom was positioned within the anthropomorphic chest phantom and 30 repeated CT acquisitions performed using a multi-detector scanner at 120-kVp tube potential. Attenuation (Hounsfield Unit, HU) values were measured and compared to the image datasets of real-patient and Catphan ® 500 phantom. The output of the 3D-printed cardiac insert phantom was a solid acrylic plastic material, which was strong, light in weight and cost-effective. HU values of the filling materials were comparable to the image datasets of real-patient and Catphan ® 500 phantom. A novel and cost-effective cardiac insert phantom for anthropomorphic chest phantom was developed using volumetric CT image datasets with a 3D printer. Hence, this suggested the printing methodology could be applied to generate other phantoms for CT imaging studies. © 2018 The Authors. Journal of Medical Radiation Sciences published by John Wiley & Sons Australia, Ltd on behalf of Australian Society of Medical

  17. Wide-angle display developments by computer graphics

    Science.gov (United States)

    Fetter, William A.

    1989-01-01

    Computer graphics can now expand its new subset, wide-angle projection, to be as significant a generic capability as computer graphics itself. Some prior work in computer graphics is presented which leads to an attractive further subset of wide-angle projection, called hemispheric projection, to be a major communication media. Hemispheric film systems have long been present and such computer graphics systems are in use in simulators. This is the leading edge of capabilities which should ultimately be as ubiquitous as CRTs (cathode-ray tubes). These assertions are not from degrees in science or only from a degree in graphic design, but in a history of computer graphics innovations, laying groundwork by demonstration. The author believes that it is timely to look at several development strategies, since hemispheric projection is now at a point comparable to the early stages of computer graphics, requiring similar patterns of development again.

  18. Computational design, construction, and characterization of a set of specificity determining residues in protein-protein interactions.

    Science.gov (United States)

    Nagao, Chioko; Izako, Nozomi; Soga, Shinji; Khan, Samia Haseeb; Kawabata, Shigeki; Shirai, Hiroki; Mizuguchi, Kenji

    2012-10-01

    Proteins interact with different partners to perform different functions and it is important to elucidate the determinants of partner specificity in protein complex formation. Although methods for detecting specificity determining positions have been developed previously, direct experimental evidence for these amino acid residues is scarce, and the lack of information has prevented further computational studies. In this article, we constructed a dataset that is likely to exhibit specificity in protein complex formation, based on available crystal structures and several intuitive ideas about interaction profiles and functional subclasses. We then defined a "structure-based specificity determining position (sbSDP)" as a set of equivalent residues in a protein family showing a large variation in their interaction energy with different partners. We investigated sequence and structural features of sbSDPs and demonstrated that their amino acid propensities significantly differed from those of other interacting residues and that the importance of many of these residues for determining specificity had been verified experimentally. Copyright © 2012 Wiley Periodicals, Inc.
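
    The sbSDP definition suggests a direct selection rule: flag aligned positions whose interaction energies vary strongly across partners. The sketch below applies a variance threshold to a placeholder energy matrix; both the matrix and the cutoff are assumptions for illustration.

        import numpy as np

        # rows = aligned residue positions, cols = interaction energy per partner
        energies = np.array([
            [-1.2, -1.1, -1.3, -1.2],   # position 1: similar with all partners
            [-0.2, -3.5, -0.1, -2.8],   # position 2: strongly partner-dependent
            [-2.0, -1.9, -2.1, -2.2],   # position 3
        ])

        variation = energies.var(axis=1)
        threshold = 1.0                 # assumed cutoff for "large variation"
        sbsdp_positions = np.where(variation > threshold)[0] + 1
        print("candidate sbSDPs at positions:", sbsdp_positions)   # -> [2]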

  19. A way forward for the development of an exposure computational model to computed tomography dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, C.C., E-mail: cassio.c.ferreira@gmail.co [Nucleo de Fisica, Universidade Federal de Sergipe, Itabaiana-SE, CEP 49500-000 (Brazil); Galvao, L.A., E-mail: lailagalmeida@gmail.co [Departamento de Fisica, Universidade Federal de Sergipe, Sao Cristovao-SE, CEP 49100-000 (Brazil); Vieira, J.W., E-mail: jose.wilson59@uol.com.b [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco, Recife-PE, CEP 50740-540 (Brazil); Escola Politecnica de Pernambuco, Universidade de Pernambuco, Recife-PE, CEP 50720-001 (Brazil); Maia, A.F., E-mail: afmaia@ufs.b [Departamento de Fisica, Universidade Federal de Sergipe, Sao Cristovao-SE, CEP 49100-000 (Brazil)

    2011-04-15

    A way forward for the development of an exposure computational model for computed tomography dosimetry is presented. In this way, an exposure computational model (ECM) for computed tomography (CT) dosimetry has been developed and validated through comparison with experimental results. For the development of the ECM, X-ray spectra generator codes have been evaluated and the head bow tie filter has been modelled through a mathematical equation. EGS4 and EGSnrc have been used by the ECM for simulating the radiation transport. Geometrical phantoms, commonly used in CT dosimetry, have been modelled with the IDN software. MAX06 has also been used to simulate an adult male patient submitted to CT examinations. The evaluation of the X-ray spectra generator codes in CT dosimetry showed dependence on tube filtration (or HVL value). More generally, with the increment of total filtration (or HVL value), X-raytbc becomes the best X-ray spectra generator code for CT dosimetry. The EGSnrc/X-raytbc combination calculated C100,c in better concordance with the C100,c measured in two different CT scanners. For a Toshiba CT scanner, the average percentage difference between the calculated and measured C100,c values was 8.2%, whilst for a GE CT scanner the average percentage difference was 10.4%. From measurements of air kerma through a prototype head bow tie filter, a third-order exponential decay equation was found. C100,c and C100,p values calculated by the ECM are in good agreement with values measured at a specific CT scanner. A maximum percentage difference of 2% has been found in the PMMA CT head phantoms, demonstrating effective modelling of the head bow tie filter by the equation. The absorbed and effective doses calculated by the ECM developed in this work have been compared to those calculated by the ECM of Jones and Shrimpton for an adult male patient. For a head examination the absorbed dose values calculated by the
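
    Fitting a third-order exponential decay to air-kerma measurements can be sketched as below; the functional form matches the description, but the data points and coefficients are synthetic assumptions, since the record quotes none.

        import numpy as np
        from scipy.optimize import curve_fit

        def bowtie(x, a1, t1, a2, t2, a3, t3):
            # Third-order exponential decay: K(x) = sum_i a_i * exp(-x / t_i).
            return a1*np.exp(-x/t1) + a2*np.exp(-x/t2) + a3*np.exp(-x/t3)

        x = np.linspace(0, 20, 21)                    # cm off-axis, assumed
        k = 1.0*np.exp(-x/2) + 0.5*np.exp(-x/7) + 0.2*np.exp(-x/30)
        k += np.random.normal(0, 0.005, x.size)       # synthetic measurement noise

        popt, _ = curve_fit(bowtie, x, k, p0=[1, 1, 0.5, 5, 0.2, 20], maxfev=10000)
        print("fitted amplitudes and decay constants:", np.round(popt, 3))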

  20. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    Energy Technology Data Exchange (ETDEWEB)

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  1. The Effectiveness of Computer-Assisted Instruction for Teaching Mathematics to Students with Specific Learning Disability

    Science.gov (United States)

    Stultz, Sherry L.

    2013-01-01

    Using computers to teach students is not a new idea. Computers have been utilized for educational purposes for over 80 years. However, the effectiveness of these programs for teaching mathematics to students with specific learning disability is unclear. This study was undertaken to determine if computer-assisted instruction was as effective as…

  2. Computational modeling of the mathematical phantoms of the Brazilian woman to internal dosimetry calculations and for comparison of the absorbed fractions with specific reference women

    International Nuclear Information System (INIS)

    Ximenes, Edmir; Guimaraes, Maria Ines C. C.

    2008-01-01

    The theme of this work is the study of the concept of mathematical dummies - also called phantoms - used in internal dosimetry and radiation protection, from the perspective of computer simulations. In this work, the mathematical phantom of the Brazilian woman was developed, to be used as the basis for calculations of Specific Absorbed Fractions (SAFs) in the body's organs and skeleton for diagnostic or therapeutic purposes in nuclear medicine. The phantom now developed is similar in form to the Snyder phantom, making it more realistic for the anthropomorphic conditions of Brazilian women. To this end, the Monte Carlo formalism was used, through computer modeling. As a contribution to the objectives of this study, the computer system cFAE - consultation of Specific Absorbed Fractions - was developed and implemented, which provides the researcher with a versatile query tool

  3. The level 1 and 2 specification for parallel benchmark and a benchmark test of scalar-parallel computer SP2 based on the specifications

    International Nuclear Information System (INIS)

    Orii, Shigeo

    1998-06-01

    A benchmark specification for performance evaluation of parallel computers for numerical analysis is proposed. The Level 1 benchmark, which is a conventional type of benchmark using processing time, measures the performance of computers running a code. The Level 2 benchmark proposed in this report is intended to give the reasons for that performance. As an example, the scalar-parallel computer SP2 is evaluated with this benchmark specification in the case of a molecular dynamics code. As a result, the main causes suppressing parallel performance are the maximum bandwidth and the start-up time of communication between nodes. In particular, the start-up time is proportional not only to the number of processors but also to the number of particles. (author)
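
    The Level 2 finding can be captured in a simple cost model: per-message time is a start-up term, growing with the number of processors and the number of particles, plus message size divided by bandwidth. All constants in the sketch below are illustrative, not measured SP2 values.

        def comm_time(n_bytes, n_procs, n_particles,
                      t0=40e-6, t_per_proc=5e-6, t_per_particle=1e-9, bw=35e6):
            # Start-up latency grows with processor and particle counts, as the
            # benchmark observed; the transfer term is size over bandwidth.
            startup = t0 + t_per_proc * n_procs + t_per_particle * n_particles
            return startup + n_bytes / bw            # seconds per message

        for p in (4, 16, 64):
            t = comm_time(n_bytes=64_000, n_procs=p, n_particles=100_000)
            print(f"{p:3d} processors: {t * 1e3:.2f} ms per exchange")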

  4. Development of algorithm for continuous generation of a computer game in terms of usability and optimization of developed code in computer science

    Directory of Open Access Journals (Sweden)

    Tibor Skala

    2018-03-01

    Full Text Available As both hardware and software have become increasingly available and are constantly being developed, they contribute globally to improvements in every field of technology and the arts. Digital tools for the creation and processing of graphical content are well developed and designed to shorten the time required for content creation, which is, in this case, animation. Since contemporary animation has experienced a surge in visual styles and visualization methods, programming is built into nearly everything currently in use. There is no doubt that a variety of algorithms and software act as the brain and moving force behind any idea created for a specific purpose and applicability in society. Art and technology combined make a direct and oriented medium for publishing and marketing in every industry, including those which are not necessarily closely related to fields that rely heavily on the visual aspect of work. Additionally, the quality and consistency of an algorithm will depend on its proper integration into the system powered by that algorithm, as well as on the way the algorithm is designed. The development of an endless algorithm and its effective use are demonstrated through the use of a computer game. In order to present the effect of various parameters, in the final phase of the computer game development the endless algorithm was tested with a varying number of key input parameters (achieved time, score reached, pace of the game).

  5. Web Program for Development of GUIs for Cluster Computers

    Science.gov (United States)

    Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward

    2003-01-01

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.

  6. Computing in Qualitative Analysis: A Healthy Development?

    Science.gov (United States)

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  7. Investigation of hemodynamics in the development of dissecting aneurysm within patient-specific dissecting aneurismal aortas using computational fluid dynamics (CFD) simulations.

    Science.gov (United States)

    Tse, Kwong Ming; Chiu, Peixuan; Lee, Heow Pueh; Ho, Pei

    2011-03-15

    Aortic dissecting aneurysm is one of the most catastrophic cardiovascular emergencies that carries high mortality. It was pointed out from clinical observations that the aneurysm development is likely to be related to the hemodynamics condition of the dissected aorta. In order to gain more insight on the formation and progression of dissecting aneurysm, hemodynamic parameters including flow pattern, velocity distribution, aortic wall pressure and shear stress, which are difficult to measure in vivo, are evaluated using numerical simulations. Pulsatile blood flow in patient-specific dissecting aneurismal aortas before and after the formation of lumenal aneurysm (pre-aneurysm and post-aneurysm) is investigated by computational fluid dynamics (CFD) simulations. Realistic time-dependent boundary conditions are prescribed at various arteries of the complete aorta models. This study suggests the helical development of false lumen around true lumen may be related to the helical nature of hemodynamic flow in aorta. Narrowing of the aorta is responsible for the massive recirculation in the poststenosis region in the lumenal aneurysm development. High pressure difference of 0.21 kPa between true and false lumens in the pre-aneurismal aorta infers the possible lumenal aneurysm site in the descending aorta. It is also found that relatively high time-averaged wall shear stress (in the range of 4-8 kPa) may be associated with tear initiation and propagation. CFD modeling assists in medical planning by providing blood flow patterns, wall pressure and wall shear stress. This helps to understand various phenomena in the development of dissecting aneurysm. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. Risk-based technical specifications: Development and application of an approach to the generation of a plant specific real-time risk model

    International Nuclear Information System (INIS)

    Puglia, B.; Gallagher, D.; Amico, P.; Atefi, B.

    1992-10-01

    This report describes a process developed to convert an existing PRA into a model amenable to real-time, risk-based technical specification calculations. In earlier studies (culminating in NUREG/CR-5742), several risk-based approaches to technical specifications were evaluated. A real-time approach using a plant-specific PRA capable of modeling plant configurations as they change was identified as the most comprehensive approach to control plant risk. A master fault tree logic model representative of all of the core damage sequences was developed. Portions of the system fault trees were modularized, and supercomponents comprised of component failures with similar effects were developed to reduce the size of the model and the quantification times. Modifications to the master fault tree logic were made to properly model the effect of maintenance and recovery actions. Fault trees representing several actuation systems not modeled in detail in the existing PRA were added to the master fault tree logic. This process was applied to the Surry NUREG-1150 Level 1 PRA. The master logic model was confirmed. The model was then used to evaluate the core damage frequency associated with several plant configurations using the IRRAS code. For all cases analyzed, computational time was less than three minutes. This document, Volume 2, contains appendices A, B, and C. These provide, respectively: the Surry Technical Specifications Model Database, the Surry Technical Specifications Model, and a list of supercomponents used in the Surry Technical Specifications Model
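
    The abstract does not show how a plant configuration is quantified. A standard way PRA tools such as IRRAS evaluate a configuration is the minimal-cut-set upper bound on the top-event probability; the sketch below illustrates that textbook formula with hypothetical basic events (taking a train out for maintenance is modeled by setting its event probability to 1). It is not the report's actual model.

```python
import math

def mcs_upper_bound(cut_sets, p):
    """Minimal-cut-set upper bound on the top-event probability:
    P(top) <= 1 - prod_i(1 - P(MCS_i)), where P(MCS_i) is the product of
    its basic-event probabilities (independence assumed)."""
    prod = 1.0
    for cs in cut_sets:
        prod *= 1.0 - math.prod(p[e] for e in cs)
    return 1.0 - prod

# Hypothetical configuration: pump train A out for maintenance (p = 1.0).
p = {"PUMP_A": 1.0, "PUMP_B": 1e-3, "DG_1": 5e-3, "VALVE_C": 2e-4}
cut_sets = [{"PUMP_A", "PUMP_B"}, {"DG_1", "VALVE_C"}]
print(f"configuration risk ~ {mcs_upper_bound(cut_sets, p):.2e}")
```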

  9. Development of a computer-aided digital reactivity computer system for PWRs

    International Nuclear Information System (INIS)

    Chung, S.-K.; Sung, K.-Y.; Kim, D.; Cho, D.-Y.

    1993-01-01

    Reactor physics tests at initial startup and after reloading are performed to verify the nuclear design and to ensure safe operation. Two kinds of reactivity computers, analog and digital, have been widely used in pressurized water reactor (PWR) core physics tests. The test data of both reactivity computers are displayed only on a strip chart recorder, and these data are managed by hand, so the accuracy of the test results depends on operator expertise and experience. This paper describes the development of the computer-aided digital reactivity computer system (DRCS), which is enhanced by system management software and an improved system for the application of the PWR core physics test
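
    The abstract does not state the algorithm, but digital reactivity computers conventionally implement inverse point kinetics: solving the point-kinetics equations for reactivity given a measured flux trace. A minimal sketch follows, using typical six-group delayed-neutron constants for U-235 and an assumed neutron generation time; the values are illustrative, not the DRCS's.

```python
import numpy as np

# Illustrative six-group delayed-neutron data (typical U-235 thermal values).
beta_i = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])
lam    = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # 1/s
beta, LAMBDA = beta_i.sum(), 2.0e-5   # generation time in s (assumed)

def inverse_kinetics(n, dt):
    """Reactivity history, in dollars, from a measured neutron-flux trace n(t):
    rho = beta + Lambda*(dn/dt)/n - (Lambda/n) * sum_i lambda_i * C_i."""
    C = beta_i * n[0] / (LAMBDA * lam)        # equilibrium precursor densities
    rho = np.zeros_like(n, dtype=float)
    decay = np.exp(-lam * dt)
    for k in range(1, len(n)):
        dn_dt = (n[k] - n[k - 1]) / dt
        rho[k] = beta + LAMBDA * dn_dt / n[k] - LAMBDA * np.dot(lam, C) / n[k]
        # exact precursor update, assuming flux constant over the step
        C = C * decay + beta_i * n[k] * (1.0 - decay) / (LAMBDA * lam)
    return rho / beta

# A steady flux trace should read ~0 dollars of reactivity.
print(inverse_kinetics(np.ones(500), dt=0.02)[-1])
```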

  10. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  11. User Interface Technology for Formal Specification Development

    Science.gov (United States)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.

  12. Development of x-ray computed tomographic scanner for iron and steel

    International Nuclear Information System (INIS)

    Taguchi, Isamu; Nakamura, Shigeo.

    1985-01-01

    X-ray computed tomography is extensively used in medicine, but has rarely been applied to non-medical purposes. Steel specimens pose particularly difficult problems: very poor transmission of X-rays and the need for high resolving capability. There has thus been no effective tomographic method of examining steel specimens. Due to the growing need for non-destructive, non-contact methods for observing and analyzing the internal conditions of steel microscopically, however, we have developed an X-ray Computed Tomographic Scanner for Steel (CTS) system, specifically for examination of steel specimens. Its major specifications and functions are as follows. Type: second-generation CT, 8 channels; Scanning method: 6° revolution, 30-times traversing; Slice width: 0.5 mm; Resolving capability: 0.25 x 0.25 mm; X-ray source: 420 kV, 3 mA; X-ray detector: BGO scintillator; Standard specimen shape: 50 mm dia., 100 mm high; Measuring time: 10.5 min. Porosity of a stainless steel (SUS 304) bloom was examined three-dimensionally by the CTS system. The corrosion process of a steel slab was also examined. (author)

  13. Frequency of educational computer use as a longitudinal predictor of educational outcome in young people with specific language impairment.

    Directory of Open Access Journals (Sweden)

    Kevin Durkin

    Full Text Available Computer use draws on linguistic abilities. Using this medium thus presents challenges for young people with Specific Language Impairment (SLI and raises questions of whether computer-based tasks are appropriate for them. We consider theoretical arguments predicting impaired performance and negative outcomes relative to peers without SLI versus the possibility of positive gains. We examine the relationship between frequency of computer use (for leisure and educational purposes and educational achievement; in particular examination performance at the end of compulsory education and level of educational progress two years later. Participants were 49 young people with SLI and 56 typically developing (TD young people. At around age 17, the two groups did not differ in frequency of educational computer use or leisure computer use. There were no associations between computer use and educational outcomes in the TD group. In the SLI group, after PIQ was controlled for, educational computer use at around 17 years of age contributed substantially to the prediction of educational progress at 19 years. The findings suggest that educational uses of computers are conducive to educational progress in young people with SLI.

  14. Development of computer program ENAUDIBL for computation of the sensation levels of multiple, complex, intrusive sounds in the presence of residual environmental masking noise

    Energy Technology Data Exchange (ETDEWEB)

    Liebich, R. E.; Chang, Y.-S.; Chun, K. C.

    2000-03-31

    The relative audibility of multiple sounds occurs in separate, independent channels (frequency bands) termed critical bands or equivalent rectangular (filter-response) bandwidths (ERBs) of frequency. The true nature of human hearing is a function of a complex combination of subjective factors, both auditory and nonauditory. Assessment of the probability of individual annoyance, community-complaint reaction levels, speech intelligibility, and the most cost-effective mitigation actions requires sensation-level data; these data are one of the most important auditory factors. However, sensation levels cannot be calculated by using single-number, A-weighted sound level values. This paper describes specific steps to compute sensation levels. A unique, newly developed procedure is used, which simplifies and improves the accuracy of such computations by the use of maximum sensation levels that occur, for each intrusive-sound spectrum, within each ERB. The newly developed program ENAUDIBL makes use of ERB sensation-level values generated with some computational subroutines developed for the formerly documented program SPECTRAN.
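
    For readers unfamiliar with ERBs, the widely used Glasberg-Moore (1990) approximation gives the equivalent rectangular bandwidth as a function of center frequency; the sketch below uses it together with the usual definition of sensation level (band level above the masked threshold). These are standard textbook forms, not necessarily the exact ones coded in ENAUDIBL.

```python
def erb_width(fc_hz):
    """Glasberg-Moore (1990) equivalent rectangular bandwidth in Hz for a
    filter centered at fc_hz (a standard approximation; the exact bandwidths
    used inside ENAUDIBL are not given in the abstract)."""
    return 24.7 * (4.37 * fc_hz / 1000.0 + 1.0)

def sensation_level(band_level_db, masked_threshold_db):
    """Sensation level within one band: intrusive-sound band level minus the
    masked threshold set by residual background noise in that band (dB)."""
    return band_level_db - masked_threshold_db

for fc in (125, 500, 1000, 4000):
    print(f"{fc:>5} Hz: ERB ~ {erb_width(fc):6.1f} Hz")
print(sensation_level(band_level_db=62.0, masked_threshold_db=45.0), "dB SL")
```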

  15. Computer Graphics for Multimedia and Hypermedia Development.

    Science.gov (United States)

    Mohler, James L.

    1998-01-01

    Discusses several theoretical and technical aspects of computer-graphics development that are useful for creating hypermedia and multimedia materials. Topics addressed include primary bitmap attributes in computer graphics, the jigsaw principle, and raster layering. (MSE)

  16. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  17. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

    OpenAIRE

    Witt, Hendrik

    2007-01-01

    The research presented in this thesis examines user interfaces for wearable computers. Wearable computers are a special kind of mobile computers that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can. The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting softw...

  18. Report on evaluation of research and development of superhigh-function electronic computers; Chokoseino denshi keisanki no kenkyu kaihatsu ni kansuru hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1973-02-20

    Described herein is the development of superhigh-function electronic computers. This project was implemented as a 6-year joint project, beginning in FY 1966, by the government, industrial and academic circles, with the objective of developing standard large computers comparable, by the beginning of the 1970s, with the highest-performance machines in the world. The computers developed by this project met almost all of the specifications of the world's representative large commercial computers, partly surpassing them. In particular, the integration of the virtual memory, buffer memory and multi-processor functions, which were considered to be the central technical features of the next generation of computers, into one system was a uniquely Japanese concept, not seen in other countries. Other developments considered to have great ripple effects are the LSIs and the techniques for utilizing and mounting them and for improving their reliability. Development of magnetic discs is another notable result for the peripheral devices. Development of the input/output devices was undertaken to support the inputting, outputting and reading of Chinese characters, which are characteristic of Japan. The software developed has sufficient functions for common use and is considered to be a world-leading, large-scale operating system, although its evaluation largely awaits actual results in service. (NEDO)

  19. Development of Student Information Management System based on Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    Ibrahim A. ALAMERI

    2017-10-01

    Full Text Available The management and provision of information about the educational process is an essential part of effective management of the educational process in institutes of higher education. In this paper, the requirements of a reliable student management system are analyzed, a use-case model of the student information management system is formed, and the architecture of the application is designed and implemented. Regarding the implementation process, modern approaches were used to develop and deploy a reliable online application specifically for cloud computing environments.

  20. Computer-assisted Particle-in-Cell code development

    International Nuclear Information System (INIS)

    Kawata, S.; Boonmee, C.; Teramoto, T.; Drska, L.; Limpouch, J.; Liska, R.; Sinor, M.

    1997-12-01

    This report presents a new approach to electromagnetic Particle-in-Cell (PIC) code development by a computer: in general, PIC codes have a common structure, consisting of a particle pusher, a field solver, charge and current density collections, and a field interpolation. Because of this common feature, the main part of a PIC code can be mechanically developed on a computer. In this report we use the packages FIDE and GENTRAN of the REDUCE computer algebra system for discretizations of field equations and a particle equation, and for automatic generation of Fortran codes. The approach proposed is successfully applied to the development of a 1.5-dimensional PIC code. By using the generated PIC code, the Weibel instability in a plasma is simulated. The obtained growth rate agrees well with the theoretical value. (author)
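
    To make the "common structure" concrete, the sketch below walks once through the four stages named in the abstract (charge deposition, field solve, field interpolation, particle push) for a 1-D electrostatic PIC step in normalized units. It illustrates the generic structure only; the paper's generated code is a 1.5-dimensional electromagnetic Fortran PIC produced via REDUCE.

```python
import numpy as np

def pic_step(x, v, rho0, dx, dt, L, qm=-1.0):
    """One cycle of the four stages common to PIC codes (1-D electrostatic
    sketch in normalized units; unit negative charge per particle).
    x, v: particle positions/velocities; rho0: neutralizing background."""
    ng = int(round(L / dx))
    # 1) charge deposition (nearest-grid-point weighting for brevity)
    idx = np.floor(x / dx + 0.5).astype(int) % ng
    rho = rho0 + np.bincount(idx, minlength=ng) * (-1.0 / dx)
    # 2) field solve: Poisson d2phi/dx2 = -rho via FFT, then E = -dphi/dx
    k = 2.0 * np.pi * np.fft.fftfreq(ng, d=dx)
    k[0] = 1.0                       # avoid divide-by-zero; mean mode dropped
    phi_k = np.fft.fft(rho) / k**2
    phi_k[0] = 0.0
    E = np.real(np.fft.ifft(-1j * k * phi_k))
    # 3) field interpolation back to the particle positions
    Ep = E[idx]
    # 4) particle push (leapfrog) with periodic boundaries
    v = v + qm * Ep * dt
    x = (x + v * dt) % L
    return x, v, E
```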

  1. Dopamine Receptor-Specific Contributions to the Computation of Value.

    Science.gov (United States)

    Burke, Christopher J; Soutschek, Alexander; Weber, Susanna; Raja Beharelle, Anjali; Fehr, Ernst; Haker, Helene; Tobler, Philippe N

    2018-05-01

    Dopamine is thought to play a crucial role in value-based decision making. However, the specific contributions of different dopamine receptor subtypes to the computation of subjective value remain unknown. Here we demonstrate how the balance between D1 and D2 dopamine receptor subtypes shapes subjective value computation during risky decision making. We administered the D2 receptor antagonist amisulpride or placebo before participants made choices between risky options. Compared with placebo, D2 receptor blockade resulted in more frequent choice of higher risk and higher expected value options. Using a novel model fitting procedure, we concurrently estimated the three parameters that define individual risk attitude according to an influential theoretical account of risky decision making (prospect theory). This analysis revealed that the observed reduction in risk aversion under amisulpride was driven by increased sensitivity to reward magnitude and decreased distortion of outcome probability, resulting in more linear value coding. Our data suggest that different components that govern individual risk attitude are under dopaminergic control, such that D2 receptor blockade facilitates risk taking and expected value processing.
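
    The abstract names the three risk-attitude parameters only loosely. As an illustration of how such a model assigns subjective value to a simple risky option, the sketch below uses the standard Tversky-Kahneman power value function and probability-weighting function; the paper's exact parameterization is not given in the abstract, so the functional forms and numbers here are assumptions.

```python
def subjective_value(x, p, alpha, gamma):
    """Prospect-theory value of a simple gamble (win x with probability p),
    using standard Tversky-Kahneman forms (illustrative, not the paper's
    exact three-parameter specification).
    alpha: sensitivity to reward magnitude (alpha < 1 gives a concave value
           function; larger alpha means stronger magnitude sensitivity).
    gamma: probability-weighting curvature (gamma -> 1 means linear,
           i.e. less distortion of outcome probability)."""
    v = x ** alpha
    w = p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)
    return w * v

# D2 blockade as reported: higher magnitude sensitivity, less distortion.
print(subjective_value(x=20.0, p=0.4, alpha=0.7, gamma=0.6))  # placebo-like
print(subjective_value(x=20.0, p=0.4, alpha=0.9, gamma=0.9))  # amisulpride-like
```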

  2. Starpc: a library for communication among tools on a parallel computer cluster. User's and developer's guide to Starpc

    International Nuclear Information System (INIS)

    Takemiya, Hiroshi; Yamagishi, Nobuhiro

    2000-02-01

    We report on an RPC (Remote Procedure Call)-based communication library, Starpc, for a parallel computer cluster. Starpc supports communication between Java Applets and C programs as well as between C programs. Starpc has the following three features. (1) It enables communication between Java Applets and C programs on an arbitrary computer without violating security, even though Java Applets are normally restricted, for security reasons, to communicating only with programs on a specific computer (the Web server). (2) Diverse network communication protocols are available on Starpc, because it uses the Nexus communication library developed at Argonne National Laboratory. (3) It works on many kinds of computers, including eight parallel computers and four workstation servers. In this report, the usage of Starpc and the development of applications using Starpc are described. (author)

  3. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Requirement Specifications for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission... issuing a revised regulatory guide (RG), revision 1 of RG 1.172, ``Software Requirement Specifications for...

  4. Development of emission computed tomography in Japan

    International Nuclear Information System (INIS)

    Tanaka, E.

    1984-01-01

    Two positron emission computed tomography (PCT) devices developed in Japan are described. One is for the head and the other for the whole body. Because they were developed on the basis of filtered back-projection, the devices produce fairly quantitative images with only slight modifications of the existing algorithms. The PCT device seems to be better than single photon emission computed tomography (SPECT), since it provides adequate compensation for photon attenuation in patients. (M.A.C.) [pt
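
    The claimed advantage over SPECT rests on a simple fact: for coincidence detection, the attenuation factor along a line of response is exp(-∫μ dl) over the whole chord, independent of where on the chord the annihilation occurred, so it can be corrected exactly from a transmission measurement. A small numeric sketch follows, with an illustrative μ value for water at 511 keV.

```python
import numpy as np

def pet_attenuation_factor(mu, dl):
    """Coincidence-detection survival probability along one line of response:
    exp(-sum(mu * dl)) over the full chord. For PET this is independent of
    the annihilation position on the chord, which is why attenuation can be
    compensated exactly from a transmission scan."""
    return np.exp(-np.sum(np.asarray(mu) * dl))

# 20 cm of water-equivalent tissue, mu ~ 0.096 / cm for 511 keV photons.
mu_map = np.full(40, 0.096)                     # 40 samples of 0.5 cm each
print(pet_attenuation_factor(mu_map, dl=0.5))   # ~0.15 survival probability
```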

  5. GPU-accelerated Lattice Boltzmann method for anatomical extraction in patient-specific computational hemodynamics

    Science.gov (United States)

    Yu, H.; Wang, Z.; Zhang, C.; Chen, N.; Zhao, Y.; Sawchuk, A. P.; Dalsing, M. C.; Teague, S. D.; Cheng, Y.

    2014-11-01

    Existing research on patient-specific computational hemodynamics (PSCH) relies heavily on software for anatomical extraction of blood arteries. Data reconstruction and mesh generation have to be done using existing commercial software due to the gap between medical image processing and CFD, which increases the computational burden, introduces inaccuracy during data transformation, and thus limits the medical applications of PSCH. We use the lattice Boltzmann method (LBM) to solve the level-set equation over an Eulerian distance field and implicitly and dynamically segment the artery surfaces from radiological CT/MRI imaging data. The segments feed seamlessly into the LBM-based CFD computation of PSCH, so explicit mesh construction and extra data management are avoided. The LBM is ideally suited for GPU (graphics processing unit)-based parallel computing. The parallel acceleration over GPU achieves excellent performance in PSCH computation. An application study will be presented which segments an aortic artery from a chest CT dataset and models the PSCH of the segmented artery.
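
    The record leans on LBM's site-local update rule, which is what makes it GPU-friendly. The sketch below shows a generic D2Q9 BGK collide-and-stream step on a periodic grid; it illustrates only the basic method, not the authors' GPU level-set solver.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def lbm_step(f, tau=0.6):
    """One BGK collide-and-stream update on a periodic grid.
    f has shape (9, ny, nx). Each site updates from local data only,
    which is why the method maps so well onto GPUs."""
    rho = f.sum(axis=0)                                  # density
    u = np.einsum('qi,qyx->iyx', c, f) / rho             # velocity
    cu = np.einsum('qi,iyx->qyx', c, u)                  # c_q . u
    usq = (u**2).sum(axis=0)
    feq = w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    f = f - (f - feq) / tau                              # BGK collision
    for q in range(9):                                   # streaming
        f[q] = np.roll(f[q], shift=c[q], axis=(1, 0))
    return f
```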

  6. Computer aided training system development

    International Nuclear Information System (INIS)

    Midkiff, G.N.

    1987-01-01

    The first three phases of Training System Development (TSD) -- job and task analysis, curriculum design, and training material development -- are time consuming and labor intensive. The use of personal computers with a combination of commercial and custom-designed software resulted in a significant reduction in the man-hours required to complete these phases for a Health Physics Technician Training Program at a nuclear power station. This paper reports that each step in the training program project involved the use of personal computers: job survey data were compiled with a statistical package, task analysis was performed with custom software designed to interface with a commercial database management program. Job Performance Measures (tests) were generated by a custom program from data in the task analysis database, and training materials were drafted, edited, and produced using commercial word processing software

  7. Computer-Assisted Mathematics Instruction for Students with Specific Learning Disability: A Review of the Literature

    Science.gov (United States)

    Stultz, Sherry L.

    2017-01-01

    This review was conducted to evaluate the current body of scholarly research regarding the use of computer-assisted instruction (CAI) to teach mathematics to students with specific learning disability (SLD). For many years, computers have been utilized for educational purposes. However, the effectiveness of CAI for teaching mathematics to this specific…

  8. Accounting valuation development of specific assets

    Directory of Open Access Journals (Sweden)

    I.V. Zhigley

    2017-12-01

    Full Text Available The current issues in the development of accounting valuation are considered. The necessity of developing accounting valuation in the context of neo-institutional theory principles is grounded on the basis of a number of reasons. The reasons for the deterioration of the reputation of accounting as a separate socio-economic institute, in the context of developing the methodology for specific-asset accounting, are identified. The system of normative regulation of the accounting valuation of enterprise non-current assets in the case of the diminishing of their usefulness is analyzed. The procedure for determining and accounting for the impairment of assets in accordance with IAS 36 «Impairment of Assets» is developed. The features of the joint use of the concepts of «value in use» and «fair value» in the accounting system are disclosed. The procedure for determining the recoverable amount depending on the degree of specificity of assets is developed. The necessity to clarify the features that indicate the possibility of the diminishing of the usefulness of specific assets (termination or pre-term termination of the contract for the use of a specific asset) is grounded.

  9. Integrating aerodynamic surface modeling for computational fluid dynamics with computer aided structural analysis, design, and manufacturing

    Science.gov (United States)

    Thorp, Scott A.

    1992-01-01

    This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.

  10. Cloud Computing-An Ultimate Technique to Minimize Computing cost for Developing Countries

    OpenAIRE

    Narendra Kumar; Shikha Jain

    2012-01-01

    The presented paper deals with how remotely managed computing and IT resources can be beneficial in developing countries like India and other Asian sub-continent countries. This paper not only defines the architectures and functionalities of cloud computing but also strongly indicates the current demand for cloud computing to achieve organizational and personal IT support at very minimal cost with high flexibility. The power of cloud can be used to reduce the cost of IT - r...

  11. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  12. Development of an educational partnership for enhancement of a computer risk assessment model

    International Nuclear Information System (INIS)

    Topper, K.

    1995-02-01

    The Multimedia Environmental Pollutant Assessment System (MEPAS) is a computer program which evaluates exposure pathways for chemical and radioactive releases according to their potential human health impacts. MEPAS simulates the exposure pathways through standard source-to-receptor transport principles, using a multimedia approach (air, groundwater, overland flow, soil, surface water) in conjunction with specific chemical exposure considerations. This model was originally developed by Pacific Northwest Laboratory (PNL) to prioritize environmental concerns at potentially contaminated US Department of Energy (DOE) sites. Currently, MEPAS is being used to evaluate a range of environmental problems that are not restricted to DOE sites. A partnership was developed between PNL and Mesa State College during 1991. This partnership involves the use of undergraduate students, faculty, and PNL personnel to complete enhancements to MEPAS. This has led to major refinements to the original MEPAS shell for DOE in a very cost-effective manner. PNL was awarded a 1993 Federal Laboratory Consortium Award, and Mesa State College was awarded an Environmental Restoration and Waste Management Distinguished Faculty Award from DOE in 1993 as a result of this collaboration. The college has benefited through the use of MEPAS within laboratories and through the applied experience gained by the students. Development of this partnership will be presented with the goal of allowing other DOE facilities to replicate this program. It is specifically recommended that DOE establish funded programs which support this type of relationship on an ongoing basis. Additionally, specific enhancements to MEPAS will be presented through computer display of the program

  13. Development and validation of Monte Carlo dose computations for contrast-enhanced stereotactic synchrotron radiation therapy

    International Nuclear Information System (INIS)

    Vautrin, M.

    2011-01-01

    Contrast-enhanced stereotactic synchrotron radiation therapy (SSRT) is an innovative technique based on localized dose-enhancement effects obtained by reinforced photoelectric absorption in the tumor. Medium-energy monochromatic X-rays (50 - 100 keV) are used for irradiating tumors previously loaded with a high-Z element. Clinical trials of SSRT are being prepared at the European Synchrotron Radiation Facility (ESRF), where an iodinated contrast agent will be used. In order to compute the energy deposited in the patient (dose), a dedicated treatment planning system (TPS) has been developed for the clinical trials, based on the ISOgray TPS. This work focuses on the SSRT-specific modifications of the TPS, especially to the PENELOPE-based Monte Carlo dose engine. The TPS uses a dedicated Monte Carlo simulation of medium-energy polarized photons to compute the deposited energy in the patient. Simulations are performed considering the synchrotron source, the modeled beamline geometry and finally the patient. Specific materials were also implemented in the voxelized geometry of the patient, to account for iodine concentrations in the tumor. The computation process has been optimized and parallelized. Finally, a specific computation of absolute doses and associated irradiation times (instead of monitor units) was implemented. The dedicated TPS was validated with depth dose curves, dose profiles and absolute dose measurements performed at the ESRF in a water tank and solid water phantoms with or without bone slabs. (author) [fr
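
    The abstract does not detail the Monte Carlo engine, but the kernel step of any photon-transport code in the PENELOPE family is sampling the distance to the next interaction from an exponential free-path distribution. A minimal sketch follows, with illustrative attenuation coefficients; the boosted value for iodine-loaded tissue is an assumed number standing in for the photoelectric dose-enhancement effect at 50-100 keV.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_path_lengths(mu, n):
    """Distance to the next photon interaction in a homogeneous medium:
    inverse-CDF sampling of p(s) = mu * exp(-mu * s), the core free-path
    step of any photon Monte Carlo transport code."""
    return -np.log(rng.random(n)) / mu

# Illustrative linear attenuation coefficients at ~80 keV (1/cm):
mu_tissue = 0.18
mu_iodine_loaded = 0.55   # assumed value for iodinated tumor tissue
print(sample_path_lengths(mu_tissue, 5))
print(sample_path_lengths(mu_iodine_loaded, 10000).mean())  # ~1/mu on average
```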

  14. Age- and sex-specific thorax finite element model development and simulation.

    Science.gov (United States)

    Schoell, Samantha L; Weaver, Ashley A; Vavalle, Nicholas A; Stitzel, Joel D

    2015-01-01

    The shape, size, bone density, and cortical thickness of the thoracic skeleton vary significantly with age and sex, which can affect the injury tolerance, especially in at-risk populations such as the elderly. Computational modeling has emerged as a powerful and versatile tool to assess injury risk. However, current computational models only represent certain ages and sexes in the population. The purpose of this study was to morph an existing finite element (FE) model of the thorax to depict thorax morphology for males and females of ages 30 and 70 years old (YO) and to investigate the effect on injury risk. Age- and sex-specific FE models were developed using thin-plate spline interpolation. In order to execute the thin-plate spline interpolation, homologous landmarks on the reference, target, and FE model are required. An image segmentation and registration algorithm was used to collect homologous rib and sternum landmark data from males and females aged 0-100 years. The Generalized Procrustes Analysis was applied to the homologous landmark data to quantify age- and sex-specific isolated shape changes in the thorax. The Global Human Body Models Consortium (GHBMC) 50th percentile male occupant model was morphed to create age- and sex-specific thoracic shape change models (scaled to a 50th percentile male size). To evaluate the thoracic response, 2 loading cases (frontal hub impact and lateral impact) were simulated to assess the importance of geometric and material property changes with age and sex. Due to the geometric and material property changes with age and sex, there were observed differences in the response of the thorax in both the frontal and lateral impacts. Material property changes alone had little to no effect on the maximum thoracic force or the maximum percent compression. With age, the thorax becomes stiffer due to superior rotation of the ribs, which can result in increased bone strain that can increase the risk of fracture. For the 70-YO models
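
    As an illustration of the morphing step, thin-plate spline interpolation fits a smooth deformation from homologous landmark pairs and then applies it to every FE node. The sketch below uses SciPy's RBFInterpolator with a thin-plate-spline kernel on random stand-in coordinates; it mirrors the described approach in outline only, not the study's implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Homologous landmarks on the reference and target geometries. The random
# coordinates here are stand-ins for the study's rib/sternum landmarks.
rng = np.random.default_rng(1)
ref_landmarks = rng.uniform(-1.0, 1.0, size=(40, 3))
tgt_landmarks = ref_landmarks + 0.05 * rng.normal(size=(40, 3))

# Thin-plate-spline map from reference to target space (SciPy >= 1.7).
tps = RBFInterpolator(ref_landmarks, tgt_landmarks,
                      kernel='thin_plate_spline')

# Morph every node of the FE mesh with the same smooth deformation field.
fe_nodes = rng.uniform(-1.0, 1.0, size=(1000, 3))
morphed_nodes = tps(fe_nodes)
print(morphed_nodes.shape)   # (1000, 3)
```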

  15. Development of computational science in JAEA. R and D of simulation

    International Nuclear Information System (INIS)

    Nakajima, Norihiro; Araya, Fumimasa; Hirayama, Toshio

    2006-01-01

    R and D of computational science in JAEA (Japan Atomic Energy Agency) is described: the computing environment; the R and D system in CCSE (Center for Computational Science and e-Systems); joint computational science research in Japan and worldwide; development of computer technologies; and examples of simulation research, including the 3-dimensional image vibrational platform system, simulation research on FBR cycle techniques, simulation of large-scale thermal stress for the development of steam generators, simulation research on fusion energy techniques, development of grid computing technology, simulation research on quantum beam techniques, and biological molecule simulation research. The organization of JAEA, the development of computational science in JAEA, the network of JAEA, international collaboration in computational science, and the environment of the ITBL (Information-Technology Based Laboratory) project are illustrated. (S.Y.)

  16. Development of a small-scale computer cluster

    Science.gov (United States)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

    An increase in demand for computing power in academia has created a need for high-performance machines. The computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers, given the proper software, can multiply the performance of a single computer. Cluster computing has therefore become a much sought after technology. Typical desktop computers could be used for cluster computing, but are not intended for constant full-speed operation and take up more space than rack-mount servers. Specialty computers that are designed to be used in clusters meet high-availability and space requirements, but can be costly. A market segment exists where custom-built desktop computers can be arranged in a rack-mount situation, gaining the space savings of traditional rack-mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components which multiplies the performance of a single desktop machine, while minimizing occupied space and still remaining cost effective.

  17. Development of integrated platform for computational material design

    Energy Technology Data Exchange (ETDEWEB)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato [Center for Computational Science and Engineering, Fuji Research Institute Corporation (Japan); Hideaki, Koike [Advance Soft Corporation (Japan)

    2003-07-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for the PSE in the Japanese national project of Frontier Simulation Software for Industrial Science, supports the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is a new architecture called TASK FLOW. It integrates computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed, and the validation and verification of the integrated platform are scheduled, using the prototype, for 2003. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As other examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned.

  18. Development of integrated platform for computational material design

    International Nuclear Information System (INIS)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato; Hideaki, Koike

    2003-01-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for the PSE in the Japanese national project of Frontier Simulation Software for Industrial Science, supports the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is a new architecture called TASK FLOW. It integrates computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed, and the validation and verification of the integrated platform are scheduled, using the prototype, for 2003. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As other examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned.

  19. Development and numerical analysis of low specific speed mixed-flow pump

    International Nuclear Information System (INIS)

    Li, H F; Huo, Y W; Pan, Z B; Zhou, W C; He, M H

    2012-01-01

    With the development of cities, the market for mixed-flow pumps with large flow rate and high head is promising. The KSB Shanghai Pump Co., LTD decided to develop a low-specific-speed mixed-flow pump to meet the market requirements. Based on centrifugal pump and axial-flow pump models, and aiming at the characteristics of large flow rate and high head, a new type of guide-vane mixed-flow pump was designed. The computational fluid dynamics method was adopted to analyze the internal flow of the new model and predict its performance. The time-averaged Navier-Stokes equations were closed by the SST k-ω turbulence model to suit the internal flow of guide vanes with larger curvatures. The multi-reference frame (MRF) method was used to deal with the coupling of the rotating impeller and the static guide vane, and the SIMPLEC method was adopted to achieve the coupled solution of velocity and pressure. The computational results show that there is strong flow impact on the leading edges of the vanes at different working conditions, and severe flow separation at the trailing edges of the guide vanes at different working conditions, both of which affect the performance of the pump. Based on the computational results, optimizations were carried out to decrease the impact on the vane leading edges and the flow separation at the guide-vane trailing edges. The optimized model was simulated and its performance was predicted. The computational results show that the impact on the vane leading edges and the separation at the guide-vane trailing edges disappeared. The high-efficiency range of the optimized pump is wide, and it meets the original design target. The newly designed mixed-flow pump is now being manufactured as a model, and its experimental performance will be obtained soon.

  20. Development and numerical analysis of low specific speed mixed-flow pump

    Science.gov (United States)

    Li, H. F.; Huo, Y. W.; Pan, Z. B.; Zhou, W. C.; He, M. H.

    2012-11-01

    With the development of cities, the market for mixed-flow pumps with large flow rate and high head is promising. The KSB Shanghai Pump Co., LTD decided to develop a low-specific-speed mixed-flow pump to meet the market requirements. Based on centrifugal pump and axial-flow pump models, and aiming at the characteristics of large flow rate and high head, a new type of guide-vane mixed-flow pump was designed. The computational fluid dynamics method was adopted to analyze the internal flow of the new model and predict its performance. The time-averaged Navier-Stokes equations were closed by the SST k-ω turbulence model to suit the internal flow of guide vanes with larger curvatures. The multi-reference frame (MRF) method was used to deal with the coupling of the rotating impeller and the static guide vane, and the SIMPLEC method was adopted to achieve the coupled solution of velocity and pressure. The computational results show that there is strong flow impact on the leading edges of the vanes at different working conditions, and severe flow separation at the trailing edges of the guide vanes at different working conditions, both of which affect the performance of the pump. Based on the computational results, optimizations were carried out to decrease the impact on the vane leading edges and the flow separation at the guide-vane trailing edges. The optimized model was simulated and its performance was predicted. The computational results show that the impact on the vane leading edges and the separation at the guide-vane trailing edges disappeared. The high-efficiency range of the optimized pump is wide, and it meets the original design target. The newly designed mixed-flow pump is now being manufactured as a model, and its experimental performance will be obtained soon.

  1. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  2. Computer-aided preparation of specifications for radial fans at VEB Lufttechnische Anlagen Berlin

    Energy Technology Data Exchange (ETDEWEB)

    Kubis, R.; Kull, W.

    1987-01-01

    The specification details the scope of delivery for radial fans on a standard page and also serves as preparation for production. In place of the previous manual preparation, a computer-aided technique for the office computer is presented that derives the technical parameters from data files using only the few input data needed to identify the fan type. The data files and evaluation programs are based on the software tool REDABAS and the SCP operating system. Using this technique, it has been possible to cut the preparation time for incoming orders considerably.

  3. Exploring gender differences on general and specific computer self-efficacy in mobile learning adoption

    OpenAIRE

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi; Kibelloh, Mboni

    2014-01-01

    Reasons for contradictory findings regarding the moderating effect of gender on computer self-efficacy in the adoption of e-learning/mobile learning are limited. Recognizing the multilevel nature of computer self-efficacy (CSE), this study attempts to explore gender differences in the adoption of mobile learning by extending the Technology Acceptance Model (TAM) with general and specific CSE. Data collected from 137 university students were tested against the research model using the structur...

  4. Development Of A Navier-Stokes Computer Code

    Science.gov (United States)

    Yoon, Seokkwan; Kwak, Dochan

    1993-01-01

    Report discusses aspects of development of CENS3D computer code, solving three-dimensional Navier-Stokes equations of compressible, viscous, unsteady flow. Implements implicit finite-difference or finite-volume numerical-integration scheme, called "lower-upper symmetric-Gauss-Seidel" (LU-SGS), offering potential for very low computer time per iteration and for fast convergence.

  5. Flow stagnation volume and abdominal aortic aneurysm growth: Insights from patient-specific computational flow dynamics of Lagrangian-coherent structures.

    Science.gov (United States)

    Joly, Florian; Soulez, Gilles; Garcia, Damien; Lessard, Simon; Kauffmann, Claude

    2018-01-01

    Abdominal aortic aneurysms (AAA) are localized, commonly occurring dilations of the aorta. When the equilibrium between blood pressure (loading) and wall mechanical resistance is lost, rupture ensues, and patient death follows if the rupture is not treated immediately. Experimental and numerical analyses of flow patterns in arteries show direct correlations between wall shear stress and wall mechano-adaptation, with the development of zones prone to thrombus formation. For further insights into the AAA flow topology/growth interaction, a workflow of patient-specific computational fluid dynamics (CFD) is proposed to compute finite-time Lyapunov exponents and extract Lagrangian-coherent structures (LCS). This computational model was first compared with 4-D phase-contrast magnetic resonance imaging (MRI) in 5 patients. To better understand the impact of flow topology and transport on AAA growth, hyperbolic, repelling LCS were computed in 1 patient during 8-year follow-up, including 9 volumetric morphologic AAA measures by computed tomography-angiography (CTA). LCS defined barriers to Lagrangian jet cores entering the AAA. Domains enclosed between the LCS and the aortic wall were considered to be stagnation zones. Their evolution was studied during AAA growth. Good correlation - 2-D cross-correlation coefficients of 0.65, 0.86 and 0.082 (min, max, SD) - was obtained between numerical simulations and 4-D MRI acquisitions in 6 specific cross-sections from 4 patients. In the follow-up study, LCS divided the AAA lumen into 3 dynamically isolated zones: 2 stagnation volumes lying in dilated portions of the AAA, and a circulating volume connecting the inlet to the outlet. The volume of each zone was tracked over time. Although the circulating volume remained unchanged during the 8-year follow-up, the AAA lumen and main stagnation zones grew significantly (8 cm³/year and 6 cm³/year, respectively). This study reveals that transient transport topology can be quantified in patient-specific AAA during disease progression
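
    For readers unfamiliar with the method, the FTLE field is computed from the spatial gradient of the flow map (the final positions of advected particles) via the largest eigenvalue of the Cauchy-Green tensor, and its ridges approximate the repelling LCS used here to bound stagnation zones. A minimal 2-D sketch of that final step follows; the particle advection that produces the flow map is assumed already done.

```python
import numpy as np

def ftle(phi_x, phi_y, dx, dy, T):
    """Finite-time Lyapunov exponent on a 2-D grid.
    phi_x, phi_y: components of the flow map, i.e. final positions after
    advecting each grid point for time T. Ridges of the returned field
    approximate the repelling LCS."""
    dphixdx, dphixdy = np.gradient(phi_x, dx, dy, edge_order=2)
    dphiydx, dphiydy = np.gradient(phi_y, dx, dy, edge_order=2)
    # Cauchy-Green deformation tensor C = F^T F, assembled pointwise
    F = np.stack([np.stack([dphixdx, dphixdy], -1),
                  np.stack([dphiydx, dphiydy], -1)], -2)
    C = np.einsum('...ji,...jk->...ik', F, F)
    lam_max = np.linalg.eigvalsh(C)[..., -1]     # largest eigenvalue
    return np.log(np.sqrt(lam_max)) / abs(T)

# Trivial check: a pure translation (no stretching) gives FTLE ~ 0.
X, Y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64), indexing='ij')
print(np.abs(ftle(X + 0.3, Y + 0.1, dx=1/63, dy=1/63, T=1.0)).max())
```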

  6. A Brief Analysis of Development Situations and Trend of Cloud Computing

    Science.gov (United States)

    Yang, Wenyan

    2017-12-01

    In recent years, the rapid development of Internet technology has radically changed people's work, learning and lifestyles. More and more activities are completed by means of computers and networks. The amount of information and data generated grows bigger day by day, and people rely more and more on computers, so the computing power of a single computer fails to meet people's demands for accuracy and speed. Cloud computing technology has developed rapidly and is widely applied in the computer industry as a result of its advantages of high precision, fast computation and ease of use. Moreover, it has become a focus of information research at present. In this paper, the development situation and trend of cloud computing are analyzed and discussed.

  7. Southampton uni's computer whizzes develop "mini" grid

    CERN Multimedia

    Sherriff, Lucy

    2006-01-01

    "In a bid to help its students explore the potential of grid computing, the University of Southampton's Computer Science department has developed what it calls a "lightweight grid". The system has been designed to allow students to experiment with grid technology without the complexity of inherent security concerns of the real thing. (1 page)

  8. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  9. Functions and Requirements and Specifications for Replacement of the Computer Automated Surveillance System (CASS)

    International Nuclear Information System (INIS)

    SCAIEF, C.C.

    1999-01-01

    This functions, requirements and specifications document defines the baseline requirements and criteria for the design, purchase, fabrication, construction, installation, and operation of the system to replace the Computer Automated Surveillance System (CASS) alarm monitoring system.

  10. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  11. Affordances of the 'branch and bound' paradigm for developing computational thinking

    NARCIS (Netherlands)

    van der Meulen, Joris; Timmer, Mark

    As technological advances in engineering and computer science happen more and more quickly, we must shift focus from teaching specific techniques or programming languages to teaching something more transcending: computational thinking (Wing, 2006). Wing explained this concept later as “the thought

  12. Hierarchy, determinism, and specificity in theories of development and evolution.

    Science.gov (United States)

    Deichmann, Ute

    2017-10-16

    The concepts of hierarchical organization, genetic determinism and biological specificity (for example of species, biologically relevant macromolecules, or genes) have played a crucial role in biology as a modern experimental science since its beginnings in the nineteenth century. The idea of genetic information (specificity) and genetic determination was at the basis of molecular biology that developed in the 1940s with macromolecules, viruses and prokaryotes as major objects of research often labelled "reductionist". However, the concepts have been marginalized or rejected in some of the research that in the late 1960s began to focus additionally on the molecularization of complex biological structures and functions using systems approaches. This paper challenges the view that 'molecular reductionism' has been successfully replaced by holism and a focus on the collective behaviour of cellular entities. It argues instead that there are more fertile replacements for molecular 'reductionism', in which genomics, embryology, biochemistry, and computer science intertwine and result in research that is as exact and causally predictive as earlier molecular biology.

  13. Development of a computer design system for HVAC

    International Nuclear Information System (INIS)

    Miyazaki, Y.; Yotsuya, M.; Hasegawa, M.

    1993-01-01

    The development of a computer design system for HVAC (Heating, Ventilating and Air Conditioning) systems is presented in this paper. It supports the air conditioning design for a nuclear power plant and a reprocessing plant. This system integrates various computer design systems which were developed separately for the various design phases of HVAC. The purposes include centralizing the HVAC data, optimizing the design, and reducing the design time. The centralized HVAC data are managed by a DBMS (Data Base Management System). The DBMS separates the computer design system into a calculation module and the data. The design system can thus be expanded easily in the future. 2 figs

  14. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics, operating over a network connecting a parallel computing server and a client terminal, was developed. Using the system, a user can visualize the results of a CFD (Computational Fluid Dynamics) simulation on a client terminal while the actual computation is running on the server. Using the GUI (Graphical User Interface) on the client terminal, the user is also able to change parameters of the analysis and visualization in real time during the calculation. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer, and compresses the data. The amount of data sent from the parallel computer to the client is therefore so small, compared with the uncompressed case, that the user sees images appear swiftly and comfortably. Parallelization of image data generation is based on the Owner Computation Rule. The GUI on the client is built on a Java applet, so real-time visualization is possible on any client PC on which a Web browser is available. (author)

  15. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    International Nuclear Information System (INIS)

    Byamukama, Abdul; Jung, Haiyong

    2014-01-01

    Radioactive materials are utilized in industries, agriculture and research, medical facilities and academic institutions for numerous purposes that are useful in the daily life of mankind. To effectively manage radioactive waste and select appropriate disposal schemes, it is imperative to have a specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, and hence it is difficult to suggest disposal options, make decisions, enforce compliance, and communicate effectively with stakeholders, among other things. To overcome these challenges, the RESRAD computer code was used to derive a specific criteria for classifying between the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those of Australia and were found to correlate, given the differences in site parameters and the consumption habits of residents in the two countries.

  16. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Byamukama, Abdul [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Jung, Haiyong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

    Radioactive materials are utilized in industry, agriculture, research, medical facilities and academic institutions for numerous purposes that are useful in the daily life of mankind. To effectively manage radioactive waste and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions, enforce compliance, and communicate effectively with stakeholders, among other things. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for classifying between the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those of Australia and were found to correlate, given the differences in site parameters and consumption habits of the residents in the two countries.
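    The kind of quantitative boundary the study derives can be illustrated with a small sketch. The nuclide, the class names and the threshold values below are invented placeholders, not the RESRAD-derived criteria for Uganda.

```python
# Hypothetical activity-concentration thresholds (Bq/g) for one radionuclide;
# the real class boundaries would come from the RESRAD-derived criteria.
THRESHOLDS = {
    "Cs-137": {"EW": 0.1, "VLLW": 10.0, "LLW": 1e4},
}

def classify(nuclide: str, activity_bq_per_g: float) -> str:
    """Assign a waste class from activity concentration (illustrative only)."""
    t = THRESHOLDS[nuclide]
    if activity_bq_per_g <= t["EW"]:
        return "exempt waste"
    if activity_bq_per_g <= t["VLLW"]:
        return "very low level waste"
    if activity_bq_per_g <= t["LLW"]:
        return "low level waste"
    return "intermediate/high level waste"

print(classify("Cs-137", 5.0))   # -> very low level waste
```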

  17. BrEPS: a flexible and automatic protocol to compute enzyme-specific sequence profiles for functional annotation

    Directory of Open Access Journals (Sweden)

    Schomburg D

    2010-12-01

    Background: Models for the simulation of metabolic networks require the accurate prediction of enzyme function. Based on a genomic sequence, enzymatic functions of gene products are today mainly predicted by sequence database searching and operon analysis. Other methods can support these techniques: we have developed an automatic method, "BrEPS", that creates highly specific sequence patterns for the functional annotation of enzymes. Results: The enzymes in the UniprotKB are identified and their sequences compared against each other with BLAST. The enzymes are then clustered into a number of trees, where each tree node is associated with a set of EC numbers. The enzyme sequences in the tree nodes are aligned with ClustalW. The conserved columns of the resulting multiple alignments are used to construct sequence patterns. In the last step, we verify the quality of the patterns by computing their specificity. Patterns with low specificity are omitted and recomputed further down in the tree. The final high-quality patterns can be used for functional annotation. We ran our protocol on a recent Swiss-Prot release and show statistics, as well as a comparison to PRIAM, a probabilistic method that is also specialized in the functional annotation of enzymes. We determine the amount of true positive annotations for five common microorganisms, with data from BRENDA and AMENDA serving as the standard of truth. BrEPS is almost on par with PRIAM, a fact which we discuss in the context of five manually investigated cases. Conclusions: Our protocol computes highly specific sequence patterns that can be used to support the functional annotation of enzymes. The main advantages of our method are that it is automatic and unsupervised, and quite fast once the patterns are evaluated. The results show that BrEPS can be a valuable addition to the reconstruction of metabolic networks.
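    The pattern-construction step, in which the conserved columns of a multiple alignment become a sequence pattern, can be illustrated with a minimal sketch. The toy alignment and the regular-expression encoding are assumptions for illustration, not the BrEPS implementation.

```python
import re

alignment = ["MKTAYIAKQR", "MKTGYIAQQR", "MKTAYIARQR"]  # toy multiple alignment

def conserved_pattern(seqs):
    """Turn perfectly conserved alignment columns into a regex pattern."""
    pattern, run = [], 0
    for column in zip(*seqs):
        if len(set(column)) == 1:              # conserved column -> literal
            if run:
                pattern.append(f".{{{run}}}")  # wildcard for the variable stretch
                run = 0
            pattern.append(column[0])
        else:
            run += 1
    if run:
        pattern.append(f".{{{run}}}")
    return "".join(pattern)

pat = conserved_pattern(alignment)
print(pat)                                     # MKT.{1}YIA.{1}QR
assert re.search(pat, "XXMKTAYIAKQRZZ")        # matches a member sequence
```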

  18. Genome-wide identification of specific oligonucleotides using artificial neural network and computational genomic analysis

    Directory of Open Access Journals (Sweden)

    Chen Jiun-Ching

    2007-05-01

    Background: Genome-wide identification of specific oligonucleotides (oligos) is a computationally intensive task and is a requirement for designing microarray probes, primers, and siRNAs. An artificial neural network (ANN) is a machine learning technique that can effectively process complex and high-noise data. Here, ANNs are applied to process the unique subsequence distribution for prediction of specific oligos. Results: We present a novel and efficient algorithm, named the integration of ANN and BLAST (IAB) algorithm, to identify specific oligos. We establish the unique marker database for human and rat gene index databases using the hash table algorithm. We then create the input vectors, via the unique marker database, to train and test the ANN. The trained ANN predicted the specific oligos with high efficiency, and these oligos were subsequently verified by BLAST. To improve the prediction performance, the ANN over-fitting issue was avoided by early stopping with the best observed error, and k-fold validation was also applied. The IAB algorithm was about 5.2, 7.1, and 6.7 times faster than the BLAST search without ANN for experimental results of 70-mer, 50-mer, and 25-mer specific oligos, respectively. In addition, the results of polymerase chain reactions showed that the primers predicted by the IAB algorithm could specifically amplify the corresponding genes. The IAB algorithm has been integrated into a previously published comprehensive web server to support microarray analysis and genome-wide iterative enrichment analysis, through which users can identify a group of desired genes and then discover the specific oligos of these genes. Conclusion: The IAB algorithm has been developed to construct SpecificDB, a web server that provides a specific and valid oligo database for probe, siRNA, and primer design for the human genome. We also demonstrate the ability of the IAB algorithm to predict specific oligos through…
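    The over-fitting safeguard the abstract mentions, early stopping with the best observed validation error, can be sketched in a few lines. The one-layer logistic 'network' and the synthetic data are placeholders, not the IAB implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 8)); y = (X.sum(axis=1) > 4).astype(float)
X_tr, y_tr, X_va, y_va = X[:150], y[:150], X[150:], y[150:]

w = rng.normal(size=8) * 0.1
best_err, best_w = np.inf, w.copy()
for epoch in range(500):
    p = 1 / (1 + np.exp(-(X_tr @ w)))           # one-layer logistic 'network'
    w -= 0.1 * X_tr.T @ (p - y_tr) / len(y_tr)  # gradient step
    va_err = np.mean((1 / (1 + np.exp(-(X_va @ w))) > 0.5) != y_va)
    if va_err < best_err:                       # keep the best observed error
        best_err, best_w = va_err, w.copy()
print(f"best validation error: {best_err:.3f}")
```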

  19. Development of posture-specific computational phantoms using motion capture technology and application to radiation dose-reconstruction for the 1999 Tokai-Mura nuclear criticality accident

    International Nuclear Information System (INIS)

    Vazquez, Justin A; Caracappa, Peter F; Xu, X George

    2014-01-01

    The majority of existing computational phantoms are designed to represent workers in typical standing anatomical postures with fixed arm and leg positions. However, workers found in accident-related scenarios often assume varied postures. This paper describes the development and application of two phantoms with adjusted postures specified by data acquired from a motion capture system to simulate the unique human postures found in a 1999 criticality accident that took place at a JCO facility in Tokai-Mura, Japan. In the course of this accident, two workers were fatally exposed to extremely high levels of radiation. Implementation of the emergent techniques discussed produced more accurate and more detailed dose estimates for the two workers than were reported in previous studies. Total-body doses of 6.43 and 26.38 Gy were estimated for the two workers, who assumed a crouching and a standing posture, respectively. Additionally, organ-specific dose estimates were determined, including a 7.93 Gy dose to the thyroid and a 6.11 Gy dose to the stomach for the crouching worker, and a 41.71 Gy dose to the liver and a 37.26 Gy dose to the stomach for the standing worker. Implications for the medical prognosis of the workers are discussed, and the results of this study were found to correlate better with the patient outcomes than previous estimates, suggesting potential future applications of such methods for improved epidemiological studies involving next-generation computational phantom tools. (paper)

  20. Development of computer systems for planning and management of reactor decommissioning

    International Nuclear Information System (INIS)

    Yanagihara, Satoshi; Sukegawa, Takenori; Shiraishi, Kunio

    2001-01-01

    Computer systems for the planning and management of reactor decommissioning were developed for effective implementation of a decommissioning project. The systems are intended to be applied to the construction of work breakdown structures and the estimation of manpower needs, worker doses, etc., based on unit productivity and work difficulty factors, which were derived by analyzing actual data on the JPDR dismantling activities. In addition, information necessary for project planning can be effectively integrated in graphical form on a computer screen by transferring the data produced by subprograms, such as radioactive inventory and dose rate calculation routines, among the systems. Expert systems were adopted for modeling a new decommissioning project with production rules, by reconstructing work breakdown structures and work specifications. As a result, the systems are characterized by effective modeling of a decommissioning project, estimation of project management data based on feedback of past experience, and information integration through the graphical user interface. Furthermore, the systems were validated by comparing calculated results with the actual manpower needs of the JPDR dismantling activities; it is expected that the systems will be applicable to the planning and evaluation of other decommissioning projects. (author)
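    The estimation scheme described above, work breakdown items scaled by unit productivity and a work difficulty factor, reduces to a simple sum. The sketch below uses invented numbers rather than JPDR data.

```python
# Illustrative manpower estimate: each work breakdown item is scaled by a
# unit productivity (man-hours per unit) and a work difficulty factor.
tasks = [
    # (component, quantity, unit, man-hours per unit, difficulty factor)
    ("pipework removal", 120.0, "m", 0.8, 1.4),   # e.g. contaminated area
    ("tank dismantling",   6.0, "t", 25.0, 1.0),
]

total = sum(qty * unit_mh * diff for _, qty, _, unit_mh, diff in tasks)
for name, qty, unit, unit_mh, diff in tasks:
    print(f"{name:20s} {qty * unit_mh * diff:8.1f} man-hours")
print(f"{'total':20s} {total:8.1f} man-hours")
```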

  1. Reactor safety computer code development at INEL

    International Nuclear Information System (INIS)

    Johnsen, G.W.

    1985-01-01

    This report provides a brief overview of the computer code development programs being conducted at EG and G Idaho, Inc. on behalf of the US Nuclear Regulatory Commission and the Department of Energy, Idaho Operations Office. Included are descriptions of the codes being developed, their development status as of the date of this report, and resident code development expertise

  2. Developments in Remote Collaboration and Computation

    International Nuclear Information System (INIS)

    Burruss, J.R.; Abla, G.; Flanagan, S.; Keahey, K.; Leggett, T.; Ludesche, C.; McCune, D.; Papka, M.E.; Peng, Q.; Randerson, L.; Schissel, D.P.

    2005-01-01

    The National Fusion Collaboratory (NFC) is creating and deploying collaborative software tools to unite magnetic fusion research in the United States. In particular, the NFC is developing and deploying a national FES 'Grid' (FusionGrid) for secure sharing of computation, visualization, and data resources over the Internet. The goal of FusionGrid is to allow scientists at remote sites to participate as fully in experiments, machine design, and computational activities as if they were working on site, thereby creating a unified virtual organization of the geographically dispersed U.S. fusion community

  3. An Open Computing Infrastructure that Facilitates Integrated Product and Process Development from a Decision-Based Perspective

    Science.gov (United States)

    Hale, Mark A.

    1996-01-01

    Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure: with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. An architecture for exploring design called DREAMS (Developing Robust

  4. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  5. Computer Aided Design System for Developing Musical Fountain Programs

    Institute of Scientific and Technical Information of China (English)

    刘丹; 张乃尧; 朱汉城

    2003-01-01

    A computer aided design system for developing musical fountain programs was developed with multiple functions, such as intelligent design, 3-D animation, manual modification and synchronized motion, to make the development process more efficient. The system first analyzes the musical form and sentiment, using many basic features of the music, to select a basic fountain program. This program is then simulated with 3-D animation and modified manually to achieve the desired results. Finally, the program is transformed into a computer control program to control the musical fountain in time with the music. A prototype system for the musical fountain was also developed. It was tested with many styles of music, and users were quite satisfied with its performance. By integrating these functions, the proposed computer aided design system greatly simplifies the design of musical fountain programs.

  6. Development and validation of GWHEAD, a three-dimensional groundwater head computer code

    International Nuclear Information System (INIS)

    Beckmeyer, R.R.; Root, R.W.; Routt, K.R.

    1980-03-01

    A computer code has been developed to solve the groundwater flow equation in three dimensions. The code uses finite-difference approximations solved by the strongly implicit solution procedure. Input parameters to the code include hydraulic conductivity, specific storage, porosity, accretion (recharge), and initial hydraulic head. These parameters may be input as spatially varying. The hydraulic conductivity may be input as isotropic or anisotropic. The boundaries either may permit flow across them or may be impermeable. The code has been used to model leaky confined groundwater conditions and spherical flow to a continuous point sink, both of which have exact analytical solutions. The results generated by the computer code compare well with those of the analytical solutions. The code was designed to be used to model groundwater flow beneath fuel reprocessing and waste storage areas at the Savannah River Plant
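    A much-reduced illustration of the kind of computation GWHEAD performs: steady-state head on a small grid with prescribed-head and no-flow boundaries, solved here by simple Jacobi iteration rather than the strongly implicit procedure. The grid, boundary values and recharge term are invented.

```python
import numpy as np

n = 21
h = np.zeros((n, n))                      # hydraulic head (m) on a uniform grid
h[:, 0], h[:, -1] = 10.0, 2.0             # prescribed-head (Dirichlet) edges
source = 1e-3                             # recharge lumped into the stencil

for _ in range(5000):                     # Jacobi sweeps toward steady state
    h[0, :], h[-1, :] = h[1, :], h[-2, :] # impermeable (no-flow) edges
    h[1:-1, 1:-1] = 0.25 * (h[2:, 1:-1] + h[:-2, 1:-1]
                            + h[1:-1, 2:] + h[1:-1, :-2] + source)

print(np.round(h[n // 2, ::5], 2))        # head profile along the mid-row
```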

  7. The Goal Specificity Effect on Strategy Use and Instructional Efficiency during Computer-Based Scientific Discovery Learning

    Science.gov (United States)

    Kunsting, Josef; Wirth, Joachim; Paas, Fred

    2011-01-01

    Using a computer-based scientific discovery learning environment on buoyancy in fluids we investigated the "effects of goal specificity" (nonspecific goals vs. specific goals) for two goal types (problem solving goals vs. learning goals) on "strategy use" and "instructional efficiency". Our empirical findings close an important research gap,…

  8. Development of an On-Line Surgeon-Specific Operating Room Time Prediction System (Experience with the Michigan Surgical Monitors)

    OpenAIRE

    Brown, Allan C.D.; Schmidt, Nancy M.

    1984-01-01

    The development of a microcomputer application for the on-line prediction of surgeon-specific operating room time using an IBM PC/XT is described. The reasons leading to the project, together with an assessment of the Condor 20 relational database management system as the basis for the application, are discussed.

  9. Tissue-type-specific transcriptome analysis identifies developing xylem-specific promoters in poplar.

    Science.gov (United States)

    Ko, Jae-Heung; Kim, Hyun-Tae; Hwang, Ildoo; Han, Kyung-Hwan

    2012-06-01

    Plant biotechnology offers a means to create novel phenotypes. However, commercial application of biotechnology in crop improvement programmes is severely hindered by the lack of utility promoters (or freedom to operate the existing ones) that can drive gene expression in a tissue-specific or temporally controlled manner. Woody biomass is gaining popularity as a source of fermentable sugars for liquid fuel production. To improve the quantity and quality of woody biomass, developing xylem (DX)-specific modification of the feedstock is highly desirable. To develop utility promoters that can drive transgene expression in a DX-specific manner, we used the Affymetrix Poplar Genome Arrays to obtain tissue-type-specific transcriptomes from poplar stems. Subsequent bioinformatics analysis identified 37 transcripts that are specifically or strongly expressed in DX cells of poplar. After further confirmation of their DX-specific expression using semi-quantitative PCR, we selected four genes (DX5, DX8, DX11 and DX15) for in vivo confirmation of their tissue-specific expression in transgenic poplars. The promoter regions of the selected DX genes were isolated and fused to a β-glucuronidase (GUS) reporter gene in a binary vector. This construct was used to produce transgenic poplars via Agrobacterium-mediated transformation. The GUS expression patterns of the resulting transgenic plants showed that these promoters were active in the xylem cells at early seedling growth and showed the strongest expression in the developing xylem cells at later growth stages of poplar. We conclude that these DX promoters can be used as utility promoters for DX-specific biomass engineering.

  10. Correction of computed tomography motion artifacts using pixel-specific back-projection

    International Nuclear Information System (INIS)

    Ritchie, C.J.; Crawford, C.R.; Godwin, J.D.; Kim, Y.; King, K.F.

    1996-01-01

    Cardiac and respiratory motion can cause artifacts in computed tomography scans of the chest. The authors describe a new method for reducing these artifacts, called pixel-specific back-projection (PSBP). PSBP reduces artifacts caused by in-plane motion by reconstructing each pixel in a frame of reference that moves with the in-plane motion in the volume being scanned. The motion of the frame of reference is specified by constructing maps that describe the motion of each pixel in the image at the time each projection was measured; these maps are based on measurements of the in-plane motion. PSBP has been tested in computer simulations and with volunteer data. In computer simulations, PSBP removed the structured artifacts caused by motion. In scans of two volunteers, PSBP reduced doubling and streaking in chest scans to a level that made the images clinically useful. PSBP corrections of liver scans were less satisfactory because the motion of the liver is predominantly superior-inferior (S-I). PSBP uses a unique set of motion parameters to describe the motion at each point in the chest, as opposed to requiring that the motion be described by a single set of parameters. Therefore, PSBP may be more useful for correcting clinical scans than other correction techniques previously described
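    The core idea, displacing each pixel by its own motion map evaluated at the time of each view instead of applying one rigid correction to the whole image, can be sketched as follows. The parallel-beam geometry, nearest-neighbour interpolation and motion-map interface are simplifying assumptions, not the published algorithm.

```python
import numpy as np

def psbp_backproject(sinogram, angles, motion_map):
    """sinogram[k]: 1-D parallel projection at angles[k];
       motion_map(k, xs, ys) -> (dx, dy) per-pixel displacement at view k."""
    n = sinogram.shape[1]
    img = np.zeros((n, n))
    ys, xs = np.mgrid[0:n, 0:n] - n / 2
    for k, theta in enumerate(angles):
        dx, dy = motion_map(k, xs, ys)            # pixel-specific shift
        t = (xs + dx) * np.cos(theta) + (ys + dy) * np.sin(theta)
        idx = np.clip(np.round(t + n / 2).astype(int), 0, n - 1)
        img += sinogram[k][idx]                   # accumulate along rays
    return img * np.pi / len(angles)

angles = np.linspace(0, np.pi, 60, endpoint=False)
sino = np.random.rand(60, 64)                     # mock projections
still = lambda k, x, y: (0.0 * x, 0.0 * y)        # zero motion -> ordinary BP
img = psbp_backproject(sino, angles, still)
```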

  11. Recent development in computational actinide chemistry

    International Nuclear Information System (INIS)

    Li Jun

    2008-01-01

    Ever since the Manhattan project in World War II, actinide chemistry has been essential for nuclear science and technology. Yet scientists still seek the ability to interpret and predict chemical and physical properties of actinide compounds and materials using first-principles theory and computational modeling. Actinide compounds are challenging for computational chemistry because of their complicated electron correlation effects and relativistic effects, including spin-orbit coupling effects. There have been significant developments in theoretical studies on actinide compounds in the past several years. The theoretical capabilities, coupled with new experimental characterization techniques, now offer a powerful combination for unraveling the complexities of actinide chemistry. In this talk, we will provide an overview of our own research in this field, with particular emphasis on applications of relativistic density functional and ab initio quantum chemical methods to the geometries, electronic structures, spectroscopy and excited-state properties of small actinide molecules such as CUO and UO2 and some large actinide compounds relevant to separation and environmental science. The performance of various density functional approaches and wavefunction-theory-based electron correlation methods will be compared. The results of computational modeling of the vibrational, electronic, and NMR spectra of actinide compounds will be briefly discussed as well [1-4]. We will show that progress in relativistic quantum chemistry, computer hardware and computational chemistry software has enabled computational actinide chemistry to emerge as a powerful and predictive tool for research in actinide chemistry. (authors)

  12. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter and intra crate communication. The crates are connected in a hypercube. Site oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop, system is under construction. 10 refs., 7 figs

  13. STEEP4 code for computation of specific thermonuclear reaction rates from pointwise cross sections

    International Nuclear Information System (INIS)

    Harris, D.R.; Dei, D.E.; Husseiny, A.A.; Sabri, Z.A.; Hale, G.M.

    1976-05-01

    A code module, STEEP4, is developed to calculate fusion reaction rates in terms of the specific reactivity ⟨σv⟩, which is the product of cross section and relative velocity averaged over the actual ion distributions of the interacting particles in the plasma. The module is structured in a way suitable for incorporation in thermonuclear burn codes to provide rapid and yet relatively accurate on-line computation of ⟨σv⟩ as a function of plasma parameters. Ion distributions are modified to include slowing-down contributions, which are characterized in terms of plasma parameters. Rapid and accurate algorithms are used for integrating ⟨σv⟩ from cross sections and spectra. The main program solves for ⟨σv⟩ by the method of steepest descent. However, options are provided to use Gauss-Hermite and dense trapezoidal quadrature integration techniques. Options are also provided for rapid calculation of screening effects on specific reaction rates. Although such effects are not significant in plasmas of laboratory interest, the options are included to increase the range of applicability of the code. Gamow penetration form, log-log interpolation, and cubic interpolation routines are included to provide interpolated values of cross sections
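    The quantity STEEP4 evaluates is the reactivity ⟨σv⟩: the cross section times the relative speed, averaged over the ion distributions. A toy version with a Maxwellian distribution, an invented Gamow-like cross section and a dense rectangle-rule quadrature (a crude stand-in for the module's quadrature options) might look like this:

```python
import numpy as np

kT = 10.0                                     # ion temperature (keV)
E = np.linspace(1e-2, 20 * kT, 4000)          # relative-energy grid (keV)
dE = E[1] - E[0]

def sigma(E_keV):                             # toy Gamow-like cross section
    return np.exp(-31.4 / np.sqrt(E_keV)) / E_keV

f = 2 * np.sqrt(E / np.pi) * kT**-1.5 * np.exp(-E / kT)   # Maxwellian in energy
v = np.sqrt(E)                                # relative speed, up to constants
sigv = float(np.sum(sigma(E) * v * f) * dE)   # <sigma v>, dense quadrature
print(f"<sigma v> (arbitrary units): {sigv:.3e}")
```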

  14. Development of a fast running accident analysis computer program for use in a simulator

    International Nuclear Information System (INIS)

    Cacciabue, P.C.

    1985-01-01

    This paper describes how a nuclear reactor safety computer program can be modified and improved with the aim of producing a very fast running tool to be used as a physical model in a plant simulator, without penalizing the accuracy of results. It also discusses some ideas on how the physical theoretical model can be combined with a driving statistical tool to build up the entire software package to be implemented in the simulator for risk and reliability analysis. The approach to the problem, although applied to a specific computer program, can be considered quite general if an already existing and well tested code is used for the purpose. The computer program considered is ALMOD, originally developed for the analysis of the thermohydraulic and neutronic behaviour of the reactor core, primary circuit and steam generator during operational and special transients. (author)

  15. The role of computer simulation in nuclear technologies development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.

    2001-01-01

    In the report the role and purposes of computer simulation in nuclear technology development are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of operating nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and the design and development of 'simulators' for operating personnel training. Among the marked applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models; simulation of isotope composition change and damage dose accumulation for materials under irradiation; and simulation of reactor control structures. (authors)

  16. Development of age-specific Japanese physical phantoms for dose evaluation in infant CT examinations

    International Nuclear Information System (INIS)

    Yamauchi-Kawaura, C.; Fujii, K.; Imai, K.; Ikeda, M.; Akahane, K.; Obara, S.; Yamauchi, M.; Narai, K.; Katsu, T.

    2016-01-01

    Following the previous development of age-specific Japanese head phantoms, the authors designed Japanese torso phantoms for dose assessment in infant computed tomography (CT) examinations and completed a Japanese 3-y-old head-torso phantom. For the design of age-specific torso phantoms (0, 0.5, 1 and 3 y old), anatomical structures were measured from CT images of Japanese infant patients. From the CT morphometry, it was found that the rib cages of Japanese infants were smaller than those of European and American infants. Radiophotoluminescence glass dosemeters were used for dose measurements in the 3-y-old head-torso phantom. To examine the validity of the developed phantom, organ and effective doses obtained by the in-phantom dosimetry system were compared with simulation values from a web-based CT dose calculation system (WAZA-ARI). The differences in doses between the two systems were <20 % for the doses to organs within the scan regions and the effective doses in head, chest and abdomino-pelvic CT examinations. (authors)

  17. Shlaer-Mellor object-oriented analysis and recursive design, an effective modern software development method for development of computing systems for a large physics detector

    International Nuclear Information System (INIS)

    Kozlowski, T.; Carey, T.A.; Maguire, C.F.

    1995-01-01

    After evaluation of several modern object-oriented methods for development of the computing systems for the PHENIX detector at RHIC, we selected the Shlaer-Mellor Object-Oriented Analysis and Recursive Design method as the most appropriate for the needs and development environment of a large nuclear or high energy physics detector. This paper discusses our specific needs and environment, our method selection criteria, and major features and components of the Shlaer-Mellor method

  18. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  19. A Computational Methodology to Overcome the Challenges Associated With the Search for Specific Enzyme Targets to Develop Drugs Against Leishmania major.

    Science.gov (United States)

    Catharina, Larissa; Lima, Carlyle Ribeiro; Franca, Alexander; Guimarães, Ana Carolina Ramos; Alves-Ferreira, Marcelo; Tuffery, Pierre; Derreumaux, Philippe; Carels, Nicolas

    2017-01-01

    We present an approach for detecting enzymes that are specific to Leishmania major compared with Homo sapiens and provide targets that may assist research in drug development. This approach is based on traditional techniques of sequence homology comparison by similarity search and Markov modeling; it integrates the characterization of enzymatic functionality, secondary and tertiary protein structures, protein domain architecture, and metabolic environment. From 67 enzymes represented by 42 enzymatic activities classified by AnEnPi (Analogous Enzymes Pipeline) as specific to L. major compared with H. sapiens, only 40 (23 Enzyme Commission [EC] numbers) could actually be considered strictly specific to L. major, and 27 enzymes (19 EC numbers) were disregarded for having ambiguous homologies or analogies with H. sapiens. Among the 40 strictly specific enzymes, we identified sterol 24-C-methyltransferase, pyruvate phosphate dikinase, trypanothione synthetase, and RNA-editing ligase as 4 enzymes essential for L. major that may serve as targets for drug development.

  20. Disease-Specific Care: Spine Surgery Program Development.

    Science.gov (United States)

    Koerner, Katie; Franker, Lauren; Douglas, Barbara; Medero, Edgardo; Bromeland, Jennifer

    2017-10-01

    Minimal literature exists describing the process for development of a Joint Commission comprehensive spine surgery program within a community hospital health system. Components of a comprehensive program include structured communication across care settings, preoperative education, quality outcomes tracking, and patient follow-up. Organizations obtaining disease-specific certification must have clear knowledge of the planning, time, and overall commitment, essential to developing a successful program. Health systems benefit from disease-specific certification because of their commitment to a higher standard of service. Certification standards establish a framework for organizational structure and management and provide institutions a competitive edge in the marketplace. A framework for the development of a spine surgery program is described to help guide organizations seeking disease-specific certification. In developing a comprehensive program, it is critical to define the program's mission and vision, identify key stakeholders, implement clinical practice guidelines, and evaluate program outcomes.

  1. Development of a voxel phantom specific for simulation of eye brachytherapy

    International Nuclear Information System (INIS)

    Santos, Marcilio S.; Lima, Fernando R.A.

    2013-01-01

    Ophthalmic brachytherapy involves inserting a plaque with seeds of radioactive material in the patient's eye for the treatment of tumors. The radiation dose to be received by the patient is prescribed by physicians, and the time of application of the material is calculated from calibration curves supplied by the manufacturers of the plaques. To estimate the dose absorbed by the patient in a series of diagnostic tests, it is necessary to perform simulations using a computational exposure model. These models are composed primarily of an anthropomorphic phantom and a Monte Carlo code. The coupling of a whole-body voxel phantom to a Monte Carlo code is a complex process, because simulations with a computational exposure model take time and require knowledge of the code used and various adjustments to be implemented. The problem becomes even more complex when only one region of the body is to be irradiated. In this work we developed a phantom of the region containing the eyeball, derived from MASH (Male Adult voxel). This model was coupled to the Monte Carlo code EGSnrc (Electron Gamma Shower) together with an algorithm simulating an I-125 source, considering only the effect of its higher energy range

  2. Mastering cloud computing foundations and applications programming

    CERN Document Server

    Buyya, Rajkumar; Selvi, SThamarai

    2013-01-01

    Mastering Cloud Computing is designed for undergraduate students learning to develop cloud computing applications. Tomorrow's applications won't live on a single computer but will be deployed from and reside on a virtual server, accessible anywhere, any time. Tomorrow's application developers need to understand the requirements of building apps for these virtual systems, including concurrent programming, high-performance computing, and data-intensive systems. The book introduces the principles of distributed and parallel computing underlying cloud architectures and specifical

  3. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than those of serial computers. Developing large-scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop such tools are examined. The two main areas of concern were architecture independence and data management. Architecture-independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture-independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture-independent software; identifying and exploiting concurrency within the application program; data coherence; and engineering database and memory management.

  4. Iteration and Prototyping in Creating Technical Specifications.

    Science.gov (United States)

    Flynt, John P.

    1994-01-01

    Claims that the development process for computer software can be greatly aided by the writers of specifications if they employ basic iteration and prototyping techniques. Asserts that computer software configuration management practices provide ready models for iteration and prototyping. (HB)

  5. Development of the computer network of IFIN-HH

    International Nuclear Information System (INIS)

    Danet, A.; Mirica, M.; Constantinescu, S.

    1998-01-01

    The general computer network of the Horia Hulubei National Institute for Physics and Nuclear Engineering (IFIN-HH), as part of RNC (the Romanian National Computer Network for scientific research and technological development), offers the Romanian physics research community an efficient and cost-effective infrastructure to communicate and collaborate with fellow researchers abroad, and to collect and exchange the most up-to-date information in their research areas. RNC is the national project, co-ordinated and established by the Ministry of Research and Technology, targeted at the following main objectives: - setting up a technical and organizational infrastructure meant to provide national and international electronic services for the Romanian scientific research community; - providing a rapid and competitive tool for the exchange of information in the framework of the R-D community; - using the scientific and technical databases available in the country and those offered by the national networks of other countries through international networks; - providing support for information, documentation, and scientific and technical co-operation. The guiding principle in elaborating the project of the general computer network of IFIN-HH was to implement an open system based on OSI standards, without technical barriers to communication between different communities using different computing hardware and software. The major objectives achieved in 1997 in developing the general computer network of IFIN-HH (over 250 computers connected) were: - connecting all the existing and newly installed computer equipment and providing adequate connectivity; - providing the usual Internet services: e-mail, ftp, telnet, finger, gopher; - providing access to World Wide Web resources; - providing on-line statistics of the IP traffic (input and output) of each node of the domain computer network; - improving the performance of the connection with the central node of RNC. (authors)

  6. The role of computer simulation in nuclear technology development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, VV.; Ryazanov, D.K.; Tellin, A.I.

    2000-01-01

    In the report, the role and purpose of computer simulation in nuclear technology development is discussed. The authors consider such applications of computer simulation as: (a) nuclear safety research; (b) optimization of technical and economic parameters of operating nuclear plants; (c) planning and support of reactor experiments; (d) research and design of new devices and technologies; (e) design and development of 'simulators' for operating personnel training. Among the marked applications, the following aspects of computer simulation are discussed in the report: (f) neutron-physical, thermal and hydrodynamic models; (g) simulation of isotope composition change and damage dose accumulation for materials under irradiation; (h) simulation of reactor control structures. (authors)

  7. Development of Onboard Computer Complex for Russian Segment of ISS

    Science.gov (United States)

    Branets, V.; Brand, G.; Vlasov, R.; Graf, I.; Clubb, J.; Mikrin, E.; Samitov, R.

    1998-01-01

    The report presents a description of the Onboard Computer Complex (CC) that was developed during the period 1994-1998 for the Russian Segment of the ISS. The system was developed in co-operation with NASA and ESA. ESA developed a new computation system under the RSC Energia Technical Assignment, called DMS-R. The CC also includes elements developed by Russian experts and organizations. A general architecture of the computer system and the characteristics of the primary elements of this system are described. The system was integrated at RSC Energia with the participation of American and European specialists. The report contains information on software simulators and verification and debugging facilities which were developed for both stand-alone and integrated tests and verification. This CC serves as the basis for the Russian Segment Onboard Control Complex on the ISS.

  8. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de, E-mail: vagner.macedo@usp.br, E-mail: patricia@ipen.br, E-mail: delvonei@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The objective of this work is to describe the database being developed at IPEN - CNEN/SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and this access will be allowed only to previously registered professionals. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an entity-relationship model, which is based on the operational routines performed by IPEN - CNEN/SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database management system and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  9. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de

    2015-01-01

    The objective of this work is to describe the database being developed at IPEN - CNEN/SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and this access will be allowed only to previously registered professionals. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an entity-relationship model, which is based on the operational routines performed by IPEN - CNEN/SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database management system and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)
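    The modular structure the record describes (technical specifications, operating history, maintenance history and failure events keyed to components) can be sketched with SQLite stand-ins for the MySQL tables. All table and column names are invented for illustration; the actual PSADB schema is not given in this record.

```python
import sqlite3

ddl = """
CREATE TABLE component       (id INTEGER PRIMARY KEY, name TEXT, facility TEXT);
CREATE TABLE tech_spec       (component_id INTEGER REFERENCES component(id),
                              parameter TEXT, value TEXT);
CREATE TABLE maintenance_log (component_id INTEGER REFERENCES component(id),
                              date TEXT, action TEXT);
CREATE TABLE failure_event   (component_id INTEGER REFERENCES component(id),
                              date TEXT, mode TEXT, downtime_h REAL);
"""
con = sqlite3.connect(":memory:")
con.executescript(ddl)
con.execute("INSERT INTO component VALUES (1, 'primary pump', 'RR-1')")
print(con.execute("SELECT name, facility FROM component").fetchall())
```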

  10. Comparison of computed tomography based parametric and patient-specific finite element models of the healthy and metastatic spine using a mesh-morphing algorithm.

    Science.gov (United States)

    O'Reilly, Meaghan Anne; Whyne, Cari Marisa

    2008-08-01

    A comparative analysis of parametric and patient-specific finite element (FE) modeling of spinal motion segments. To develop patient-specific FE models of spinal motion segments using mesh-morphing methods applied to a parametric FE model. To compare strain and displacement patterns in parametric and morphed models for both healthy and metastatically involved vertebrae. Parametric FE models may be limited in their ability to fully represent patient-specific geometries and material property distributions. Generation of multiple patient-specific FE models has been limited because of computational expense. Morphing methods have been successfully used to generate multiple specimen-specific FE models of caudal rat vertebrae. FE models of a healthy and a metastatic T6-T8 spinal motion segment were analyzed with and without patient-specific material properties. Parametric and morphed models were compared using a landmark-based morphing algorithm. Morphing of the parametric FE model and including patient-specific material properties both had a strong impact on magnitudes and patterns of vertebral strain and displacement. Small but important geometric differences can be represented through morphing of parametric FE models. The mesh-morphing algorithm developed provides a rapid method for generating patient-specific FE models of spinal motion segments.
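    A landmark-based morph of the kind the paper applies can be sketched as a radial-basis-function displacement field fitted to source-to-target landmark pairs and applied to every template node. The kernel choice, the 2-D toy landmarks and the random mesh are assumptions, not necessarily the authors' algorithm.

```python
import numpy as np

src = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])    # template landmarks
dst = np.array([[0., 0.], [1.1, 0.], [0., 0.9], [1.2, 1.]]) # patient landmarks

def rbf_morph(nodes, src, dst, eps=1e-9):
    """Displace mesh nodes with a thin-plate-like RBF fitted to landmarks."""
    def phi(r):                  # r^2 log r kernel (thin-plate spline, 2-D)
        return np.where(r > 0, r**2 * np.log(r + eps), 0.0)
    K = phi(np.linalg.norm(src[:, None] - src[None, :], axis=-1))
    w = np.linalg.solve(K + eps * np.eye(len(src)), dst - src)  # RBF weights
    R = phi(np.linalg.norm(nodes[:, None] - src[None, :], axis=-1))
    return nodes + R @ w

nodes = np.random.rand(100, 2)   # template mesh nodes
morphed = rbf_morph(nodes, src, dst)
```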

  11. Development of a 3-dimensional seismic isolation floor for computer systems

    International Nuclear Information System (INIS)

    Kurihara, M.; Shigeta, M.; Nino, T.; Matsuki, T.

    1991-01-01

    In this paper, we investigated the applicability of a seismic isolation floor as a method for protecting computer systems from strong earthquakes, such as computer systems in nuclear power plants. Assuming that the computer system is guaranteed for 250 cm/s² of input acceleration in the horizontal and vertical directions as the seismic performance, the basic design specification of the seismic isolation floor is considered as follows. Against S1-level earthquakes, the maximum acceleration response of the seismic isolation floor in the horizontal and vertical directions is kept below 250 cm/s² to maintain continuous computer operation. Against S2-level earthquakes, the isolation floor allows large horizontal movement and large displacement of the isolation devices to reduce the acceleration response, although it is not guaranteed to be less than 250 cm/s². By reducing the acceleration response, however, serious damage to the computer systems is reduced, so that they can be restarted after an earthquake. Usually, seismic isolation floor systems permit 2-dimensional (horizontal) isolation. However, in the case of earthquakes occurring directly beneath the site, which have large vertical components, the vertical acceleration response of this system is amplified by the lateral vibration of the frame of the isolation floor. Therefore, in this study a 3-dimensional seismic isolation floor, including vertical isolation, was developed. This paper describes 1) the experimental results of the response characteristics of the 3-dimensional seismic isolation floor built as a trial, using a 3-dimensional shaking table, and 2) a comparison of a 2-dimensional analytical model, for motion in one horizontal direction and the vertical direction, with experimental results. (J.P.N.)

  12. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Soojin Park

    2015-04-01

    Cloud computing has emerged as more than just a piece of technology; it is rather a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are not subjects to own but subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo significant remodeling. This study analyzes actual cases of SaaS cloud computing environment adoption as a way to derive four new best practices for software development and incorporates the identified best practices into currently-in-use processes. Furthermore, this study presents a design for generic software development processes that implement the proposed best practices. The design for the generic process has been applied to reinforce the weak points found in the SaaS cloud service development practices used by eight enterprises currently developing or operating actual SaaS cloud computing services. Lastly, this study evaluates the applicability of the proposed SaaS-cloud-oriented development process by analyzing feedback data collected from its actual application to the development of a SaaS cloud service, Astation.

  13. Development of a Postacute Hospital Item Bank for the New Pediatric Evaluation of Disability Inventory-Computer Adaptive Test

    Science.gov (United States)

    Dumas, Helene M.

    2010-01-01

    The PEDI-CAT is a new computer adaptive test (CAT) version of the Pediatric Evaluation of Disability Inventory (PEDI). Additional PEDI-CAT items specific to postacute pediatric hospital care were recently developed using expert reviews and cognitive interviewing techniques. Expert reviews established face and construct validity, providing positive…

  14. Cloud Computing: Key to IT Development in West Africa | Nwabuonu ...

    African Journals Online (AJOL)

    It has been established that Information Technology (IT) development in West Africa has faced many challenges, ranging from cyber threats to inadequate IT infrastructure. Cloud computing is a revolution. It is creating a fundamental change in computer architecture, software and tools development, and in the way we store, ...

  15. Development of the radiosynthesis of high-specific-activity 123I-NKJ64

    International Nuclear Information System (INIS)

    Tavares, Adriana Alexandre S.; Jobson, Nicola K.; Dewar, Deborah; Sutherland, Andrew; Pimlott, Sally L.

    2011-01-01

    Introduction: 123I-NKJ64, a reboxetine analogue, is currently under development as a potential novel single photon emission computed tomography radiotracer for imaging the noradrenaline transporter in the brain. This study describes the development of the radiosynthesis of 123I-NKJ64, highlighting the advantages and disadvantages, pitfalls and solutions encountered while developing the final radiolabelling methodology. Methods: The synthesis of 123I-NKJ64 was evaluated using an electrophilic iododestannylation method, where a Boc-protected trimethylstannyl precursor was radioiodinated using peracetic acid as an oxidant and deprotection was investigated using either trifluoroacetic acid (TFA) or 2 M hydrochloric acid (HCl). Results: Radioiodination of the Boc-protected trimethylstannyl precursor was achieved with an incorporation yield of 92±6%. Deprotection with 2 M HCl produced 123I-NKJ64 with the highest radiochemical yield of 98.05±1.63%, compared with 83.95±13.24% with TFA. However, the specific activity of the obtained 123I-NKJ64 was lower when 2 M HCl (0.15±0.23 Ci/μmol) was used as the deprotecting agent in comparison to TFA (1.76±0.60 Ci/μmol). Further investigation of the 2 M HCl methodology found a by-product, identified as the deprotected proto-destannylated precursor, which co-eluted with 123I-NKJ64 during the high-performance liquid chromatography (HPLC) purification. Conclusions: The radiosynthesis of 123I-NKJ64 was achieved with a good isolated radiochemical yield of 68% and a high specific activity of 1.8 Ci/μmol. TFA was found to be the most suitable deprotecting agent, since 2 M HCl generated a by-product that could not be fully separated from 123I-NKJ64 using the HPLC methodology investigated. This study highlights the importance of HPLC purification and accurate measurement of specific activity while developing new radiosynthesis methodologies.

  16. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.
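    The generator's job, walking the CD contents and emitting one transfer script per CSCI subdirectory, can be re-created in outline. The sketch below is a hypothetical Python rendering of the K-shell original; the mount point, the output layout and the ftp_to_pcs command are invented stand-ins.

```python
from pathlib import Path

def generate_transfer_scripts(cd_root: Path, out_dir: Path) -> None:
    """Emit one transfer script per CSCI subdirectory found on the CD."""
    out_dir.mkdir(exist_ok=True)
    for csci_dir in sorted(p for p in cd_root.iterdir() if p.is_dir()):
        lines = ["#!/bin/ksh",
                 f"# transfer CSCI '{csci_dir.name}' to the PCS scratch directory"]
        for f in sorted(csci_dir.rglob("*")):
            if f.is_file():
                dest = f"/scratch/{csci_dir.name}/{f.relative_to(csci_dir)}"
                lines.append(f"ftp_to_pcs {f} {dest}")   # invented transfer command
        script = out_dir / f"transfer_{csci_dir.name}.ksh"
        script.write_text("\n".join(lines) + "\n")
        script.chmod(0o755)

cd_root = Path("/media/flight_sw_cd")        # placeholder mount point
if cd_root.exists():
    generate_transfer_scripts(cd_root, Path("./transfer_scripts"))
```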

  17. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2012-01-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135

  18. Development of computer aided engineering system for TRAC applications

    International Nuclear Information System (INIS)

    Arai, Kenji; Itoya, Seihiro; Uematsu, Hitoshi; Tsunoyama, Shigeaki

    1990-01-01

    An advanced best-estimate computer program for nuclear reactor transient analysis, TRAC, has been extensively used to carry out various thermal-hydraulic calculations in the nuclear engineering field because of its versatility. To perform a wide variety of TRAC calculations efficiently, effective utilization of computers and a convenient environment for input and output processing are necessary. We have applied a computer network comprising a supercomputer, engineering workstations and personal computers to TRAC calculations and have assigned the appropriate functions to each computer. We have also been developing an interactive graphics system for input and output processing on an EWS. This hardware and software environment can improve the effectiveness of TRAC utilization for various thermal-hydraulic calculations. (author)

  19. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    International Nuclear Information System (INIS)

    Glasscock, J.A.

    1995-01-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies

  20. Development of computer-aided auto-ranging technique for a computed radiography system

    International Nuclear Information System (INIS)

    Ishida, M.; Shimura, K.; Nakajima, N.; Kato, H.

    1988-01-01

    For a computed radiography system, the authors developed a computer-aided autoranging technique in which the clinically useful image data are automatically mapped to the available display range. The preread image data are inspected to determine the location of collimation. A histogram of the pixels inside the collimation is evaluated regarding characteristic values such as maxima and minima, and then the optimal density and contrast are derived for the display image. The effect of the autoranging technique was investigated at several hospitals in Japan. The average rate of films lost due to undesirable density or contrast was about 0.5%
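
    The abstract describes the algorithm only at a high level; a minimal sketch of such histogram-based auto-ranging might look as follows, where the percentile choices and the 8-bit display range are assumptions rather than values from the paper:

    ```python
    import numpy as np

    def auto_range(preread, collimation_mask, lo_pct=1.0, hi_pct=99.0):
        """Map the clinically useful pixel values inside the collimated
        field onto the available display range (assumed 8-bit here)."""
        pixels = preread[collimation_mask]                  # ignore area outside collimation
        lo, hi = np.percentile(pixels, [lo_pct, hi_pct])    # robust histogram minima/maxima
        scaled = (preread.astype(float) - lo) / max(hi - lo, 1e-9)
        return np.clip(scaled * 255.0, 0.0, 255.0).astype(np.uint8)
    ```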

  1. SPECIFICITY IN DEVELOPMENT OF CONSTRUCTION INDUSTRY

    Directory of Open Access Journals (Sweden)

    O. S. Golubova

    2012-01-01

    Specificity in the development of the construction industry of the Republic of Belarus determines the character of competition in the construction market and shapes the pricing, marketing and product policies of building companies. Construction is a highly developed complex in which the interaction of business entities is of a rather complicated, multilateral character.

  2. ANS main control complex three-dimensional computer model development

    International Nuclear Information System (INIS)

    Cleaves, J.E.; Fletcher, W.M.

    1993-01-01

    A three-dimensional (3-D) computer model of the Advanced Neutron Source (ANS) main control complex is being developed. The main control complex includes the main control room, the technical support center, the materials irradiation control room, computer equipment rooms, communications equipment rooms, cable-spreading rooms, and some support offices and breakroom facilities. The model will be used to provide facility designers and operations personnel with capabilities for fit-up/interference analysis, visual ''walk-throughs'' for optimizing maintainability, and human factors and operability analyses. It will be used to determine performance design characteristics, to generate construction drawings, and to integrate control room layout, equipment mounting, grounding equipment, electrical cabling, and utility services into ANS building designs. This paper describes the development of the initial phase of the 3-D computer model for the ANS main control complex and plans for its development and use.

  3. Formal Specification and Analysis of Cloud Computing Management

    Science.gov (United States)

    2012-01-24

    We begin this introduction to Cloud Computing with a famous quote by Larry Ellison, Oracle CEO [106], on how the term found its way into "the wording of some of our ads." In view of this statement, we summarize the essential aspects of Cloud Computing.

  4. Present status of computational tools for maglev development

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z.; Chen, S.S.; Rote, D.M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report. 25 refs.

  5. Developing and validating an instrument for measuring mobile computing self-efficacy.

    Science.gov (United States)

    Wang, Yi-Shun; Wang, Hsiu-Yuan

    2008-08-01

    IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.

  6. The effect of inlet waveforms on computational hemodynamics of patient-specific intracranial aneurysms.

    Science.gov (United States)

    Xiang, J; Siddiqui, A H; Meng, H

    2014-12-18

    Due to the lack of patient-specific inlet flow waveform measurements, most computational fluid dynamics (CFD) simulations of intracranial aneurysms employ waveforms that are not patient-specific as inlet boundary conditions for the computational model. The current study examined how this assumption affects the predicted hemodynamics in patient-specific aneurysm geometries. We examined wall shear stress (WSS) and oscillatory shear index (OSI), the two most widely studied hemodynamic quantities that have been shown to predict aneurysm rupture, as well as maximal WSS (MWSS), energy loss (EL) and pressure loss coefficient (PLc). Sixteen pulsatile CFD simulations were carried out on four typical saccular aneurysms, using four different waveforms and an identical mean inflow rate as inlet boundary conditions. Our results demonstrated that under the same mean inflow rate, different waveforms produced almost identical WSS distributions and WSS magnitudes, similar OSI distributions but drastically different OSI magnitudes. The OSI magnitude is correlated with the pulsatility index of the waveform. Furthermore, there is a linear relationship between aneurysm-averaged OSI values calculated from one waveform and those calculated from another waveform. In addition, different waveforms produced similar MWSS, EL and PLc in each aneurysm. In conclusion, the inlet waveform has minimal effects on WSS, OSI distribution, MWSS, EL and PLc, and a strong effect on OSI magnitude; however, aneurysm-averaged OSI values from different waveforms have a strong linear correlation with each other across different aneurysms, indicating that for the same aneurysm cohort, different waveforms can consistently stratify (rank) the OSI of aneurysms.
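
    For context, the oscillatory shear index used above is conventionally defined from the time-varying wall shear stress vector τ_w over a cardiac cycle of period T (a standard definition, not quoted from this paper):

    ```latex
    \mathrm{OSI}
      = \frac{1}{2}\left(1 - \frac{\left|\int_{0}^{T}\vec{\tau}_{w}\,dt\right|}
                                  {\int_{0}^{T}\left|\vec{\tau}_{w}\right|dt}\right),
    \qquad
    \overline{\mathrm{WSS}} = \frac{1}{T}\int_{0}^{T}\left|\vec{\tau}_{w}\right|dt .
    ```

    OSI ranges from 0 for unidirectional shear to 0.5 for fully oscillatory shear, which helps explain the finding above: time-averaged WSS is dominated by the mean inflow rate, whereas OSI is driven by the pulsatility of the waveform.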

  7. Development of the voxel computational phantoms of pediatric patients and their application to organ dose assessment

    Science.gov (United States)

    Lee, Choonik

    A series of realistic voxel computational phantoms of pediatric patients was developed and then used for radiation risk assessment in various exposure scenarios. High-resolution computed tomographic images of live patients were utilized for the development of the five voxel phantoms of pediatric patients: 9-month male, 4-year female, 8-year female, 11-year male, and 14-year male. The phantoms were first developed as head and torso phantoms and then extended into whole-body phantoms by utilizing computed tomographic images of a healthy adult volunteer. The whole-body phantom series was modified to have the same anthropometrics as the most recent reference data reported by the International Commission on Radiological Protection. The phantoms, named the University of Florida series B, are the first complete set of pediatric voxel phantoms having reference organ masses and total heights. As part of the dosimetry study, an investigation of skeletal tissue dosimetry methods was performed for better understanding of the radiation dose to the active bone marrow and bone endosteum. All of the currently available methodologies were inter-compared and benchmarked against the paired-image radiation transport model. The dosimetric characteristics of the phantoms were investigated using Monte Carlo simulation of broad parallel beams of external photons in anterior-posterior, posterior-anterior, left lateral, right lateral, rotational, and isotropic geometries. Organ dose conversion coefficients were calculated for an extensive range of photon energies and compared with those of the conventional stylized pediatric phantoms of Oak Ridge National Laboratory. Multi-slice helical computed tomography exams were simulated using a Monte Carlo simulation code for various exam protocols: head, chest, abdomen, pelvis, and chest-abdomen-pelvis studies. Results have provided realistic estimates of the effective doses for frequently used protocols in pediatric radiology. The results were very

  8. Computer Training for Entrepreneurial Meteorologists.

    Science.gov (United States)

    Koval, Joseph P.; Young, George S.

    2001-05-01

    Computer applications of increasing diversity form a growing part of the undergraduate education of meteorologists in the early twenty-first century. The advent of the Internet economy, as well as a waning demand for traditional forecasters brought about by better numerical models and statistical forecasting techniques, has greatly increased the need for operational and commercial meteorologists to acquire computer skills beyond the traditional techniques of numerical analysis and applied statistics. Specifically, students with the skills to develop data distribution products are in high demand in the private-sector job market. Meeting these demands requires greater breadth, depth, and efficiency in computer instruction. The authors suggest that computer instruction for undergraduate meteorologists should include three key elements: a data distribution focus, emphasis on the techniques required to learn computer programming on an as-needed basis, and a project orientation to promote management skills and support student morale. In an exploration of this approach, the authors have reinvented the Applications of Computers to Meteorology course in the Department of Meteorology at The Pennsylvania State University to teach computer programming within the framework of an Internet product development cycle. Because the computer skills required for data distribution programming change rapidly, specific languages are valuable for only a limited time. A key goal of this course was therefore to help students learn how to retrain efficiently as technologies evolve. The crux of the course was a semester-long project during which students developed an Internet data distribution product. As project management skills are also important in the job market, the course teamed students in groups of four for this product development project. The successes, failures, and lessons learned from this experiment are discussed, and conclusions are drawn concerning undergraduate instructional methods.

  9. Experimental and computational development of a natural breast phantom for dosimetry studies

    International Nuclear Information System (INIS)

    Nogueira, Luciana B.; Campos, Tarcisio P.R.

    2013-01-01

    This paper describes the experimental and computational development of an anthropomorphic, anthropometric natural breast phantom for dosimetry studies in breast brachytherapy and teletherapy. The natural breast phantom corresponds to the fibroadipose breasts of women aged 30 to 50 years, presenting medium radiographic density. The experimental breast phantom was constituted of three tissue equivalents (TEs): glandular TE, adipose TE and skin TE. These TEs were developed according to the chemical composition of the human breast and present a radiological response to exposure. Once constructed, the experimental breast phantom was mounted on a thorax phantom previously developed by the NRI/UFMG research group. The computational breast phantom was then constructed by performing computed tomography (CT) in axial slices of the thorax phantom. From the images generated by CT, a voxel model of the thorax phantom was developed with the SISCODES computational program, the computational breast phantom being represented by the same TEs as the experimental breast phantom. The images generated by CT allowed the radiological equivalence of the tissues to be evaluated. The breast phantom is being used in experimental dosimetry studies in both brachytherapy and teletherapy of the breast. Dosimetry studies with the MCNP-5 code using the computational model of the breast phantom are in progress. (author)

  10. Development of an improved approach to radiation treatment therapy using high-definition patient-specific voxel phantoms

    International Nuclear Information System (INIS)

    Ward, R.C.; Ryman, J.C.; Worley, B.A.; Stallings, D.C.

    1998-01-01

    Through an internally funded project at Oak Ridge National Laboratory, a high-resolution phantom was developed based on the National Library of Medicine's Visible Human Data. Special software was written using the interactive data language (IDL) visualization language to automatically segment and classify some of the organs and the skeleton of the Visible Male. A high definition phantom consisting of nine hundred 512 x 512 slices was constructed of the entire torso. Computed tomography (CT) images of a patient's tumor near the spine were scaled and morphed into the phantom model to create a patient-specific phantom. Calculations of dose to the tumor and surrounding tissue were then performed using the patient-specific phantom
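
    The record notes that the segmentation was written in IDL; as a rough Python illustration of the automatic threshold-based segmentation and labeling step it describes (the CT-number threshold and size cutoff are illustrative, not from the report):

    ```python
    import numpy as np
    from scipy import ndimage

    def segment_skeleton(volume_hu, bone_threshold=300, min_voxels=500):
        """Classify voxels above a CT-number threshold as skeleton and
        keep only sizeable connected components (drops isolated specks)."""
        mask = volume_hu > bone_threshold          # crude bone/soft-tissue split
        labels, n = ndimage.label(mask)            # 3-D connected components
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        keep_ids = np.nonzero(sizes > min_voxels)[0] + 1
        return np.isin(labels, keep_ids)

    # volume = np.load("visible_male_ct.npy")     # hypothetical 512 x 512 x N HU volume
    # skeleton_mask = segment_skeleton(volume)
    ```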

  11. Development and implementation of a low-cost phantom for quality control in cone beam computed tomography

    International Nuclear Information System (INIS)

    Batista, W. O.; Navarro, M. V. T.; Maia, A. F.

    2013-01-01

    A phantom for quality control in cone beam computed tomography (CBCT) scanners was designed and constructed, and a methodology for testing was developed. The phantom had a polymethyl methacrylate structure filled with water and plastic objects that allowed the assessment of parameters related to quality control. The phantom allowed the evaluation of essential parameters in CBCT as well as the evaluation of linear and angular dimensions. The plastics used in the phantom were chosen so that their density and linear attenuation coefficient were similar to those of human facial structures. Three types of CBCT equipment, with two different technological concepts, were evaluated. The results of the assessment of the accuracy of linear and angular dimensions agreed with the existing standards. However, other parameters such as computed tomography number accuracy, uniformity and high-contrast detail did not meet the tolerances established in current regulations or the manufacturer's specifications. The results demonstrate the importance of establishing specific protocols and phantoms, which meet the specificities of CBCT. The practicality of implementation, the quality control test results for the proposed phantom and the consistency of the results using different equipment demonstrate its adequacy. (authors)

  12. A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions.

    Science.gov (United States)

    Kashihara, Koji

    2014-01-01

    Unlike assistive technology for verbal communication, the brain-machine or brain-computer interface (BMI/BCI) has not been established as a non-verbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, therefore, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based non-verbal communication tools. To these ends, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified cortical activities. The conditioned neutral face-triggered event-related potentials that originated from the posterior temporal lobe statistically significantly changed during late face processing (600-700 ms) after stimulus, rather than in early face processing activities, such as P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus (FG). This study also developed an efficient method for detecting implicit negative emotional responses to specific faces by using EEG signals. A classification method based on a support vector machine enables the easy classification of neutral faces that trigger specific individual emotions. In
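
    The classifier details go beyond what the abstract reports; the following is a generic sketch of the approach it names, support-vector classification of single-trial ERP features, with entirely hypothetical feature shapes and placeholder data:

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical single-trial features: mean amplitudes in the 600-700 ms
    # window (the late face-processing interval the study highlights),
    # one row per trial, one column per posterior-temporal electrode.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 16))        # 200 trials x 16 channels (placeholder data)
    y = rng.integers(0, 2, size=200)      # 1 = conditioned (negative) face, 0 = neutral

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(clf, X, y, cv=5)
    print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
    ```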

  13. Development of a UNIX network compatible reactivity computer

    International Nuclear Information System (INIS)

    Sanchez, R.F.; Edwards, R.M.

    1996-01-01

    A state-of-the-art UNIX network compatible controller and a UNIX host workstation with MATLAB/SIMULINK software were used to develop, implement, and validate a digital reactivity calculation. An objective of the development was to determine why the reactivity output of an earlier Macintosh-based reactivity computer drifted intolerably.
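
    The abstract does not spell out the calculation itself; digital reactivity meters conventionally evaluate the inverse point-kinetics equations from the measured neutron density n(t) (the standard formulation is given here as background, not quoted from this paper):

    ```latex
    \rho(t) = \beta
            + \frac{\Lambda}{n(t)}\frac{dn}{dt}
            - \frac{\Lambda}{n(t)}\sum_{i=1}^{6}\lambda_{i}C_{i}(t),
    \qquad
    \frac{dC_{i}}{dt} = \frac{\beta_{i}}{\Lambda}\,n(t) - \lambda_{i}C_{i}(t),
    ```

    where β = Σβ_i is the total delayed-neutron fraction, Λ the prompt-neutron generation time, and λ_i, C_i the decay constants and concentrations of the six delayed-neutron precursor groups; the precursor equations are integrated numerically alongside the measured n(t).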

  14. Development of computed tomography system and image reconstruction algorithm

    International Nuclear Information System (INIS)

    Khairiah Yazid; Mohd Ashhar Khalid; Azaman Ahmad; Khairul Anuar Mohd Salleh; Ab Razak Hamzah

    2006-01-01

    Computed tomography is one of the most advanced and powerful nondestructive inspection techniques and is currently used in many different industries. In several CT systems, detection has been performed by a combination of an X-ray image intensifier and a charge-coupled device (CCD) camera, or by using a line-array detector. The recent development of the X-ray flat panel detector has made fast CT imaging feasible and practical. This paper therefore describes the arrangement of a new detection system, which uses the existing high-resolution (127 μm pixel size) flat panel detector at MINT, and the image reconstruction technique developed. The aim of the project is to develop a prototype flat-panel-detector-based CT imaging system for NDE. The prototype consists of an X-ray tube, a flat panel detector system, a rotation table and a computer system to control the sample motion and image acquisition. The project is therefore divided into two major tasks: first, to develop the image reconstruction algorithm, and second, to integrate the X-ray imaging components into one CT system. An image reconstruction algorithm using the filtered back-projection method was developed and compared to other techniques. MATLAB is the tool used for the simulations and computations in this project. (Author)
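
    The filtered back-projection method named above admits a compact parallel-beam sketch; the following minimal Python version (ideal ramp filter, linear interpolation) is a generic illustration, not the MATLAB implementation described in the record:

    ```python
    import numpy as np

    def ramp_filter(sinogram):
        # Apply an ideal ramp filter |f| to each projection (row) in the
        # Fourier domain: the "filtered" half of filtered back-projection.
        n = sinogram.shape[1]
        filt = np.abs(np.fft.fftfreq(n))
        return np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * filt, axis=1))

    def backproject(filtered, angles_deg):
        # Smear each filtered projection back across the image plane and sum.
        n_angles, size = filtered.shape
        xs = np.arange(size) - size / 2.0
        X, Y = np.meshgrid(xs, xs)
        recon = np.zeros((size, size))
        for row, theta in zip(filtered, np.deg2rad(angles_deg)):
            t = X * np.cos(theta) + Y * np.sin(theta) + size / 2.0
            recon += np.interp(t.ravel(), np.arange(size), row).reshape(size, size)
        return recon * np.pi / (2.0 * n_angles)

    # Usage: sinogram has one row per view angle, one column per detector bin.
    # reconstruction = backproject(ramp_filter(sinogram), angles_deg=np.arange(0, 180))
    ```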

  15. Beyond computer literacy: supporting youth's positive development through technology.

    Science.gov (United States)

    Bers, Marina Umaschi

    2010-01-01

    In a digital era in which technology plays a role in most aspects of a child's life, having the competence and confidence to use computers might be a necessary step, but not a goal in itself. Developing character traits that will serve children to use technology in a safe way to communicate and connect with others, and providing opportunities for children to make a better world through the use of their computational skills, is just as important. The Positive Technological Development framework (PTD), a natural extension of the computer literacy and the technological fluency movements that have influenced the world of educational technology, adds psychosocial, civic, and ethical components to the cognitive ones. PTD examines the developmental tasks of a child growing up in our digital era and provides a model for developing and evaluating technology-rich youth programs. The explicit goal of PTD programs is to support children in the positive uses of technology to lead more fulfilling lives and make the world a better place. This article introduces the concept of PTD and presents examples of the Zora virtual world program for young people that the author developed following this framework.

  16. A reliable and valid questionnaire was developed to measure computer vision syndrome at the workplace.

    Science.gov (United States)

    Seguí, María del Mar; Cabrero-García, Julio; Crespo, Ana; Verdú, José; Ronda, Elena

    2015-06-01

    To design and validate a questionnaire to measure visual symptoms related to exposure to computers in the workplace. Our computer vision syndrome questionnaire (CVS-Q) was based on a literature review and validated through discussion with experts and performance of a pretest, pilot test, and retest. Content validity was evaluated by occupational health, optometry, and ophthalmology experts. Rasch analysis was used in the psychometric evaluation of the questionnaire. Criterion validity was determined by calculating the sensitivity and specificity, the receiver operating characteristic curve, and the cutoff point. Test-retest repeatability was assessed using the intraclass correlation coefficient (ICC) and concordance by Cohen's kappa (κ). The CVS-Q was developed with wide consensus among experts and was well accepted by the target group. It assesses the frequency and intensity of 16 symptoms using a single rating scale (symptom severity) that fits the Rasch rating scale model well. The questionnaire has sensitivity and specificity over 70% and achieved good test-retest repeatability both for the scores obtained [ICC = 0.802; 95% confidence interval (CI): 0.673, 0.884] and for CVS classification (κ = 0.612; 95% CI: 0.384, 0.839). The CVS-Q has acceptable psychometric properties, making it a valid and reliable tool to control the visual health of computer workers, and can potentially be used in clinical trials and outcome research.
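
    The agreement statistics quoted above have standard forms; Cohen's kappa, for instance, compares observed agreement p_o with the agreement expected by chance p_e:

    ```latex
    \kappa = \frac{p_{o} - p_{e}}{1 - p_{e}}
    ```

    By the common Landis-Koch benchmarks, the reported κ = 0.612 falls just inside the "substantial agreement" band (0.61-0.80).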

  17. Development of a computational methodology for internal dose calculations

    International Nuclear Information System (INIS)

    Yoriyaz, Helio

    2000-01-01

    A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body and a more precise tool for radiation transport simulation. The present technique shows the capability to build a patient-specific phantom from tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as in the MCNP-4B code. In order to utilize the segmented human anatomy as a computational model for the simulation of radiation transport, an interface program, SCMS, was developed to build the geometric configurations for the phantom through the use of tomographic images. This procedure allows the calculation not only of average dose values but also of the spatial distribution of dose in regions of interest. With the present methodology, absorbed fractions for photons and electrons in various organs of the Zubal segmented phantom were calculated and compared to those reported for the mathematical phantoms of Snyder and Cristy-Eckerman. Although the differences in organ geometry between the phantoms are quite evident, the results generally show small discrepancies; in some cases, however, considerable discrepancies were found, due to two major causes: differences in the organ masses between the phantoms and the occurrence of organ overlap in the Zubal segmented phantom, which is not considered in the mathematical phantom. This effect was quite evident for organ cross-irradiation from electrons. The determination of spatial dose distributions demonstrated the possibility of evaluating more detailed dose data than those obtained with conventional methods, which will give important information for clinical analysis in therapeutic procedures and in radiobiologic studies of the human body. (author)
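
    The absorbed fractions mentioned above feed internal dose estimates through the standard MIRD formalism (shown here for orientation in the conventional notation, not quoted from the thesis):

    ```latex
    D(r_{T}) = \sum_{r_{S}} \tilde{A}(r_{S})\, S(r_{T} \leftarrow r_{S}),
    \qquad
    S(r_{T} \leftarrow r_{S}) = \frac{1}{m_{r_{T}}} \sum_{i} E_{i}\, Y_{i}\,
    \phi(r_{T} \leftarrow r_{S}; E_{i}),
    ```

    where Ã is the cumulated activity in a source region, m_{r_T} the target mass, E_i and Y_i the energies and yields of the emissions, and φ the absorbed fraction that the voxel-based Monte Carlo calculation provides.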

  18. Development of industrial variant specification systems

    DEFF Research Database (Denmark)

    Hansen, Benjamin Loer

    be developed from a holistic and strategically anchored point of view. Another assumption is that this is a challenge for many industrial companies. Even though the literature presents many considerations on general issues covering new information technology, little work is found on the business perspectives...... are discussed. A list of structural variables and solution components has been created. These are related to four design aspects in the holistic system design covering the aspects of process design, selection of resources (such as hardware, software and humans), the design of information structures...... solution elements and structural variables to be used in the design of variant specification systems. The thesis presents a “top-down” procedure to be used to develop variant specification systems from a strategically anchored and holistic point of view. A methodology and related task variables...

  19. NWChem Meeting on Science Driven Petascale Computing and Capability Development at EMSL

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.

    2007-02-19

    On January 25, and 26, 2007, an NWChem meeting was held that was attended by 65 scientists from 29 institutions including 22 universities and 5 national laboratories. The goals of the meeting were to look at major scientific challenges that could be addressed by computational modeling in environmental molecular sciences, and to identify the associated capability development needs. In addition, insights were sought into petascale computing developments in computational chemistry. During the meeting common themes were identified that will drive the need for the development of new or improved capabilities in NWChem. Crucial areas of development that the developer's team will be focusing on are (1) modeling of dynamics and kinetics in chemical transformations, (2) modeling of chemistry at interfaces and in the condensed phase, and (3) spanning longer time scales in biological processes modeled with molecular dynamics. Various computational chemistry methodologies were discussed during the meeting, which will provide the basis for the capability developments in the near or long term future of NWChem.

  20. An Application Development Platform for Neuromorphic Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dean, Mark [University of Tennessee (UT); Chan, Jason [University of Tennessee (UT); Daffron, Christopher [University of Tennessee (UT); Disney, Adam [University of Tennessee (UT); Reynolds, John [University of Tennessee (UT); Rose, Garrett [University of Tennessee (UT); Plank, James [University of Tennessee (UT); Birdwell, John Douglas [University of Tennessee (UT); Schuman, Catherine D [ORNL

    2016-01-01

    Dynamic Adaptive Neural Network Arrays (DANNAs) are neuromorphic computing systems developed as a hardware-based approach to the implementation of neural networks. They feature highly adaptive and programmable structural elements, which model artificial neural networks with spiking behavior. We design them to solve problems using evolutionary optimization. In this paper, we highlight the current hardware and software implementations of DANNA, including their features, functionalities and performance. We then describe the development of an Application Development Platform (ADP) to support efficient application implementation and testing of DANNA-based solutions. We conclude with future directions.

  1. Computer-aided system design

    Science.gov (United States)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  2. Craniofacial reconstruction using patient-specific implants polyether ether ketone with computer-assisted planning.

    Science.gov (United States)

    Manrique, Oscar J; Lalezarzadeh, Frank; Dayan, Erez; Shin, Joseph; Buchbinder, Daniel; Smith, Mark

    2015-05-01

    Reconstruction of bony craniofacial defects requires precise understanding of the anatomic relationships. The ideal reconstructive technique should be fast as well as economical, with minimal donor-site morbidity, and provide a lasting and aesthetically pleasing result. There are some circumstances in which a patient's own tissue is not sufficient to reconstruct defects. The development of sophisticated software has facilitated the manufacturing of patient-specific implants (PSIs). The aim of this study was to analyze the utility of polyether ether ketone (PEEK) PSIs for craniofacial reconstruction. We performed a retrospective chart review from July 2009 to July 2013 of patients who underwent craniofacial reconstruction with PEEK PSIs manufactured through a virtual process based on computer-aided design and computer-aided manufacturing. A total of 6 patients were identified. The mean age was 46 years (16-68 y). Operative indications included cancer (n = 4), congenital deformities (n = 1), and infection (n = 1). The mean surgical time was 3.7 hours and the mean hospital stay was 1.5 days. The mean surface area of the defect was 93.4 ± 43.26 cm², the mean implant cost was $8493 ± $837.95, and the mean time required to manufacture the implants was 2 weeks. No major or minor complications were seen during the 4-year follow-up. We found PEEK implants to be useful in the reconstruction of complex calvarial defects, demonstrating a low complication rate, good outcomes, and high patient satisfaction in this small series of patients. Polyether ether ketone implants show promising potential and warrant further study to better establish the role of this technology in cranial reconstruction.

  3. Is Neural Activity Detected by ERP-Based Brain-Computer Interfaces Task Specific?

    Directory of Open Access Journals (Sweden)

    Markus A Wenzel

    Brain-computer interfaces (BCIs) that are based on event-related potentials (ERPs) can estimate to which stimulus a user pays particular attention. In typical BCIs, the user silently counts the selected stimulus (which is repeatedly presented among other stimuli) in order to focus the attention. The stimulus of interest is then inferred from the electroencephalogram (EEG). Detecting attention allocation implicitly could also be beneficial for human-computer interaction (HCI), because it would allow software to adapt to the user's interest. However, a counting task would be inappropriate for the envisaged implicit application in HCI. Therefore, the question was addressed whether the detectable neural activity is specific to silent counting, or whether it can also be evoked by other tasks that direct the attention to certain stimuli. Thirteen people performed a silent counting, an arithmetic and a memory task. The tasks required the subjects to pay particular attention to target stimuli of a random color. The stimulus presentation was the same in all three tasks, which allowed a direct comparison of the experimental conditions. Classifiers that were trained to detect the targets in one task, according to patterns present in the EEG signal, could detect targets in all other tasks (irrespective of some task-related differences in the EEG). The neural activity detected by the classifiers is not strictly task specific but can be generalized over tasks and is presumably a result of the attention allocation or of the augmented workload. The results may hold promise for the transfer of classification algorithms from BCI research to implicit relevance detection in HCI.

  4. ATLAS computing activities and developments in the Italian Grid cloud

    International Nuclear Information System (INIS)

    Rinaldi, L; Ciocca, C; K, M; Annovi, A; Antonelli, M; Martini, A; Barberis, D; Brunengo, A; Corosu, M; Barberis, S; Carminati, L; Campana, S; Di, A; Capone, V; Carlino, G; Doria, A; Esposito, R; Merola, L; De, A; Luminari, L

    2012-01-01

    The large amount of data produced by the ATLAS experiment needs new computing paradigms for data processing and analysis, which involve many computing centres spread around the world. The computing workload is managed by regional federations, called “clouds”. The Italian cloud consists of a main (Tier-1) center, located in Bologna, four secondary (Tier-2) centers, and a few smaller (Tier-3) sites. In this contribution we describe the Italian cloud facilities and the activities of data processing, analysis, simulation and software development performed within the cloud, and we discuss the tests of the new computing technologies contributing to evolution of the ATLAS Computing Model.

  5. An assessment of future computer system needs for large-scale computation

    Science.gov (United States)

    Lykos, P.; White, J.

    1980-01-01

    Data ranging from specific computer capability requirements to opinions about the desirability of a national computer facility are summarized. It is concluded that considerable attention should be given to improving the user-machine interface. Otherwise, increased computer power may not improve the overall effectiveness of the machine user. Significant improvement in throughput requires highly concurrent systems plus the willingness of the user community to develop problem solutions for that kind of architecture. An unanticipated result was the expression of need for an on-going cross-disciplinary users group/forum in order to share experiences and to more effectively communicate needs to the manufacturers.

  6. DEVELOPMENT OF COMPUTER AIDED DESIGN OF CHAIN COUPLING

    Directory of Open Access Journals (Sweden)

    Sergey Aleksandrovich Sergeev

    2015-12-01

    The present paper describes the development stages of computer-aided design of chain couplings. The first stage is the automation of traditional design techniques (intermediate automation). The second is integrated automation, with the development of automated equipment and production technology, including those based on flexible manufacturing systems (a high level of automation).

  7. Computational structural mechanics for engine structures

    Science.gov (United States)

    Chamis, C. C.

    1989-01-01

    The computational structural mechanics (CSM) program at Lewis encompasses: (1) fundamental aspects of formulating and solving structural mechanics problems, and (2) development of integrated software systems to computationally simulate the performance/durability/life of engine structures. It is structured mainly to supplement, complement, and whenever possible replace costly experimental efforts, which are unavoidable during engineering research and development programs. Specific objectives include: investigating the unique advantages of parallel and multiprocessor architectures for reformulating and solving structural mechanics problems and for formulating and solving multidisciplinary mechanics problems; and developing integrated structural system computational simulators for predicting structural performance, evaluating newly developed methods, and identifying and prioritizing improved or missing methods. Herein the CSM program is summarized with emphasis on the Engine Structures Computational Simulator (ESCS). Typical results obtained using ESCS are described to illustrate its versatility.

  8. Computer stress study of bone with computed tomography

    International Nuclear Information System (INIS)

    Linden, M.J.; Marom, S.A.; Linden, C.N.

    1986-01-01

    A computer processing tool has been developed which, together with a finite element program, determines the stress-deformation pattern in a long bone, utilizing Computed Tomography (CT) data files for the geometry and radiographic density information. The geometry, together with mechanical properties and boundary conditions (loads and displacements), comprises the input of the finite element (FE) computer program. The output of the program is the stresses and deformations in the bone. The processor is capable of developing an accurate three-dimensional finite element model from a scanned human long bone, owing to the high pixel resolution of CT and the local mechanical properties determined from the radiographic densities of the scanned bone. The processor, together with the finite element program, serves first as an analysis tool for improved understanding of bone function and remodelling. In this first stage, actual long bones may be scanned and analyzed under applied loads and displacements determined from existing gait analyses. The stress-deformation patterns thus obtained may be used for studying the biomechanical behavior of particular long bones, such as bones with implants and bones with osteoporosis. As a second stage, this processor may serve as a diagnostic tool for analyzing the biomechanical response of a specific patient's long bone under applied loading, by utilizing a CT data file of the specific bone as input to the processor with the FE program.
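
    The abstract does not state how radiographic density is converted to mechanical properties; a common approach in CT-based finite element modeling is an empirical density-modulus power law, sketched below. The calibration constants and power-law coefficients are illustrative assumptions, not values from this study:

    ```python
    import numpy as np

    def hu_to_density(hu, rho_water=1.0):
        # Linear calibration from CT number to apparent density (g/cm^3);
        # the slope/offset would normally come from a calibration phantom.
        return np.clip(rho_water * (1.0 + hu / 1000.0), 0.01, None)

    def density_to_modulus(rho, a=6850.0, b=1.49):
        # Empirical power law E = a * rho**b (MPa); the coefficients are
        # representative of published bone relations, not this paper's.
        return a * rho**b

    hu_elements = np.array([150.0, 600.0, 1200.0])  # mean HU per finite element (toy values)
    E = density_to_modulus(hu_to_density(hu_elements))
    print(E)  # element-wise Young's moduli in MPa, ready to assign to the FE model
    ```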

  9. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

    2012-12-01

    The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry and software venders to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.

  10. Computing for Lattice QCD: new developments from the APE experiment

    Energy Technology Data Exchange (ETDEWEB)

    Ammendola, R [INFN, Sezione di Roma Tor Vergata, Roma (Italy); Biagioni, A; De Luca, S [INFN, Sezione di Roma, Roma (Italy)

    2008-06-15

    As Lattice QCD develops improved techniques to shed light on new physics, it demands increasing computing power. The aim of the current APE (Array Processor Experiment) project is to provide the reference computing platform to the Lattice QCD community for the period 2009-2011. We present the project proposal for a petaflops-range supercomputing center with high performance and low maintenance costs, to be delivered starting from 2010.

  12. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  13. Development of system of computer codes for severe accident analysis and its applications

    Energy Technology Data Exchange (ETDEWEB)

    Jang, H S; Jeon, M H; Cho, N J. and others [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1992-01-15

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. This system of codes is necessary to conduct Individual Plant Examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, and can extract the plant-specific vulnerabilities for severe accidents and, at the same time, ideas for enhancing overall accident resistance. Severe accidents can be mitigated by proper accident management strategies. Some operator actions intended for mitigation can lead to more disastrous results, and thus uncertain severe accident phenomena must be well recognized. Further research is needed to develop severe accident management strategies utilizing existing plant resources as well as new design concepts.

  15. Preliminary application of computer-assisted patient-specific acetabular navigational template for total hip arthroplasty in adult single development dysplasia of the hip.

    Science.gov (United States)

    Zhang, Yuan Z; Chen, Bin; Lu, Sheng; Yang, Yong; Zhao, Jian M; Liu, Rui; Li, Yan B; Pei, Guo X

    2011-12-01

    The considerable variation in the anatomical abnormalities of hip joints associated with different types of developmental dysplasia of the hip (DDH) makes reconstruction in total hip arthroplasty (THA) difficult, and it is desirable to create patient-specific designs for THA procedures. For adult single DDH, an accuracy-improved method has been developed for acetabular cup prosthesis implantation in hip arthroplasty. From October 2007 to November 2008, 22 patients with single DDH (all dysplastic hips classified as type I according to the Crowe standard) were scanned with spiral CT pre-operatively. These patients, scheduled for THA, were randomly assigned to undergo either conventional THA (control group, n = 11) or navigation template implantation (NT group, n = 11). In the NT group, three-dimensional (3D) CT pelvis image data were transferred to a computer workstation and 3D models of the hip were reconstructed using the Mimics software. The 3D models were then processed by the Imageware software. In brief, a template that best fitted the location and shape of the acetabular cup was 'reversely' built from the 3D model, the rotation centre of the pathological hip was determined by mirroring that of the healthy side, and a guiding hole in the template was designed. The navigational templates were manufactured using a rapid prototyping machine and used to guide acetabular component placement. Relative to the predetermined abduction angle of 45° and anteversion angle of 18°, after 1 year of follow-up the NT group showed significantly smaller deviations (1.6° ± 0.4°, 1.9° ± 1.1°) than the control group (5.8° ± 2.9°, 3.9° ± 2.5°) (P < 0.05). The template designs facilitated accurate placement of acetabular components in dysplasia of the acetabulum. The hip's centre of rotation in DDH could be established using computer-aided design, which provides a useful method for the accurate

  16. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for developing asynchronous and concurrent programs compositionally, which frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and develop compositionally. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for controlling and operating can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have three terminals and four ordered rules at most, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can briefly express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it most applicable to bit-level computation compositionally, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our research may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.

  17. Use of the computer program in a cloud computing

    Directory of Open Access Journals (Sweden)

    Radovanović Sanja

    2013-01-01

    Cloud computing represents a specific form of networking in which a computer program simulates the operation of one or more server computers. In terms of copyright, all technological processes that take place within cloud computing are covered by the notion of copying of computer programs and by the exclusive right of reproduction. However, this right suffers some limitations in order to allow normal use of the computer program by users. Given that cloud computing is a virtualized network, the issue of normal use of a computer program requires putting all aspects of permitted copying into the context of a specific computing environment and the specific processes within the cloud. In this sense, the paper points out that the user of a computer program in cloud computing needs to obtain the consent of the right holder for any act undertaken while using the program. In other words, copyright in cloud computing applies at full scale, and thus so does freedom of contract (in the case of this particular restriction as well).

  18. Neural Network Optimization of Ligament Stiffnesses for the Enhanced Predictive Ability of a Patient-Specific, Computational Foot/Ankle Model.

    Science.gov (United States)

    Chande, Ruchi D; Wayne, Jennifer S

    2017-09-01

    Computational models of diarthrodial joints serve to inform the biomechanical function of these structures, and as such, must be supplied appropriate inputs for performance that is representative of actual joint function. Inputs for these models are sourced from both imaging modalities as well as literature. The latter is often the source of mechanical properties for soft tissues, like ligament stiffnesses; however, such data are not always available for all the soft tissues nor is it known for patient-specific work. In the current research, a method to improve the ligament stiffness definition for a computational foot/ankle model was sought with the greater goal of improving the predictive ability of the computational model. Specifically, the stiffness values were optimized using artificial neural networks (ANNs); both feedforward and radial basis function networks (RBFNs) were considered. Optimal networks of each type were determined and subsequently used to predict stiffnesses for the foot/ankle model. Ultimately, the predicted stiffnesses were considered reasonable and resulted in enhanced performance of the computational model, suggesting that artificial neural networks can be used to optimize stiffness inputs.
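
    The abstract describes the networks only at a high level; the following is a generic sketch of the feedforward variant, mapping model kinematics to ligament stiffnesses, with wholly synthetic placeholder data standing in for actual model runs:

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(42)
    # Hypothetical training set: each row holds kinematic outputs of the
    # foot/ankle model (e.g., joint angles under load), generated by running
    # the model with known ligament stiffnesses (the regression targets).
    kinematics = rng.normal(size=(500, 12))            # 500 simulated runs x 12 measures
    stiffnesses = rng.uniform(10, 400, size=(500, 8))  # N/mm for 8 ligaments (placeholder)

    X_tr, X_te, y_tr, y_te = train_test_split(kinematics, stiffnesses, random_state=0)
    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    net.fit(X_tr, y_tr)
    print("held-out R^2:", net.score(X_te, y_te))

    # Invert: feed the patient's measured kinematics to predict stiffness inputs.
    patient_kinematics = rng.normal(size=(1, 12))
    print(net.predict(patient_kinematics))
    ```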

  19. Recent Developments in Computed Tomography

    International Nuclear Information System (INIS)

    Braunstein, D.; Dafni, E.; Levene, S.; Malamud, G.; Shapiro, O.; Shechter, G.; Zahavi, O.

    1999-01-01

    Computerized tomography (CT) has become, during the past few years, one of the most widely used modalities in X-ray diagnosis. Its clinical applications have penetrated various fields, such as interventional guidance, cardiac imaging, and computer-aided surgery. The first second-generation CT scanners consisted of a rotate-rotate system with a detector array and an X-ray tube. These scanners acquired individual single slices, the duration of each being several seconds. The slow scanning rate and the limited computer power of the time restricted the application range of these scanners to relatively stable organs and short body coverage at given resolutions. Further drawbacks of these machines were weak X-ray sources and low-efficiency gas detectors. In the late 1980s the first helical scanners were introduced by Siemens. Based on continuous patient-couch movement during gantry rotation, much faster scans could be obtained, significantly increasing the volume covered in a given time. In 1992 the first dual-slice scanners, equipped with high-efficiency solid-state detectors, were introduced by Elscint. Acquiring data simultaneously from two detector arrays doubled the efficiency of the scan. Faster computers and stronger X-ray sources further improved performance, allowing a new range of clinical applications. Yet the need for even faster machines and bigger volume coverage led to further R and D efforts by the leading CT manufacturers. To meet the most demanding clinical needs, innovative two-dimensional, four-row solid-state detector arrays were developed, together with faster-rotating gantries and bigger X-ray tubes, all demanding extremely accurate and robust mechanical construction. In parallel, custom multi-processor computers were built to allow on-line reconstruction of the growing amounts of raw data. Four-slice helical scanners, rotating at 0.5 s per cycle, are being tested nowadays in several clinics all over the world. This talk

  20. Process-Based Development of Competence Models to Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  1. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexity into simple formulations, thus largely reducing development effort. The book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  2. A computer literacy scale for newly enrolled nursing college students: development and validation.

    Science.gov (United States)

    Lin, Tung-Cheng

    2011-12-01

    Increasing application and use of information systems and mobile technologies in the healthcare industry require increasing nurse competency in computer use. Computer literacy is defined as basic computer skills, whereas computer competency is defined as the computer skills necessary to accomplish job tasks. Inadequate attention has been paid to the validity of computer literacy and computer competency scales. This study developed a computer literacy scale with good reliability and validity and investigated the current computer literacy of newly enrolled students in order to develop computer courses appropriate to students' skill levels and needs. This study referenced Hinkin's process to develop a computer literacy scale. Participants were newly enrolled first-year undergraduate students, with nursing or nursing-related backgrounds, currently attending a course entitled Information Literacy and Internet Applications. Researchers examined reliability and validity using confirmatory factor analysis. The final version of the developed computer literacy scale included six constructs (software, hardware, multimedia, networks, information ethics, and information security) and 22 measurement items. Confirmatory factor analysis showed that the scale possessed good content validity, reliability, convergent validity, and discriminant validity. This study also found that participants earned the highest scores for the network domain and the lowest scores for the hardware domain. With increasing use of information technology applications, courses related to hardware topics should be increased to improve nurse problem-solving abilities. This study recommends that emphases on word processing and network-related topics may be reduced in favor of an increased emphasis on database, statistical software, hospital information systems, and information ethics.
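    The reliability side of scale validation is commonly summarized with Cronbach's alpha. The sketch below computes it for a 22-item response matrix; the data are random placeholders (rows = students, columns = items), not the study's responses.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(total score))
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(120, 22))   # hypothetical 5-point responses
print(f"alpha = {cronbach_alpha(scores):.3f}")
```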

  3. Development of Graphical Solution for Computer-Assisted Fault Diagnosis: Preliminary Study

    International Nuclear Information System (INIS)

    Yoon, Han Bean; Yun, Seung Man; Han, Jong Chul

    2009-01-01

    We have developed software for converting volumetric voxel data obtained from X-ray computed tomography (CT) into computer-aided design (CAD) data. The developed software can be used for non-destructive testing and evaluation, reverse engineering, and rapid prototyping, among other applications. The main algorithms employed in the software are image reconstruction, volume rendering, segmentation, and mesh data generation. The feasibility of the developed software is demonstrated with CT data of human maxilla and mandible bones
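    The voxel-to-mesh step of such a pipeline can be illustrated with the marching cubes algorithm. The sketch below assumes scikit-image is available and uses a synthetic sphere in place of real scanner output.

```python
# Extract a triangle mesh from a voxel volume with marching cubes.
import numpy as np
from skimage import measure

# Synthetic 64^3 "CT volume": a sphere of high-intensity voxels.
z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = (np.sqrt(x**2 + y**2 + z**2) < 0.6).astype(np.float32)

# Segment at an iso-level and extract vertices and triangular faces.
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
print(f"{len(verts)} vertices, {len(faces)} triangular faces")
```

    The resulting vertex and face arrays are the raw material for the CAD export and rapid-prototyping steps mentioned above.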

  4. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D has been conducted first to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable the synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer-Aided Science (CASC) to promote Atomic Energy Research and Investigation (AERI). This article reviews the achievements obtained so far in the R and D of grid computing technology. (T. Tanaka)

  5. Formal specifications for safety grade systems

    International Nuclear Information System (INIS)

    Chisholm, G.H.; Smith, B.T.; Wojcik, A.S.

    1992-01-01

    The authors describe the findings of a study into the application of formal methods to the specification of a safety system for an operating nuclear reactor. They developed a formal specification that is used to verify and validate that no unsafe condition will result from action or inaction of the system. For this reason, the specification must facilitate thinking about, talking about, and implementing the system. In fact, the specification must provide a bridge between people (designers, engineers, policy makers) and diverse implementations (hardware, software, sensors, power supplies) at all levels. For a specification to serve as an effective linkage, it must have the following properties: (1) completeness, (2) conciseness, (3) unambiguity, and (4) communicativeness. In this paper they describe the development of a specification that has these properties. This development is based on the use of formal methods, i.e., methods that add mathematical rigor to the development, analysis and operation of computer systems and to applications based thereon (Neumann). They demonstrate that a specification derived from a formal basis facilitates development of the design and its subsequent verification
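    Purely as an illustration of the idea, and not the paper's specification language, a safety requirement can be made precise and machine-checkable by stating it as a predicate over system states and checking it exhaustively on a small finite model. The reactor-trip toy model below is entirely hypothetical.

```python
# Exhaustive check of a safety property on a tiny finite-state model.
from itertools import product

def next_trip(pressure_high: bool, trip: bool) -> bool:
    """Transition rule of the toy model: the trip signal latches once set."""
    return trip or pressure_high

def safe(pressure_high: bool, trip: bool) -> bool:
    """Safety property: high pressure implies the trip signal is set."""
    return (not pressure_high) or trip

# Check every reachable combination of state and input.
for trip, pressure_high in product([False, True], repeat=2):
    assert safe(pressure_high, next_trip(pressure_high, trip)), (trip, pressure_high)
print("safety property holds on the toy model")
```

    Real formal-methods tools replace the brute-force loop with theorem proving or model checking, but the shape of the argument, an invariant preserved by every transition, is the same.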

  6. The role of computers in developing countries with reference to East Africa

    International Nuclear Information System (INIS)

    Shayo, L.K.

    1984-01-01

    The role of computers in economic and technological development is examined with particular reference to developing countries. It is stressed that these countries must exploit the potential of computers in their effort to catch up in the development race. The shortage of qualified EDP personnel is singled out as one of the most critical factors behind any unsatisfactory state of computer applications. A computerization policy based on the demands for information created by the sophistication of the development process, and supported by a sufficient core of qualified local manpower, is recommended. The situation in East Africa is discussed and recommendations for training and for the production of telematics equipment are made. (author)

  7. Computational model design specification for Phase 1 of the Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.

    1991-07-01

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions from nuclear operations at Hanford since their inception in 1944. The purpose of this report is to outline the basic algorithm and the computer calculations needed to calculate radiation doses to specific and hypothetical individuals in the vicinity of Hanford. The system design requirements, those things that must be accomplished, are defined. The system design specifications, the techniques by which those requirements are met, are outlined. Included are the basic equations, logic diagrams, and a preliminary definition of the nature of each input distribution. 4 refs., 10 figs., 9 tabs.
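    The kind of calculation the report outlines, propagating input distributions through a dose equation, can be sketched with simple Monte Carlo sampling. The dose model below (dose = time-integrated air concentration × breathing rate × dose factor) and all distribution parameters are illustrative placeholders, not HEDR values.

```python
# Monte Carlo propagation of uncertain inputs through a simple dose equation.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
concentration = rng.lognormal(mean=np.log(1e-9), sigma=0.5, size=n)  # Ci*s/m^3
breathing_rate = rng.normal(2.3e-4, 2e-5, size=n)                    # m^3/s
dose_factor = rng.lognormal(mean=np.log(5e4), sigma=0.3, size=n)     # rem/Ci

dose = concentration * breathing_rate * dose_factor                  # rem
print(f"median dose: {np.median(dose):.2e} rem, "
      f"95th percentile: {np.percentile(dose, 95):.2e} rem")
```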

  8. Computational model design specification for Phase 1 of the Hanford Environmental Dose Reconstruction Project

    International Nuclear Information System (INIS)

    Napier, B.A.

    1991-07-01

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions from nuclear operations at Hanford since their inception in 1944. The purpose of this report is to outline the basic algorithm and the computer calculations needed to calculate radiation doses to specific and hypothetical individuals in the vicinity of Hanford. The system design requirements, those things that must be accomplished, are defined. The system design specifications, the techniques by which those requirements are met, are outlined. Included are the basic equations, logic diagrams, and a preliminary definition of the nature of each input distribution. 4 refs., 10 figs., 9 tabs

  9. Development and validation of rear impact computer simulation model of an adult manual transit wheelchair with a seated occupant.

    Science.gov (United States)

    Salipur, Zdravko; Bertocci, Gina

    2010-01-01

    It has been shown that ANSI WC19 transit wheelchairs that are crashworthy in frontal impact exhibit catastrophic failures in rear impact and may not be able to provide stable seating support and thus occupant protection for the wheelchair occupant. Thus far only limited sled test and computer simulation data have been available to study rear impact wheelchair safety. Computer modeling can be used as an economic and comprehensive tool to gain critical knowledge regarding wheelchair integrity and occupant safety. This study describes the development and validation of a computer model simulating an adult wheelchair-seated occupant subjected to a rear impact event. The model was developed in MADYMO and validated rigorously using the results of three similar sled tests conducted to specifications provided in the draft ISO/TC 173 standard. Outcomes from the model can provide critical wheelchair loading information to wheelchair and tiedown manufacturers, resulting in safer wheelchair designs for rear impact conditions. (c) 2009 IPEM. Published by Elsevier Ltd. All rights reserved.

  10. SYSTEM AND ACTIVITY APPROACH TO THE PROBLEMS SOLUTION OF COMPUTER COMPETENCE DEVELOPMENT OF FUTURE TEACHERS OF VOCATIONAL EDUCATION

    Directory of Open Access Journals (Sweden)

    Yelena E. Neupokoeva

    2016-01-01

    The article presents a scientific specification of the system and activity approach as applied to an educational process aimed at developing computer competence. The research addresses the task of increasing the efficiency of information technology training for future teachers of vocational education in the areas of economics and management. The authors focus on engaging the trainees' creative potential as fully as possible through the application of a model of guided creative process. Results. The research was conducted at the Russian State Vocational Pedagogical University; the main research methods were observation, questionnaires and interviews. Based on the results, refinements were made to the technology of organizing the educational process and to its approaches and techniques; didactic materials and manuals were developed that improved the outcomes of the educational process, whose main objective is to raise the quality of computer competence formation. Scientific novelty. Although individual elements of this research have already appeared in the scientific literature, no studies have yet been conducted in such a combination, in relation to this technique of organizing the educational process, and with the use of user computer hermeneutics (a term used in this sense for the first time). Practical significance. The practical importance of the work stems from the problem of developing the computer competence of future vocational education teachers, which plays a key role in the development of continuous education under conditions of general computerization and the large-scale growth of the global network.

  11. Ethics in computer software design and development

    Science.gov (United States)

    Alan J. Thomson; Daniel L. Schmoldt

    2001-01-01

    Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been afforded human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...

  12. Development of a theory-guided pan-European computer-assisted safer sex intervention.

    Science.gov (United States)

    Nöstlinger, Christiana; Borms, Ruth; Dec-Pietrowska, Joanna; Dias, Sonia; Rojas, Daniela; Platteau, Tom; Vanden Berghe, Wim; Kok, Gerjo

    2016-12-01

    HIV is a growing public health problem in Europe, with men-having-sex-with-men and migrants from endemic regions as the most affected key populations. More evidence on effective behavioral interventions to reduce sexual risk is needed. This article describes the systematic development of a theory-guided computer-assisted safer sex intervention, aiming at supporting people living with HIV in sexual risk reduction. We applied the Intervention Mapping (IM) protocol to develop this counseling intervention in the framework of a European multicenter study. We conducted a needs assessment guided by the information-motivation-behavioral (IMB) skills model, formulated change objectives and selected theory-based methods and practical strategies, i.e. interactive computer-assisted modules as supporting tools for provider-delivered counseling. Theoretical foundations were the IMB skills model, social cognitive theory and the transtheoretical model, complemented by dual process models of affective decision making to account for the specifics of sexual behavior. The counseling approach for delivering three individual sessions was tailored to participants' needs and contexts, adopting elements of motivational interviewing and cognitive-behavioral therapy. We implemented and evaluated the intervention using a randomized controlled trial combined with a process evaluation. IM provided a useful framework for developing a coherent intervention for heterogeneous target groups, which was feasible and effective across the culturally diverse settings. This article responds to the need for transparent descriptions of the development and content of evidence-based behavior change interventions as potential pillars of effective combination prevention strategies. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  13. Computer Security at Nuclear Facilities

    International Nuclear Information System (INIS)

    Cavina, A.

    2013-01-01

    This series of slides presents the IAEA policy concerning the development of recommendations and guidelines for computer security at nuclear facilities. A document of the Nuclear Security Series dedicated to this issue is in the final stage prior to publication. It is the first IAEA document specifically addressing computer security. Such a document was necessary for three main reasons: first, not all national infrastructures have recognized and standardized computer security; second, existing international guidance is not industry-specific and fails to capture some of the key issues; and third, the number of more or less connected digital systems in the design of nuclear power plants is increasing. The security of computer systems must be based on a graded approach: the assignment of computer systems to different levels and zones should be based on their relevance to safety and security, and the risk assessment process should be allowed to feed back into and influence the graded approach

  14. Laboratory Works Designed for Developing Student Motivation in Computer Architecture

    Directory of Open Access Journals (Sweden)

    Petre Ogrutan

    2017-02-01

    In light of the current difficulties related to maintaining students' interest and stimulating their motivation for learning, the authors have developed a range of new laboratory exercises intended for first-year students in Computer Science as well as for engineering students who have completed at least one course in computers. The educational goal of the proposed laboratory exercises is to enhance the students' motivation and creative thinking by organizing a relaxed yet competitive learning environment. The authors have developed a device including LEDs and switches, which is connected to a computer. Using assembly language, commands can be issued to flash several LEDs and to read the states of the switches. The effectiveness of this idea was confirmed by a statistical study.

  15. An Educational Approach to Computationally Modeling Dynamical Systems

    Science.gov (United States)

    Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl

    2009-01-01

    Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…

  16. Concept of development of integrated computer - based control system for 'Ukryttia' object

    International Nuclear Information System (INIS)

    Buyal'skij, V.M.; Maslov, V.P.

    2003-01-01

    The structural concept for developing an integrated computer-based control system for the Chernobyl NPP 'Ukryttia' Object is presented, based on the general concept of the integrated Computer-based Control System (CCS) design process for organizational and technical management subjects. The concept is aimed at applying state-of-the-art architectural design techniques and allows modern computer-aided facilities to be used for developing the functional model, the information (logical and physical) models, and the object model of the system under design

  17. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  18. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.
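    The many-to-many relay idea described above can be sketched, under assumptions and without any claim to the RealityGrid protocol, as a tiny TCP server that accepts connections from both steering clients and steered applications and rebroadcasts each parameter-update line to every other peer. The message format shown is invented for illustration.

```python
# Minimal many-to-many steering relay: each received line is forwarded to all
# other connected peers (steered applications and steering clients alike).
import socket
import threading

HOST, PORT = "0.0.0.0", 9999
peers, lock = [], threading.Lock()

def handle(conn: socket.socket) -> None:
    with lock:
        peers.append(conn)
    try:
        for line in conn.makefile("r"):          # e.g. "set timestep 0.01\n"
            with lock:
                for other in peers:
                    if other is not conn:
                        other.sendall(line.encode())
    finally:
        with lock:
            peers.remove(conn)
        conn.close()

server = socket.create_server((HOST, PORT))
while True:                                      # accept peers until killed
    client, _ = server.accept()
    threading.Thread(target=handle, args=(client,), daemon=True).start()
```

    On Blue Gene-style systems, where compute nodes sit on private networks, such a relay would run on an externally reachable login or service node, which is precisely the role of the relay server described in the thesis.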

  19. Computer-Aided Template for Model Reuse, Development and Maintenance

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2014-01-01

    A template-based approach for model development is presented in this work. Based on a model decomposition technique, the computer-aided template concept has been developed. This concept is implemented as a software tool, which provides a user-friendly interface for following the workflow steps...

  20. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided...... methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task....... The methodology has been implemented into a computer-aided modeling framework, which combines expert skills, tools, and database connections that are required for the different steps of the model development work-flow with the goal to increase the efficiency of the modeling process. The framework has two main...

  1. Fundamental challenging problems for developing new nuclear safety standard computer codes

    International Nuclear Information System (INIS)

    Wong, P.K.; Wong, A.E.; Wong, A.

    2005-01-01

    Based on the claims of US patents 5,084,232, 5,848,377 and 6,430,516 (retrievable by entering the patent numbers at the Web site http://164.195.100.11/netahtml/srchnum.htm) and the associated technical papers presented and published at international conferences over the last three years, all of which were sent to the US-NRC by e-mail on March 26, 2003 at 2:46 PM, three fundamental challenging problems for developing new nuclear safety standard computer codes were presented at the US-NRC RIC2003 Session W4, 2:15-3:15 PM, at the Washington D.C. Capital Hilton Hotel, Presidential Ballroom, on April 16, 2003, before more than 800 nuclear professionals from many countries worldwide. The objective and scope of this paper is to invite all nuclear professionals to examine and evaluate all the computer codes currently being used in their own countries by comparing numerical data for these three specific, openly challenging fundamental problems, in order to establish a global safety standard for all nuclear power plants in the world. (authors)

  2. Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.

    Science.gov (United States)

    Handels, H; Ehrhardt, J

    2009-01-01

    Medical image computing has become one of the most challenging fields in medical informatics. In the image-based diagnostics of the future, software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters that characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery, medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the degree of automation, accuracy, reproducibility and robustness, and the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods from different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer-assisted image interpretation, modeling and simulation, and visualization and virtual reality. In particular, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients and will gain importance in the diagnostics and therapy of the future. From a methodological point of view the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals, and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or
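    One of the quantification steps named above, segmenting a structure of interest and measuring its volume, can be sketched in a few lines; real pipelines add registration, model-based priors and far more robust segmentation. The image below is synthetic and the voxel spacing is a hypothetical placeholder.

```python
# Threshold segmentation plus connected-component volume quantification.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
image = rng.normal(100.0, 10.0, size=(64, 64, 64))   # synthetic 3D image
image[20:40, 20:40, 20:40] += 80.0                   # synthetic "lesion"

mask = image > 150.0                                 # crude intensity threshold
labels, n = ndimage.label(mask)                      # connected components
sizes = ndimage.sum(mask, labels, index=list(range(1, n + 1)))
voxel_volume_mm3 = 1.0                               # hypothetical 1 mm voxels
print(f"{n} structure(s); largest = {sizes.max() * voxel_volume_mm3:.0f} mm^3")
```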

  3. The contribution of high-performance computing and modelling for industrial development

    CSIR Research Space (South Africa)

    Sithole, Happy

    2017-10-01

    Presented by Dr Happy Sithole and Dr Onno Ubbink. High-performance computing (HPC) combined with machine learning and artificial intelligence presents opportunities to non...

  4. [Development of domain specific search engines].

    Science.gov (United States)

    Takai, T; Tokunaga, M; Maeda, K; Kaminuma, T

    2000-01-01

    As cyberspace expands at a pace nobody ever imagined, it becomes very important to search it efficiently and effectively. One solution to this problem is search engines. A number of commercial search engines are already on the market; however, these engines return results so cumbersome that domain-specific experts cannot tolerate them. Using dedicated hardware and a commercial software package called OpenText, we have developed several domain-specific search engines. These engines cover our institute's Web contents, drugs, chemical safety, endocrine disruptors, and emergency response for chemical hazards. They have been placed on our Web site for testing.
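    The core data structure behind any such engine, commercial OpenText included, is an inverted index mapping terms to the documents that contain them. The toy sketch below shows the idea with invented documents; it is not the engines described above.

```python
# A toy inverted index with AND-semantics search.
from collections import defaultdict

docs = {
    1: "endocrine disruptors and chemical safety",
    2: "emergency response for chemical hazards",
    3: "drug interaction database",
}

index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query: str) -> set:
    """Return IDs of documents containing every query term."""
    postings = [index.get(t, set()) for t in query.lower().split()]
    return set.intersection(*postings) if postings else set()

print(search("chemical safety"))   # -> {1}
```

    A domain-specific engine differs mainly in what it indexes (a curated corpus) and how it normalizes terms (domain vocabularies and synonyms), not in this basic structure.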

  5. Design and development of a connection of a magnetic drum with two computers

    International Nuclear Information System (INIS)

    Malriq, Jean.

    1976-01-01

    For the high-energy physics experiment 'Hyperons SPS', carried out at the CERN 300 GeV accelerator, the connection between a magnetic drum and two computers is studied. One of the two computers is a NORD-10; the other may be any computer with a CAMAC interface. The input/output structure of the NORD-10 was studied first; then the NORD-drum interface and the CAMAC-drum interface were built. The electronic switching that allows each computer to be connected to the drum through its specific interface was also constructed [fr]

  6. Development of the Tensoral Computer Language

    Science.gov (United States)

    Ferziger, Joel; Dresselhaus, Eliot

    1996-01-01

    The research scientist or engineer wishing to perform large scale simulations or to extract useful information from existing databases is required to have expertise in the details of the particular database, the numerical methods and the computer architecture to be used. This poses a significant practical barrier to the use of simulation data. The goal of this research was to develop a high-level computer language called Tensoral, designed to remove this barrier. The Tensoral language provides a framework in which efficient generic data manipulations can be easily coded and implemented. First of all, Tensoral is general. The fundamental objects in Tensoral represent tensor fields and the operators that act on them. The numerical implementation of these tensors and operators is completely and flexibly programmable. New mathematical constructs and operators can be easily added to the Tensoral system. Tensoral is compatible with existing languages. Tensoral tensor operations co-exist in a natural way with a host language, which may be any sufficiently powerful computer language such as Fortran, C, or Vectoral. Tensoral is very-high-level. Tensor operations in Tensoral typically act on entire databases (i.e., arrays) at one time and may, therefore, correspond to many lines of code in a conventional language. Tensoral is efficient. Tensoral is a compiled language. Database manipulations are simplified, optimized, and scheduled by the compiler, eventually resulting in efficient machine code to implement them.
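    The key design idea, whole-field operations represented as objects that a backend can later optimize and execute, can be sketched in miniature. The fragment below is an illustration of that idea only, not Tensoral itself; the "compiler" is a trivial recursive evaluator over numpy arrays.

```python
# A toy lazy expression DSL for whole-field tensor operations.
import numpy as np

class Field:
    def __init__(self, op, *args):
        self.op, self.args = op, args
    def __add__(self, other):
        return Field("add", self, other)
    def __mul__(self, other):
        return Field("mul", self, other)

class Data(Field):
    def __init__(self, array):
        self.op, self.array = "data", np.asarray(array)

def evaluate(node: Field) -> np.ndarray:
    """Naive backend; a real compiler would fuse and schedule these ops."""
    if node.op == "data":
        return node.array
    left, right = (evaluate(a) for a in node.args)
    return left + right if node.op == "add" else left * right

u = Data(np.ones((4, 4)))
v = Data(np.full((4, 4), 2.0))
expr = u * v + u                   # builds an expression tree, no work yet
print(evaluate(expr))              # executes the whole-field operation
```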

  7. Left fronto-temporal dynamics during agreement processing: evidence for feature-specific computations.

    Science.gov (United States)

    Molinaro, Nicola; Barber, Horacio A; Pérez, Alejandro; Parkkonen, Lauri; Carreiras, Manuel

    2013-09-01

    Grammatical agreement is a widespread language phenomenon that indicates formal syntactic relations between words; however, it also conveys basic lexical (e.g. grammatical gender) or semantic (e.g. numerosity) information about a discourse referent. In this study, we focus on the reading of Spanish noun phrases, violating either number or gender determiner-noun agreement compared to grammatical controls. Magnetoencephalographic activity time-locked to the onset of the noun in both types of violation revealed a left-lateralized brain network involving anterior temporal regions (~220 ms) and, later in time, ventro-lateral prefrontal regions (>300 ms). These activations coexist with dependency-specific effects: in an initial step (~170 ms), occipito-temporal regions are employed for fine-grained analysis of the number marking (in Spanish, presence or absence of the suffix '-s'), while anterior temporal regions show increased activation for gender mismatches compared to grammatical controls. The semantic relevance of number agreement dependencies was mainly reflected by left superior temporal increased activity around 340 ms. These findings offer a detailed perspective on the multi-level analyses involved in the initial computation of agreement dependencies, and theoretically support a derivational approach to agreement computation. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  9. Bioconductor: open software development for computational biology and bioinformatics

    DEFF Research Database (Denmark)

    Gentleman, R.C.; Carey, V.J.; Bates, D.M.

    2004-01-01

    The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. The goals of the project include: fostering collaborative development and widespread use of innovative software, reducing barriers to entry into interdisciplinary scientific research, and promoting the achievement of remote reproducibility of research results. We describe details of our aims and methods, identify current challenges, compare Bioconductor to other open bioinformatics projects, and provide working examples.

  10. The impact of home computer use on children's activities and development.

    Science.gov (United States)

    Subrahmanyam, K; Kraut, R E; Greenfield, P M; Gross, E F

    2000-01-01

    The increasing amount of time children are spending on computers at home and school has raised questions about how the use of computer technology may make a difference in their lives--from helping with homework to causing depression to encouraging violent behavior. This article provides an overview of the limited research on the effects of home computer use on children's physical, cognitive, and social development. Initial research suggests, for example, that access to computers increases the total amount of time children spend in front of a television or computer screen at the expense of other activities, thereby putting them at risk for obesity. At the same time, cognitive research suggests that playing computer games can be an important building block to computer literacy because it enhances children's ability to read and visualize images in three-dimensional space and track multiple images simultaneously. The limited evidence available also indicates that home computer use is linked to slightly better academic performance. The research findings are more mixed, however, regarding the effects on children's social development. Although little evidence indicates that the moderate use of computers to play games has a negative impact on children's friendships and family relationships, recent survey data show that increased use of the Internet may be linked to increases in loneliness and depression. Of most concern are the findings that playing violent computer games may increase aggressiveness and desensitize a child to suffering, and that the use of computers may blur a child's ability to distinguish real life from simulation. The authors conclude that more systematic research is needed in these areas to help parents and policymakers maximize the positive effects and to minimize the negative effects of home computers in children's lives.

  11. The development of AR book for computer learning

    Science.gov (United States)

    Phadung, Muneeroh; Wani, Najela; Tongmnee, Nur-aiynee

    2017-08-01

    Educators need to provide alternative educational tools to foster students' learning outcomes. This paper presents how augmented reality (AR) can be applied in education by using AR technology to create engaging edutainment experiences. This study aims to develop an AR book for tenth-grade students (age 15-16) and evaluate its quality. The AR book was developed following the ADDIE framework to provide computer learning on computer software knowledge. The content accorded with the current Thai education curriculum. The AR book had 10 pages covering three topics (the first was "Introduction," the second was "System Software" and the third was "Application Software"). Each page contained markers that placed virtual objects (2D animations and video clips). The obtained data were analyzed in terms of mean and standard deviation. The validity of the multimedia design of the AR book was assessed by three experts in multimedia design using a five-point Likert scale; the values were x̄ = 4.84, S.D. = 1.27, which corresponds to 'very high'. Moreover, three content experts who specialize in computer teaching evaluated the AR book's content validity; the values determined by the experts were x̄ = 4.69, S.D. = 0.29, which also corresponds to 'very high'. Implications for future study and education are discussed.
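    The paper does not state which tracking library its markers use, but the marker-detection step of an AR book can be illustrated with OpenCV's ArUco module (assuming opencv-contrib-python, version 4.7 or later for the detector API shown). The detected marker ID would then select which animation or video clip to overlay on the page; the file name is a hypothetical placeholder.

```python
# Detect ArUco markers in a camera frame of an AR book page.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary)

frame = cv2.imread("page.jpg")                      # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

corners, ids, rejected = detector.detectMarkers(gray)
if ids is not None:
    print("markers on this page:", ids.ravel())     # e.g. marker 3 -> clip 3
```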

  12. Development of Specifications for Radioactive Waste Packages

    International Nuclear Information System (INIS)

    2006-10-01

    The main objective of this publication is to provide guidelines for the development of waste package specifications that comply with waste acceptance requirements for storage and disposal of radioactive waste. It will assist waste generators and waste package producers in selecting the most significant parameters and in developing and implementing specifications for each individual type of waste and waste package. This publication also identifies and reviews the activities and technical provisions that are necessary to meet safety requirements; in particular, selection of the significant safety parameters and preparation of specifications for waste forms, waste containers and waste packages using proven approaches, methods and technologies. This report provides guidance using a systematic, stepwise approach, integrating the technical, organizational and administrative factors that need to be considered at each step of planning and implementing waste package design, fabrication, approval, quality assurance and control. The report reflects the considerable experience and knowledge that has been accumulated in the IAEA Member States and is consistent with the current international requirements, principles, standards and guidance for the safe management of radioactive waste

  13. Development of Specifications for Radioactive Waste Packages

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-10-15

    The main objective of this publication is to provide guidelines for the development of waste package specifications that comply with waste acceptance requirements for storage and disposal of radioactive waste. It will assist waste generators and waste package producers in selecting the most significant parameters and in developing and implementing specifications for each individual type of waste and waste package. This publication also identifies and reviews the activities and technical provisions that are necessary to meet safety requirements; in particular, selection of the significant safety parameters and preparation of specifications for waste forms, waste containers and waste packages using proven approaches, methods and technologies. This report provides guidance using a systematic, stepwise approach, integrating the technical, organizational and administrative factors that need to be considered at each step of planning and implementing waste package design, fabrication, approval, quality assurance and control. The report reflects the considerable experience and knowledge that has been accumulated in the IAEA Member States and is consistent with the current international requirements, principles, standards and guidance for the safe management of radioactive waste.

  14. Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans

    Science.gov (United States)

    2017-07-13

    Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans, by Autumn R Kulaga, Kathryn L Loftis, and Eric Murray; Army Research Laboratory. Approved for public release; distribution is...

  15. Species-Specific Mechanisms of Neuron Subtype Specification Reveal Evolutionary Plasticity of Amniote Brain Development

    Directory of Open Access Journals (Sweden)

    Tadashi Nomura

    2018-03-01

    Summary: Highly ordered brain architectures in vertebrates consist of multiple neuron subtypes with specific neuronal connections. However, the origin of and evolutionary changes in neuron specification mechanisms remain unclear. Here, we report that regulatory mechanisms of neuron subtype specification are divergent in developing amniote brains. In the mammalian neocortex, the transcription factors (TFs) Ctip2 and Satb2 are differentially expressed in layer-specific neurons. In contrast, these TFs are co-localized in reptilian and avian dorsal pallial neurons. Multi-potential progenitors that produce distinct neuronal subtypes commonly exist in the reptilian and avian dorsal pallium, whereas a cis-regulatory element of avian Ctip2 exhibits attenuated transcription suppressive activity. Furthermore, the neuronal subtypes distinguished by these TFs are not tightly associated with conserved neuronal connections among amniotes. Our findings reveal the evolutionary plasticity of regulatory gene functions that contribute to species differences in neuronal heterogeneity and connectivity in developing amniote brains. Neuronal heterogeneity is essential for assembling intricate neuronal circuits. Nomura et al. find that species-specific transcriptional mechanisms underlie diversities of excitatory neuron subtypes in mammalian and non-mammalian brains. Species differences in neuronal subtypes and connections suggest functional plasticity of regulatory genes for neuronal specification during amniote brain evolution. Keywords: Ctip2, Satb2, multi-potential progenitors, transcriptional regulation, neuronal connectivity

  16. Development of a patient-specific anatomical foot model from structured light scan data.

    Science.gov (United States)

    Lochner, Samuel J; Huissoon, Jan P; Bedi, Sanjeev S

    2014-01-01

    The use of anatomically accurate finite element (FE) models of the human foot in research studies has increased rapidly in recent years. Uses for FE foot models include advancing knowledge of orthotic design, shoe design, ankle-foot orthoses, pathomechanics, locomotion, plantar pressure, tissue mechanics, plantar fasciitis, joint stress and surgical interventions. Similar applications but for clinical use on a per-patient basis would also be on the rise if it were not for the high costs associated with developing patient-specific anatomical foot models. High costs arise primarily from the expense and challenges of acquiring anatomical data via magnetic resonance imaging (MRI) or computed tomography (CT) and reconstructing the three-dimensional models. The proposed solution morphs detailed anatomy from skin surface geometry and anatomical landmarks of a generic foot model (developed from CT or MRI) to surface geometry and anatomical landmarks acquired from an inexpensive structured light scan of a foot. The method yields a patient-specific anatomical foot model at a fraction of the cost of standard methods. Average error for bone surfaces was 2.53 mm for the six experiments completed. Highest accuracy occurred in the mid-foot and lowest in the forefoot due to the small, irregular bones of the toes. The method must be validated in the intended application to determine if the resulting errors are acceptable.
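    The landmark-alignment idea at the heart of the morphing method can be sketched with Procrustes analysis, which aligns the generic model's anatomical landmarks to the landmarks digitized from the surface scan. This is an illustration under assumptions; the study's actual morphing is more elaborate, and the landmark coordinates below are invented.

```python
# Procrustes alignment of generic-model landmarks to scan landmarks.
import numpy as np
from scipy.spatial import procrustes

generic_landmarks = np.array([[0., 0., 0.], [10., 0., 0.], [5., 8., 0.],
                              [5., 4., 3.], [2., 6., 1.]])
# Toy "scan": the same landmarks, scaled and translated.
scan_landmarks = generic_landmarks * 1.1 + np.array([3., -2., 1.])

aligned_scan, aligned_generic, disparity = procrustes(scan_landmarks,
                                                      generic_landmarks)
print(f"residual disparity after alignment: {disparity:.4f}")
```

    A residual near zero here simply reflects that the toy scan is an exact similarity transform; real feet differ in shape, which is what the subsequent morphing step absorbs.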

  17. Development of the radiosynthesis of high-specific-activity {sup 123}I-NKJ64

    Energy Technology Data Exchange (ETDEWEB)

    Tavares, Adriana Alexandre S., E-mail: a.tavares.1@research.gla.ac.u [Institute of Neuroscience and Psychology, College of Medical, Veterinary and Life Sciences, University of Glasgow, G12 8QQ Glasgow (United Kingdom); Jobson, Nicola K. [WestCHEM, School of Chemistry, The Joseph Black Building, University of Glasgow, G12 8QQ Glasgow (United Kingdom); Dewar, Deborah [Institute of Neuroscience and Psychology, College of Medical, Veterinary and Life Sciences, University of Glasgow, G12 8QQ Glasgow (United Kingdom); Sutherland, Andrew [WestCHEM, School of Chemistry, The Joseph Black Building, University of Glasgow, G12 8QQ Glasgow (United Kingdom); Pimlott, Sally L. [West of Scotland Radionuclide Dispensary, University of Glasgow and North Glasgow University Hospital NHS Trust, G11 6NT Glasgow (United Kingdom)

    2011-05-15

    Introduction: ¹²³I-NKJ64, a reboxetine analogue, is currently under development as a potential novel single photon emission computed tomography radiotracer for imaging the noradrenaline transporter in brain. This study describes the development of the radiosynthesis of ¹²³I-NKJ64, highlighting the advantages and disadvantages, pitfalls and solutions encountered while developing the final radiolabelling methodology. Methods: The synthesis of ¹²³I-NKJ64 was evaluated using an electrophilic iododestannylation method, where a Boc-protected trimethylstannyl precursor was radioiodinated using peracetic acid as an oxidant and deprotection was investigated using either trifluoroacetic acid (TFA) or 2 M hydrochloric acid (HCl). Results: Radioiodination of the Boc-protected trimethylstannyl precursor was achieved with an incorporation yield of 92±6%. Deprotection with 2 M HCl produced ¹²³I-NKJ64 with the highest radiochemical yield of 98.05±1.63%, compared with 83.95±13.24% with TFA. However, the specific activity of the obtained ¹²³I-NKJ64 was lower when measured after using 2 M HCl (0.15±0.23 Ci/µmol) as the deprotecting agent in comparison to TFA (1.76±0.60 Ci/µmol). Further investigation of the 2 M HCl methodology found a by-product, identified as the deprotected proto-destannylated precursor, which co-eluted with ¹²³I-NKJ64 during the high-performance liquid chromatography (HPLC) purification. Conclusions: The radiosynthesis of ¹²³I-NKJ64 was achieved with a good isolated radiochemical yield of 68% and a high specific activity of 1.8 Ci/µmol. TFA was found to be the most suitable deprotecting agent, since 2 M HCl generated a by-product that could not be fully separated from ¹²³I-NKJ64 using the HPLC methodology investigated. This study highlights the importance of HPLC purification and accurate measurement of specific activity while developing new radiosynthesis methodologies.

  18. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  19. Development of a patient-specific two-compartment anthropomorphic breast phantom

    International Nuclear Information System (INIS)

    Prionas, Nicolas D; Burkett, George W; McKenney, Sarah E; Chen, Lin; Boone, John M; Stern, Robin L

    2012-01-01

    The purpose of this paper is to develop a technique for the construction of a two-compartment anthropomorphic breast phantom specific to an individual patient's pendant breast anatomy. Three-dimensional breast images were acquired on a prototype dedicated breast computed tomography (bCT) scanner as part of an ongoing IRB-approved clinical trial of bCT. The images from the breast of a patient were segmented into adipose and glandular tissue regions and divided into 1.59 mm thick breast sections to correspond to the thickness of polyethylene stock. A computer-controlled water-jet cutting machine was used to cut the outer breast edge and the internal regions corresponding to glandular tissue from the polyethylene. The stack of polyethylene breast segments was encased in a thermoplastic ‘skin’ and filled with water. Water-filled spaces modeled glandular tissue structures and the surrounding polyethylene modeled the adipose tissue compartment. Utility of the phantom was demonstrated by inserting 200 µm microcalcifications as well as by measuring point dose deposition during bCT scanning. Affine registration of the original patient images with bCT images of the phantom showed similar tissue distribution. Linear profiles through the registered images demonstrated a mean coefficient of determination (r²) between grayscale profiles of 0.881. The exponent of the power law describing the anatomical noise power spectrum was identical in the coronal images of the patient's breast and the phantom. Microcalcifications were visualized in the phantom at bCT scanning. The real-time air kerma rate was measured during bCT scanning and fluctuated with breast anatomy. On average, point dose deposition was 7.1% greater than the mean glandular dose. A technique to generate a two-compartment anthropomorphic breast phantom from bCT images has been demonstrated. The phantom is the first, to our knowledge, to accurately model the uncompressed pendant breast and the glandular tissue
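    Two of the comparison metrics reported above can be sketched directly: the coefficient of determination between matched grayscale profiles, and the exponent of a power-law fit to the anatomical noise power spectrum. The profiles and spectrum below are synthetic placeholders, not the study's data.

```python
# r^2 between registered profiles, and a log-log power-law fit to a spectrum.
import numpy as np

rng = np.random.default_rng(0)
patient_profile = rng.normal(100, 20, 256)
phantom_profile = patient_profile + rng.normal(0, 7, 256)   # registered phantom

r = np.corrcoef(patient_profile, phantom_profile)[0, 1]
print(f"r^2 between profiles: {r**2:.3f}")

f = np.linspace(0.05, 2.0, 100)                        # spatial frequency
nps = 50.0 * f**-2.8 * rng.lognormal(0, 0.1, f.size)   # synthetic spectrum
beta = np.polyfit(np.log(f), np.log(nps), 1)[0]        # log-log slope
print(f"power-law exponent: {beta:.2f}")
```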

  20. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, we believe that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format, for ease of transfer and to facilitate linking or merging with larger analysis systems. Maximum utility of computer mapping tools is achieved when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results

  1. Principles for the wise use of computers by children.

    Science.gov (United States)

    Straker, L; Pollock, C; Maslen, B

    2009-11-01

    Computer use by children at home and school is now common in many countries. Child computer exposure varies with the type of computer technology available and the child's age, gender and social group. This paper reviews the current exposure data and the evidence for positive and negative effects of computer use by children. Potential positive effects of computer use by children include enhanced cognitive development and school achievement, reduced barriers to social interaction, enhanced fine motor skills and visual processing and effective rehabilitation. Potential negative effects include threats to child safety, inappropriate content, exposure to violence, bullying, Internet 'addiction', displacement of moderate/vigorous physical activity, exposure to junk food advertising, sleep displacement, vision problems and musculoskeletal problems. The case for child specific evidence-based guidelines for wise use of computers is presented based on children using computers differently to adults, being physically, cognitively and socially different to adults, being in a state of change and development and the potential to impact on later adult risk. Progress towards child-specific guidelines is reported. Finally, a set of guideline principles is presented as the basis for more detailed guidelines on the physical, cognitive and social impact of computer use by children. The principles cover computer literacy, technology safety, child safety and privacy and appropriate social, cognitive and physical development. The majority of children in affluent communities now have substantial exposure to computers. This is likely to have significant effects on child physical, cognitive and social development. Ergonomics can provide and promote guidelines for wise use of computers by children and by doing so promote the positive effects and reduce the negative effects of computer-child, and subsequent computer-adult, interaction.

  2. Development of a computer writing system based on EOG

    OpenAIRE

    López, A.; Ferrero, F.; Yangüela, D.; Álvarez, C.; Postolache, O.

    2017-01-01

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical i...

  3. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    OpenAIRE

    Soojin Park; Mansoo Hwang; Sangeun Lee; Young B. Park

    2015-01-01

    Cloud computing has emerged as more than just a piece of technology; it is rather a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are treated not as subjects to own but as subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo si...

  4. Specifications development for "Karbatril" codenamed tablets

    Directory of Open Access Journals (Sweden)

    L. I. Kucherenko

    2017-08-01

    Introduction. According to the current legislation of Ukraine, the specifications of tablets include the following indicators: description, identification, average weight, disintegration and assay. The aim of the study. To develop specifications and a draft of quality control methods for tablets codenamed "Karbatril". Materials and methods. During the study we analyzed six series of "Karbatril" tablets. Appropriate methods and instruments were used for the description, identification, determination of the average weight and disintegration, and quantification of the active ingredients. Results and discussion. The "Karbatril" tablets were analyzed against the following parameters: description - white or nearly white tablets; average weight - across the six series, the average weight ranged from 339.0 mg to 369.9 mg, within the SPU limits of 337.0 mg to 373.0 mg; disintegration - according to the SPU, the disintegration time for uncoated tablets must not exceed 15 min, and the analyzed tablets disintegrated within 5 to 10 minutes; identification and quantification of the active ingredients were conducted using modified HPLC methods, and the chromatograms obtained during identification comply with the SPU. In the quantitative determination, the content of active ingredients was found to be 148.18 mg to 150.19 mg of carbamazepine and 98.93 mg to 99.71 mg of thiotriazoline, consistent with the SPU limits of 150 mg ± 7.5% for carbamazepine and 100 mg ± 10% for thiotriazoline. Conclusions. This study developed a specification for "Karbatril" tablets, as well as HPLC methods for the qualitative and quantitative determination of the active ingredients. The specification includes the following parameters: description, identification, average weight, disintegration and assay. The study drafted quality control methods which are planned to be later offered to the

  5. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two personal computers were successfully networked together to form a small-scale cluster. Each processor involved is a multicore processor with four cores, so the cluster has eight processing cores in total. The cluster runs in an Ubuntu 14.04 LINUX environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem, using a simple MPI Hello program written in the C language. Additionally, a performance test was done to show that the cluster's calculation performance is much better than that of a single-CPU computer. In this performance test, the same code was run four times, using 1, 2, 4, and 8 processors. The results show that with additional processors, the time required to solve the problem decreases; the calculation time is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small-scale cluster computer from common hardware which is capable of higher computing power than a single-CPU computer; this can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics.
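
    As an illustration of the communication test described above: the study's program was written in C, but a minimal equivalent sketch in Python using the mpi4py package (our substitution; the original code is not reproduced in the abstract) looks like this.

        # hello_mpi.py -- minimal cluster communication test
        # Run with, e.g.: mpiexec -n 8 python hello_mpi.py
        from mpi4py import MPI

        comm = MPI.COMM_WORLD            # communicator spanning all started processes
        rank = comm.Get_rank()           # this process's id, 0..size-1
        size = comm.Get_size()           # total number of processes
        node = MPI.Get_processor_name()  # hostname of the node running this process

        print(f"Hello from rank {rank} of {size} on node {node}")

    If every rank prints its line, messages are passing across both machines, which is exactly what the study's communication test checks.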

  6. Computer-Aided dispatching system design specification

    International Nuclear Information System (INIS)

    Briggs, M.G.

    1996-01-01

    This document defines the performance requirements for a graphic display dispatching system to support Hanford Patrol emergency response. This system is defined as a commercial-off-the-shelf computer dispatching system providing both text and graphical display information while interfacing with the diverse reporting systems within the Hanford Facility. This system also provides expansion capabilities to integrate Hanford Fire and the Occurrence Notification Center, and provides back-up capabilities for the Plutonium Processing Facility

  7. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE... In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology.

  8. Development of a Computer Writing System Based on EOG.

    Science.gov (United States)

    López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian

    2017-06-26

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.
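
    The signal-processing subsystem (noise minimization and classification) can be sketched as follows. This is a minimal Python illustration, not the authors' implementation: it assumes a sampled horizontal EOG channel and classifies left/right saccades with a moving-average filter plus an amplitude threshold; the threshold value and polarity convention are hypothetical.

        import numpy as np

        def classify_eog(samples, fs=250, win=0.04, threshold=100.0):
            """Classify a horizontal EOG channel into 'left'/'right'/'rest'.
            samples: raw amplitudes (microvolts); fs: sampling rate (Hz);
            win: moving-average window (s); threshold: saccade amplitude
            in microvolts (hypothetical value)."""
            k = max(1, int(win * fs))
            smooth = np.convolve(samples, np.ones(k) / k, mode="same")  # denoise
            dev = smooth - np.median(smooth)          # deviation from resting level
            events = np.full(len(dev), "rest", dtype=object)
            events[dev > threshold] = "right"         # polarity is an assumption
            events[dev < -threshold] = "left"
            return events

    The classified events then drive the graphical interface, e.g. moving a cursor over an on-screen keyboard.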

  9. Development of a Computer Writing System Based on EOG

    Directory of Open Access Journals (Sweden)

    Alberto López

    2017-06-01

    Full Text Available The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.

  10. Development of 3-D Radiosurgery Planning System Using IBM Personal Computer

    International Nuclear Information System (INIS)

    Suh, Tae Suk; Park, Charn Il; Ha, Sung Whan; Kang, Wee Saing; Suh, Doug Young; Park, Sung Hun

    1993-01-01

    Recently, stereotactic radiosurgery planning has come to require 3-D image information and dose distributions. A project to develop LINAC-based stereotactic radiosurgery has been under way since April 1991. The purpose of this research is to develop a 3-D radiosurgery planning system using a personal computer. The research proceeded in two steps. The first step was to develop a 3-D localization system, which inputs the image information of the patient, the coordinate transformation, the position and shape of the target, and the patient contour into the computer system using CT images and a stereotactic frame. The second step was to develop a 3-D dose planning system, which computes the dose distribution on the image plane, displays both the isodose distribution and the patient image simultaneously on a high-resolution monitor, and provides a menu-driven planning environment. This prototype radiosurgery planning system was recently applied to several clinical cases. It was shown that our planning system is fast, accurate and efficient, while making it possible to handle various kinds of image modalities such as angiography, CT and MRI. It also makes it possible to develop a general 3-D planning system using beam's eye view or CT simulation in radiation therapy in future

  11. Development of optimized techniques and requirements for computer enhancement of structural weld radiographs. Volume 1: Technical report

    Science.gov (United States)

    Adams, J. R.; Hawley, S. W.; Peterson, G. R.; Salinger, S. S.; Workman, R. A.

    1971-01-01

    A hardware and software specification covering requirements for the computer enhancement of structural weld radiographs was considered. Three scanning systems were used to digitize more than 15 weld radiographs. The performance of these systems was evaluated by determining modulation transfer functions and noise characteristics. Enhancement techniques were developed and applied to the digitized radiographs. The scanning parameters of spot size and spacing and film density were studied to optimize the information content of the digital representation of the image.

  12. Development of COMPAS, computer aided process flowsheet design and analysis system of nuclear fuel reprocessing

    International Nuclear Information System (INIS)

    Homma, Shunji; Sakamoto, Susumu; Takanashi, Mitsuhiro; Nammo, Akihiko; Satoh, Yoshihiro; Soejima, Takayuki; Koga, Jiro; Matsumoto, Shiro

    1995-01-01

    A computer aided process flowsheet design and analysis system, COMPAS, has been developed in order to carry out flowsheet calculations on the process flow diagram of nuclear fuel reprocessing. All equipment, such as dissolvers, mixer-settlers, and so on, in the process flowsheet diagram is graphically visualized as icons on a bitmap display of a UNIX workstation. Drawing a flowsheet can be carried out easily by mouse operation. Not only published numerical simulation codes but also a user's original code can be used with COMPAS. Specifications of the equipment and the concentrations of components in the streams, displayed as tables, can be edited by the user. Results of calculations can also be displayed graphically. Two examples show that COMPAS is applicable to deciding the operating conditions of the Purex process and to analyzing extraction behavior in a mixer-settler extractor. (author)

  13. Development and Use of Engineering Standards for Computational Fluid Dynamics for Complex Aerospace Systems

    Science.gov (United States)

    Lee, Hyung B.; Ghia, Urmila; Bayyuk, Sami; Oberkampf, William L.; Roy, Christopher J.; Benek, John A.; Rumsey, Christopher L.; Powers, Joseph M.; Bush, Robert H.; Mani, Mortaza

    2016-01-01

    Computational fluid dynamics (CFD) and other advanced modeling and simulation (M&S) methods are increasingly relied on for predictive performance, reliability and safety of engineering systems. Analysts, designers, decision makers, and project managers, who must depend on simulation, need practical techniques and methods for assessing simulation credibility. The AIAA Guide for Verification and Validation of Computational Fluid Dynamics Simulations (AIAA G-077-1998 (2002)), originally published in 1998, was the first engineering standards document available to the engineering community for verification and validation (V&V) of simulations. Much progress has been made in these areas since 1998. The AIAA Committee on Standards for CFD is currently updating this Guide to incorporate in it the important developments that have taken place in V&V concepts, methods, and practices, particularly with regard to the broader context of predictive capability and uncertainty quantification (UQ) methods and approaches. This paper will provide an overview of the changes and extensions currently underway to update the AIAA Guide. Specifically, a framework for predictive capability will be described for incorporating a wide range of error and uncertainty sources identified during the modeling, verification, and validation processes, with the goal of estimating the total prediction uncertainty of the simulation. The Guide's goal is to provide a foundation for understanding and addressing major issues and concepts in predictive CFD. However, this Guide will not recommend specific approaches in these areas as the field is rapidly evolving. It is hoped that the guidelines provided in this paper, and explained in more detail in the Guide, will aid in the research, development, and use of CFD in engineering decision-making.

  14. COMPUTER MODELING IN THE DEVELOPMENT OF ARTIFICIAL VENTRICLES OF HEART

    Directory of Open Access Journals (Sweden)

    L. V. Belyaev

    2011-01-01

    Full Text Available In this article, modern research on the development of artificial ventricles of the heart is described. The advantages of applying computer (CAD/CAE) technologies to the development of artificial ventricles of the heart are shown. The systems developed with the application of these technologies are presented.

  15. Computational simulation of concurrent engineering for aerospace propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulations methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  16. Computational simulation for concurrent engineering of aerospace propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  17. Discovery and Development of ATP-Competitive mTOR Inhibitors Using Computational Approaches.

    Science.gov (United States)

    Luo, Yao; Wang, Ling

    2017-11-16

    The mammalian target of rapamycin (mTOR) is a central controller of cell growth, proliferation, metabolism, and angiogenesis. This protein is an attractive target for new anticancer drug development. Significant progress has been made in hit discovery, lead optimization, drug candidate development and determination of the three-dimensional (3D) structure of mTOR. Computational methods have been applied to accelerate the discovery and development of mTOR inhibitors, helping to model the structure of mTOR, screen compound databases, uncover structure-activity relationships (SAR), optimize the hits, mine privileged fragments and design focused libraries. Computational approaches have also been applied to study protein-ligand interaction mechanisms and in natural product-driven drug discovery. Herein, we survey the most recent progress on the application of computational approaches to advance the discovery and development of compounds targeting mTOR. Future directions in the discovery of new mTOR inhibitors using computational methods are also discussed.

  18. Design of Deinococcus radiodurans thioredoxin reductase with altered thioredoxin specificity using computational alanine mutagenesis

    OpenAIRE

    Obiero, Josiah; Sanders, David AR

    2011-01-01

    In this study, the X-ray crystal structure of the complex between Escherichia coli thioredoxin reductase (EC TrxR) and its substrate thioredoxin (Trx) was used as a guide to design a Deinococcus radiodurans TrxR (DR TrxR) mutant with altered Trx specificity. Previous studies have shown that TrxRs have higher affinity for cognate Trxs (same species) than that for Trxs from different species. Computational alanine scanning mutagenesis and visual inspection of the EC TrxR–Trx interface suggested...

  19. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students' music learning, the authors proposed a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. To this end, we conducted an in-depth analysis of computer-enabled music learning and the current status of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and yet teach...

  20. Application of a B&W developed computer aided pictorial process planning system to CQMS for manufacturing process control

    International Nuclear Information System (INIS)

    Johanson, D.C.; VandeBogart, J.E.

    1992-01-01

    Babcock & Wilcox (B&W) will utilize its internally developed Computer Aided Pictorial Process Planning, or CAPPP (pronounced "cap cubed"), system to create a paperless manufacturing environment for the Collider Quadrupole Magnets (CQM). The CAPPP system consists of networked personal computer hardware and software used to: (1) generate and maintain the documents necessary for product fabrication, (2) communicate the information contained in these documents to the production floor, and (3) obtain quality assurance and manufacturing feedback information from the production floor. The purpose of this paper is to describe the various components of the CAPPP system and explain their applicability to product fabrication, specifically quality assurance functions

  1. Trends and developments in computational geometry

    NARCIS (Netherlands)

    Berg, de M.

    1997-01-01

    This paper discusses some trends and achievements in computational geometry during the past five years, with emphasis on problems related to computer graphics. Furthermore, a direction of research in computational geometry is discussed that could help in bringing the fields of computational geometry

  2. Computer-aided dispatching system design specification

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, M.G.

    1997-12-16

    This document defines the performance requirements for a graphic display dispatching system to support the Hanford Patrol Operations Center. This document reflects the as-built requirements for the system that was delivered by GTE Northwest, Inc. This system provided a commercial off-the-shelf computer-aided dispatching system and alarm monitoring system currently in operation at the Hanford Patrol Operations Center, Building 2721E. This system also provides alarm back-up capability for the Plutonium Finishing Plant (PFP).

  3. Computer-aided dispatching system design specification

    International Nuclear Information System (INIS)

    Briggs, M.G.

    1997-01-01

    This document defines the performance requirements for a graphic display dispatching system to support the Hanford Patrol Operations Center. This document reflects the as-built requirements for the system that was delivered by GTE Northwest, Inc. This system provided a commercial off-the-shelf computer-aided dispatching system and alarm monitoring system currently in operation at the Hanford Patrol Operations Center, Building 2721E. This system also provides alarm back-up capability for the Plutonium Finishing Plant (PFP)

  4. Developing Decision-Making Skill: Experiential Learning in Computer Games

    OpenAIRE

    Kurt A. April; Katja M. J. Goebel; Eddie Blass; Jonathan Foster-Pedley

    2012-01-01

    This paper explores the value that computer and video games bring to learning and leadership and explores how games work as learning environments and the impact they have on personal development. The study looks at decisiveness, decision-making ability and styles, and on how this leadership-related skill is learnt through different paradigms. The paper compares the learning from a lecture to the learning from a designed computer game, both of which have the same content through the use of a s...

  5. High performance computing on vector systems

    CERN Document Server

    Roller, Sabine

    2008-01-01

    Presents the developments in high-performance computing and simulation on modern supercomputer architectures. This book covers trends in hardware and software development in general and specifically the vector-based systems and heterogeneous architectures. It presents innovative fields like coupled multi-physics or multi-scale simulations.

  6. Development of risk-based computer models for deriving criteria on residual radioactivity and recycling

    International Nuclear Information System (INIS)

    Chen, S.Y.

    1994-01-01

    Argonne National Laboratory (ANL) is developing multimedia environmental pathway and health risk computer models to assess radiological risks to human health and to derive cleanup guidelines for environmental restoration, decommissioning, and recycling activities. These models are based on the existing RESRAD code, although each has a separate design and serves different objectives. Two such codes are RESRAD-BUILD and RESRAD-PROBABILISTIC. The RESRAD code was originally developed to implement the US Department of Energy's (DOE's) residual radioactive materials guidelines for contaminated soils. RESRAD has been successfully used by DOE and its contractors to assess health risks and develop cleanup criteria for several sites selected for cleanup or restoration programs. RESRAD-BUILD analyzes human health risks from radioactive releases during decommissioning or rehabilitation of contaminated buildings. Risks to workers are assessed for dismantling activities; risks to the public are assessed for occupancy. RESRAD-BUILD is based on a room compartmental model analyzing the effects on room air quality of contaminant emission and resuspension (as well as radon emanation), the external radiation pathway, and other exposure pathways. RESRAD-PROBABILISTIC, currently under development, is intended to perform uncertainty analysis for RESRAD by using the Monte Carlo approach based on the Latin-Hypercube sampling scheme. The codes being developed at ANL are tailored to meet a specific objective of human health risk assessment and require specific parameter definition and data gathering. The combined capabilities of these codes satisfy various risk assessment requirements in environmental restoration and remediation activities

  7. Development of risk-based computer models for deriving criteria on residual radioactivity and recycling

    International Nuclear Information System (INIS)

    Chen, Shih-Yew

    1995-01-01

    Argonne National Laboratory (ANL) is developing multimedia environmental pathway and health risk computer models to assess radiological risks to human health and to derive cleanup guidelines for environmental restoration, decommissioning, and recycling activities. These models are based on the existing RESRAD code, although each has a separate design and serves different objectives. Two such codes are RESRAD-BUILD and RESRAD-PROBABILISTIC. The RESRAD code was originally developed to implement the U.S. Department of Energy's (DOE's) residual radioactive materials guidelines for contaminated soils. RESRAD has been successfully used by DOE and its contractors to assess health risks and develop cleanup criteria for several sites selected for cleanup or restoration programs. RESRAD-BUILD analyzes human health risks from radioactive releases during decommissioning or rehabilitation of contaminated buildings. Risks to workers are assessed for dismantling activities; risks to the public are assessed for occupancy. RESRAD-BUILD is based on a room compartmental model analyzing the effects on room air quality of contaminant emission and resuspension (as well as radon emanation), the external radiation pathway, and other exposure pathways. RESRAD-PROBABILISTIC, currently under development, is intended to perform uncertainty analysis for RESRAD by using the Monte Carlo approach based on the Latin-Hypercube sampling scheme. The codes being developed at ANL are tailored to meet a specific objective of human health risk assessment and require specific parameter definition and data gathering. The combined capabilities of these codes satisfy various risk assessment requirements in environmental restoration and remediation activities. (author)
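
    RESRAD-PROBABILISTIC's uncertainty analysis rests on Monte Carlo sampling with the Latin-Hypercube scheme. A minimal Python sketch of that sampling scheme itself (our illustration, not the RESRAD implementation):

        import numpy as np

        def latin_hypercube(n_samples, n_params, seed=None):
            """Return an (n_samples, n_params) array of points in [0, 1).
            Each parameter's range is split into n_samples equal strata;
            each stratum is sampled exactly once, in shuffled order."""
            rng = np.random.default_rng(seed)
            u = (np.arange(n_samples)[:, None]
                 + rng.random((n_samples, n_params))) / n_samples
            for j in range(n_params):
                rng.shuffle(u[:, j])   # decorrelate the strata across parameters
            return u

        # Example: 100 samples of 3 uncertain inputs, later mapped onto
        # physical parameter ranges through inverse CDFs as needed.
        samples = latin_hypercube(100, 3, seed=42)

    Stratifying every input this way covers each parameter's range evenly with far fewer runs than plain random sampling, which is why it suits expensive pathway-model codes.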

  8. Developing criteria for performance-based concrete specifications.

    Science.gov (United States)

    2013-07-01

    For more than 50 years now, concrete technology has advanced, but CDOT specifications for durability have : remained mostly unchanged. The minimum cement content for a given strength is derived from mix design : guidelines that were developed before ...

  9. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Katya Le Blanc; Johanna Oxstrand

    2012-04-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially for human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less well understood application for computer-based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

  10. Developing a personal computer based expert system for radionuclide identification

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Hakulinen, T.T.

    1990-01-01

    Several expert system development tools are available for personal computers today. We have used one of the LISP-based high-end tools for nearly two years in developing an expert system for the identification of gamma sources. The system contains a radionuclide database of 2055 nuclides and 48000 gamma transitions, with a knowledge base of about sixty rules. This application combines a LISP-based inference engine with database management and relatively heavy numerical calculations performed in the C language. The most important feature needed has been the ability to use LISP and C together with the more advanced object-oriented features of the development tool. The main difficulties have been long response times and the large amount (10-16 MB) of computer memory required
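
    The core inference step, matching measured gamma-line energies against a nuclide library, can be sketched in a few lines. This is a schematic Python illustration rather than the LISP/C implementation described above; the two-nuclide library and the energy tolerance are hypothetical.

        # Hypothetical mini-library: nuclide -> principal gamma energies (keV)
        LIBRARY = {"Co-60": [1173.2, 1332.5], "Cs-137": [661.7]}

        def identify(peaks_kev, tolerance=2.0):
            """Score each nuclide by the fraction of its library lines
            found within `tolerance` keV of a measured peak."""
            scores = {}
            for nuclide, lines in LIBRARY.items():
                hits = sum(any(abs(p - e) <= tolerance for p in peaks_kev)
                           for e in lines)
                scores[nuclide] = hits / len(lines)
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        print(identify([661.5, 1173.0, 1332.8]))  # both nuclides rank highly

    A rule-based system layers further checks (branching ratios, escape peaks, parent-daughter relations) on top of such a basic energy-matching score.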

  11. Trends in computer hardware and software.

    Science.gov (United States)

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  12. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    Science.gov (United States)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups through the advantages of open source software and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this analysis, the specific application and deployment of OpenStack for university computer rooms is then described. The experimental results show that OpenStack can be used to deploy a university computer-room cloud efficiently and conveniently, with stable performance and good functional value.
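
    By way of illustration, launching a lab instance through the openstacksdk Python library might look like the sketch below. The cloud profile, image, flavor and network names are placeholders, not values from the paper.

        import openstack

        # Credentials are read from clouds.yaml or OS_* environment variables.
        conn = openstack.connect(cloud="lab-cloud")   # placeholder profile name

        image = conn.compute.find_image("ubuntu-22.04")    # hypothetical image
        flavor = conn.compute.find_flavor("m1.small")      # hypothetical flavor
        network = conn.network.find_network("lab-net")     # hypothetical network

        server = conn.compute.create_server(
            name="student-vm-01",
            image_id=image.id,
            flavor_id=flavor.id,
            networks=[{"uuid": network.id}],
        )
        server = conn.compute.wait_for_server(server)  # block until ACTIVE
        print(server.status)

    Scripting instance creation like this is what makes it practical to stand up, and tear down, a whole room of identical student machines on demand.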

  13. The Experiment Method for Manufacturing Grid Development on Single Computer

    Institute of Scientific and Technical Information of China (English)

    XIAO Youan; ZHOU Zude

    2006-01-01

    In this paper, an experiment method for Manufacturing Grid application system development in a single-personal-computer environment is proposed. The characteristic of the proposed method is that it constructs a full prototype Manufacturing Grid application system hosted on a single personal computer using virtual machine technology. Firstly, it builds all the Manufacturing Grid physical resource nodes on an abstraction layer of a single personal computer with virtual machine technology. Secondly, all the virtual Manufacturing Grid resource nodes are connected with a virtual network and the application software is deployed on each Manufacturing Grid node. We then obtain a prototype Manufacturing Grid application system running on a single personal computer, and can carry out experiments on this foundation. Compared with the known experiment methods for Manufacturing Grid application system development, the proposed method retains their advantages, such as low cost and simple operation, and can produce trustworthy experimental results easily. The Manufacturing Grid application system constructed with the proposed method has high scalability, stability and reliability, and can be migrated to the real application environment rapidly.

  14. Development of a standard for computer program verification and control

    International Nuclear Information System (INIS)

    Dunn, T.E.; Ozer, O.

    1980-01-01

    It is expected that adherence to the guidelines of the ANS 10.4 will: 1. Provide confidence that the program conforms to its requirements specification; 2. Provide confidence that the computer program has been adequately evaluated and tested; 3. Provide confidence that program changes are adequately evaluated, tested, and controlled; and 4. Enhance assurance that reliable data will be produced for engineering, scientific, and safety analysis purposes

  15. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort for ER process optimization is made by using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and oftentimes problematic, the simulation results from these models are inherently uncertain.

  16. The graphics future in scientific applications-trends and developments in computer graphics

    CERN Document Server

    Enderle, G

    1982-01-01

    Computer graphics methods and tools are being used to a great extent in scientific research. The future development in this area will be influenced both by new hardware developments and by software advances. On the hardware sector, the development of the raster technology will lead to the increased use of colour workstations with more local processing power. Colour hardcopy devices for creating plots, slides, or movies will be available at a lower price than today. The first real 3D-workstations will appear on the marketplace. One of the main activities on the software sector is the standardization of computer graphics systems, graphical files, and device interfaces. This will lead to more portable graphical application programs and to a common base for computer graphics education.

  17. Symbolic math for computation of radiation shielding

    International Nuclear Information System (INIS)

    Suman, Vitisha; Datta, D.; Sarkar, P.K.; Kushwaha, H.S.

    2010-01-01

    Radiation transport calculations for shielding studies in the field of accelerator technology often involve intensive numerical computations. Traditionally, the radiation transport equation is solved using a finite difference scheme or an advanced finite element method with respect to specific initial and boundary conditions suitable for the geometry of the problem. All these computations need CPU-intensive computer codes for the accurate calculation of scalar and angular fluxes. Computation that treats the symbols of the analytical expression representing the transport equation as objects is an enhanced numerical technique in which the computation is completely algorithm- and data-oriented. An algorithm based on this symbolic math architecture was developed using the Symbolic Math Toolbox of the MATLAB software. The present paper describes the symbolic math algorithm and its application in a case study in which the shielding calculation for a rectangular slab geometry is studied for a line source of specific activity. Applying symbolic math in this domain opens a new paradigm compared to existing computer codes such as DORT. (author)
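
    As a small illustration of the symbolic approach (written here with Python's SymPy in place of the MATLAB Symbolic Math Toolbox the authors used), the slab attenuation kernel can be kept symbolic until the final numeric evaluation; the parameter values below are hypothetical.

        import sympy as sp

        mu, t, theta, I0 = sp.symbols("mu t theta I_0", positive=True)

        # Point-kernel attenuation through a slab of thickness t for a ray
        # crossing at angle theta: the slant path length is t/cos(theta).
        I = I0 * sp.exp(-mu * t / sp.cos(theta))

        # The expression stays symbolic, so it can be manipulated exactly,
        # e.g. differentiated with respect to the slab thickness:
        dI_dt = sp.simplify(sp.diff(I, t))

        # Numbers are bound only at the end (hypothetical values):
        f = sp.lambdify((mu, t, theta, I0), I, "math")
        print(f(0.2, 10.0, 0.0, 1.0))   # transmission for a normal ray

    Summing or integrating such kernels over the segments of a line source then gives the slab-shielded flux, with the algebra carried out on symbols rather than on discretized arrays.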

  18. An ODP computational model of a cooperative binding object

    Science.gov (United States)

    Logé, Christophe; Najm, Elie; Chen, Ken

    1997-12-01

    A next generation of systems that should appear will have to manage simultaneously several geographically distributed users. These systems belong to the class of computer-supported cooperative work systems (CSCW). The development of such complex systems requires rigorous development methods and flexible open architectures. Open distributed processing (ODP) is a standardization effort that aims at providing such architectures. ODP features appropriate abstraction levels and a clear articulation between requirements, programming and infrastructure support. ODP advocates the use of formal methods for the specification of systems and components. The computational model, an object-based model, one of the abstraction levels identified within ODP, plays a central role in the global architecture. In this model, basic objects can be composed with communication and distribution abstractions (called binding objects) to form a computational specification of distributed systems, or applications. Computational specifications can then be mapped (in a mechanism akin to compilation) onto an engineering solution. We use an ODP-inspired method to computationally specify a cooperative system. We start from a general purpose component that we progressively refine into a collection of basic and binding objects. We focus on two issues of a co-authoring application, namely, dynamic reconfiguration and multiview synchronization. We discuss solutions for these issues and formalize them using the MT-LOTOS specification language that is currently studied in the ISO standardization formal description techniques group.

  19. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    Models are playing important roles in design and analysis of chemicals based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working...... development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines....... In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse...

  20. Computer Programming Languages for Health Care

    Science.gov (United States)

    O'Neill, Joseph T.

    1979-01-01

    This paper advocates the use of standard high level programming languages for medical computing. It recommends that U.S. Government agencies having health care missions implement coordinated policies that encourage the use of existing standard languages and the development of new ones, thereby enabling them and the medical computing community at large to share state-of-the-art application programs. Examples are based on a model that characterizes language and language translator influence upon the specification, development, test, evaluation, and transfer of application programs.

  1. Initial explorations of ARM processors for scientific computing

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Muzaffar, Shahzad

    2014-01-01

    Power efficiency is becoming an ever more important metric for both high performance and high throughput computing. Over the course of the next decade it is expected that flops/watt will be a major driver for the evolution of computer architecture. Servers with large numbers of ARM processors, already ubiquitous in mobile computing, are a promising alternative to traditional x86-64 computing. We present the results of our initial investigations into the use of ARM processors for scientific computing applications. In particular, we report the results from our work with a current generation ARMv7 development board to explore ARM-specific issues regarding the software development environment, operating system, performance benchmarks and issues for porting High Energy Physics software

  2. Development of computer program for safety of nuclear power plant against tsunami

    International Nuclear Information System (INIS)

    Jin, S. B.; Choi, K. R.; Lee, S. K.; Cho, Y. S.

    2001-01-01

    The main objective of this study is the development of a computer program to check the safety of nuclear power plants along the coastline of the Korean Peninsula. The computer program describes the propagation and associated run-up process of tsunamis by solving the linear and nonlinear shallow-water equations with finite difference methods. The computer program has been applied to several idealized and simplified problems. The numerical solutions obtained are compared to existing, available solutions and measurements, and a very good agreement between the numerical solutions and the existing measurements is observed. The computer program developed in this study can be used in the safety analysis of nuclear power plants against tsunamis. The program can also be used to study the propagation of tsunamis over long distances, and the associated run-up and run-down process along a shoreline. Furthermore, the computer program can be used to provide proper design criteria for coastal facilities and structures
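
    A minimal sketch of the finite-difference approach for the linear 1-D shallow-water equations (eta_t + h u_x = 0, u_t + g eta_x = 0) on a staggered grid is given below; the grid size, depth and initial hump are illustrative, and this is not the program described above.

        import numpy as np

        g, h = 9.81, 4000.0              # gravity; uniform ocean depth (m)
        nx, dx = 400, 5000.0             # number of cells and spacing (m)
        dt = 0.5 * dx / np.sqrt(g * h)   # time step satisfying the CFL limit

        x = np.arange(nx) * dx
        eta = np.exp(-((x - 1.0e6) / 5.0e4) ** 2)  # initial sea-surface hump
        u = np.zeros(nx + 1)             # velocities on staggered cell faces

        for step in range(500):
            # momentum: u_t = -g * eta_x  (interior faces; walls at the ends)
            u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
            # continuity: eta_t = -h * u_x
            eta -= h * dt / dx * (u[1:] - u[:-1])

        print(f"max surface elevation after {step + 1} steps: {eta.max():.3f} m")

    Nonlinear terms, variable bathymetry and a moving-shoreline (run-up) treatment are what a production code adds on top of this core update loop.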

  3. Computer applications in radiation protection

    International Nuclear Information System (INIS)

    Cole, P.R.; Moores, B.M.

    1995-01-01

    Computer applications in general, and in diagnostic radiology in particular, are becoming more widespread. Their application to the field of radiation protection in medical imaging, including quality control initiatives, is similarly becoming more widespread. Advances in computer technology have enabled departments of diagnostic radiology to have access to powerful yet affordable personal computers. The application of databases, expert systems and computer-based learning is under way. The executive information systems for the management of dose and QA data that are under way at IRS are discussed. An important consideration in developing these pragmatic software tools has been the range of computer literacy within the end-user group. User interfaces have been specifically designed to reflect the requirements of the many end users who will have little or no computer knowledge. (Author)

  4. Utilizing Computational Probabilistic Methods to Derive Shock Specifications in a Nondeterministic Environment

    Energy Technology Data Exchange (ETDEWEB)

    FIELD JR.,RICHARD V.; RED-HORSE,JOHN R.; PAEZ,THOMAS L.

    2000-10-25

    One of the key elements of the Stochastic Finite Element Method, namely the polynomial chaos expansion, has been utilized in a nonlinear shock and vibration application. As a result, the computed response was expressed as a random process, which is an approximation to the true solution process and can be thought of as a generalization of solutions given as statistics only. This approximation to the response process was then used to derive an analytically-based design specification for component shock response that guarantees a balanced level of marginal reliability. Hence, this analytically-based reference SRS might lead to an improvement over the somewhat ad hoc test-based reference in the sense that it will not exhibit regions of conservativeness, nor lead to overtesting of the design.
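
    For reference, the polynomial chaos expansion mentioned above represents the random response as a spectral series in standard random variables; a generic form (our notation, not taken from the report) is

        u(\theta) \approx \sum_{i=0}^{P} u_i \, \Psi_i\bigl(\xi(\theta)\bigr),
        \qquad
        u_i = \frac{\langle u \, \Psi_i \rangle}{\langle \Psi_i^2 \rangle},

    where the \Psi_i are orthogonal (e.g. Hermite) polynomials in the standard Gaussian variables \xi and \langle \cdot \rangle denotes expectation. The statistics of the response, and hence a shock response spectrum level with a balanced marginal reliability, follow from the coefficients u_i.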

  5. Computational and mathematical methods in brain atlasing.

    Science.gov (United States)

    Nowinski, Wieslaw L

    2017-12-01

    Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.

  6. Formulation, development and evaluation of colon-specific ketorolac ...

    African Journals Online (AJOL)

    The major intention to formulate and develop colon targeted tablets is to improve the therapeutic efficacy by increasing therapeutic drug concentrations in colon. The present study was aimed to develop guar gum compression coated tablets ketorolac tromethamine to achieve the colon-specific drug release. In this study ...

  7. Developing a New Computer Game Attitude Scale for Taiwanese Early Adolescents

    Science.gov (United States)

    Liu, Eric Zhi-Feng; Lee, Chun-Yi; Chen, Jen-Huang

    2013-01-01

    With ever increasing exposure to computer games, gaining an understanding of the attitudes held by young adolescents toward such activities is crucial; however, few studies have provided scales with which to accomplish this. This study revisited the Computer Game Attitude Scale developed by Chappell and Taylor in 1997, reworking the overall…

  8. Cloud computing development in Armenia

    Directory of Open Access Journals (Sweden)

    Vazgen Ghazaryan

    2014-10-01

    Full Text Available Purpose – The purpose of the research is to clarify the benefits and risks, with regard to data protection and cost, that a business can obtain by using these new technologies for the implementation and management of an organization's information systems. Design/methodology/approach – Qualitative case study of the results obtained via interviews. Three research questions were raised: Q1: How can a company benefit from using Cloud Computing compared to other solutions?; Q2: What are possible issues that occur with Cloud Computing?; Q3: How would Cloud Computing change an organization's IT infrastructure? Findings – The calculations provided in the interview section prove the financial advantages, even though the precise degree of flexibility and performance has not been assessed. Cloud Computing offers great scalability. Another benefit that Cloud Computing offers, in addition to better performance and flexibility, is reliable and simple backup data storage, physically distributed and so almost invulnerable to damage. Although the advantages of Cloud Computing more than compensate for the difficulties associated with it, the latter must be carefully considered. Since the cloud architecture is relatively new, so far the best guarantee against all the risks it entails, from a single company's perspective, is a well-formulated service-level agreement, where the terms of service and the shared responsibility and security roles between the client and the provider are defined. Research limitations/implications – The study was carried out on the basis of two companies, which gives a deeper view, but for more widely applicable results a wider analysis is necessary. Originality/value – The novelty of the research stems from the fact that existing approaches to this problem mainly focus on the technical side of computing. Research type: case study

  9. Computer codes developed in FRG to analyse hypothetical meltdown accidents

    International Nuclear Information System (INIS)

    Hassmann, K.; Hosemann, J.P.; Koerber, H.; Reineke, H.

    1978-01-01

    It is the purpose of this paper to give the status of all significant computer codes developed in the core melt-down project which is incorporated in the light water reactor safety research program of the Federal Ministry of Research and Technology. For standard pressurized water reactors, results of some computer codes will be presented, describing the course and the duration of the hypothetical core meltdown accident. (author)

  10. Amorphous computing in the presence of stochastic disturbances.

    Science.gov (United States)

    Chu, Dominique; Barnes, David J; Perkins, Samuel

    2014-11-01

    Amorphous computing is a non-standard computing paradigm that relies on massively parallel execution of computer code by a large number of small, spatially distributed, weakly interacting processing units. Over the last decade or so, amorphous computing has attracted a great deal of interest both as an alternative model of computing and as an inspiration to understand developmental biology. A number of algorithms have been developed that can take advantage of the massive parallelism of this computing paradigm to solve specific problems. One of the interesting properties of amorphous computers is that they are robust with respect to the loss of individual processing units, in the sense that a removal of some of them should not impact on the computation as a whole. However, much less understood is to what extent amorphous computers are robust with respect to minor disturbances to the individual processing units, such as random motion or occasional faulty computation short of total component failure. In this article we address this question. As an example problem we choose an algorithm to calculate a straight line between two points. Using this example, we find that amorphous computers are not in general robust with respect to Brownian motion and noise, but we find strategies that restore reliable computation even in their presence. We will argue that these strategies are generally applicable and not specific to the particular AC we consider, or even specific to electronic computers. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
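
    The line-drawing example problem is, in its classic amorphous-computing formulation, a hop-count gradient flooded from one endpoint and then followed downhill from the other. A compact Python sketch of that baseline algorithm (ours, with illustrative parameters, not the paper's code):

        import random

        # Scatter processing units at random in the unit square.
        pts = [(random.random(), random.random()) for _ in range(800)]
        R = 0.08                            # local communication radius
        A, B = pts[0], pts[1]               # endpoints of the desired line

        def neighbors(p):
            return [q for q in pts if q != p
                    and (p[0]-q[0])**2 + (p[1]-q[1])**2 <= R*R]

        # Phase 1: flood a hop-count gradient outward from B.
        grad, frontier = {B: 0}, [B]
        while frontier:
            nxt = []
            for p in frontier:
                for q in neighbors(p):
                    if q not in grad:
                        grad[q] = grad[p] + 1
                        nxt.append(q)
            frontier = nxt

        # Phase 2: from A, repeatedly hop to a neighbor with a smaller
        # gradient value; the visited units mark themselves as the line.
        line, p = [A], A
        while grad.get(p, 0) > 0:
            p = min(neighbors(p), key=lambda q: grad.get(q, 10**9))
            line.append(p)

        print(f"line marked by {len(line)} units")

    Brownian motion of the units and noisy hop counts perturb exactly these two phases, which is what makes the restorative strategies investigated in the article necessary.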

  11. NET-COMPUTER: Internet Computer Architecture and its Application in E-Commerce

    Directory of Open Access Journals (Sweden)

    P. O. Umenne

    2012-12-01

    Full Text Available Research in Intelligent Agents has yielded interesting results, some of which have been translated into commercial ventures. Intelligent Agents are executable software components that represent the user, perform tasks on behalf of the user and, when the task terminates, send the result to the user. Intelligent Agents are best suited for the Internet: a collection of computers connected together in a world-wide computer network. Swarm and HYDRA computer architectures for Agents' execution were developed at the University of Surrey, UK in the 90s. The objective of the research was to develop a software-based computer architecture on which Agents' execution could be explored. The combination of Intelligent Agents and the HYDRA computer architecture gave rise to a new computer concept: the NET-Computer, in which the computing resources reside on the Internet. The Internet computers form the hardware and software resources, and the user is provided with a simple interface to access the Internet and run user tasks. The Agents autonomously roam the Internet (the NET-Computer), executing the tasks. A growing segment of the Internet is E-Commerce for online shopping for products and services. The Internet computing resources provide a marketplace for product suppliers and consumers alike. Consumers are looking for suppliers selling products and services, while suppliers are looking for buyers. Searching the vast amount of information available on the Internet causes a great deal of problems for both consumers and suppliers. Intelligent Agents executing on the NET-Computer can surf through the Internet and select specific information of interest to the user. The simulation results show that Intelligent Agents executing on the HYDRA computer architecture could be applied in E-Commerce.

  12. Computer-Aided dispatching system design specification

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, M.G.

    1996-09-27

    This document defines the performance requirements for a graphic display dispatching system to support Hanford Patrol emergency response. This document outlines the negotiated requirements as agreed to by GTE Northwest during technical contract discussions. The system defined is a commercial off-the-shelf computer dispatching system providing both text and graphic display information while interfacing with the diverse alarm reporting systems within the Hanford Site. This system provides expansion capability to integrate Hanford Fire and the Occurrence Notification Center. The system also provides back-up capability for the Plutonium Finishing Plant (PFP).

  13. Automated Development of Accurate Algorithms and Efficient Codes for Computational Aeroacoustics

    Science.gov (United States)

    Goodrich, John W.; Dyson, Rodger W.

    1999-01-01

    The simulation of sound generation and propagation in three space dimensions with realistic aircraft components is a very large time dependent computation with fine details. Simulations in open domains with embedded objects require accurate and robust algorithms for propagation, for artificial inflow and outflow boundaries, and for the definition of geometrically complex objects. The development, implementation, and validation of methods for solving these demanding problems is being done to support the NASA pillar goals for reducing aircraft noise levels. Our goal is to provide algorithms which are sufficiently accurate and efficient to produce usable results rapidly enough to allow design engineers to study the effects on sound levels of design changes in propulsion systems, and in the integration of propulsion systems with airframes. There is a lack of design tools for these purposes at this time. Our technical approach to this problem combines the development of new, algorithms with the use of Mathematica and Unix utilities to automate the algorithm development, code implementation, and validation. We use explicit methods to ensure effective implementation by domain decomposition for SPMD parallel computing. There are several orders of magnitude difference in the computational efficiencies of the algorithms which we have considered. We currently have new artificial inflow and outflow boundary conditions that are stable, accurate, and unobtrusive, with implementations that match the accuracy and efficiency of the propagation methods. The artificial numerical boundary treatments have been proven to have solutions which converge to the full open domain problems, so that the error from the boundary treatments can be driven as low as is required. The purpose of this paper is to briefly present a method for developing highly accurate algorithms for computational aeroacoustics, the use of computer automation in this process, and a brief survey of the algorithms that

  14. Integrated computer-aided design in automotive development development processes, geometric fundamentals, methods of CAD, knowledge-based engineering data management

    CERN Document Server

    Mario, Hirz; Gfrerrer, Anton; Lang, Johann

    2013-01-01

    The automotive industry faces constant pressure to reduce development costs and time while still increasing vehicle quality. To meet this challenge, engineers and researchers in both science and industry are developing effective strategies and flexible tools by enhancing and further integrating powerful, computer-aided design technology. This book provides a valuable overview of the development tools and methods of today and tomorrow. It is targeted not only towards professional project and design engineers, but also to students and to anyone who is interested in state-of-the-art computer-aided development. The book begins with an overview of automotive development processes and the principles of virtual product development. Focusing on computer-aided design, a comprehensive outline of the fundamentals of geometry representation provides a deeper insight into the mathematical techniques used to describe and model geometrical elements. The book then explores the link between the demands of integrated design pr...

  15. Risk-based analysis methods applied to nuclear power plant technical specifications

    International Nuclear Information System (INIS)

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1989-01-01

    A computer-aided methodology and practical applications of risk-based evaluation of technical specifications are described. The methodology, developed for use by the utility industry, is part of the overall process of improving nuclear power plant technical specifications. The SOCRATES computer program uses the results of a probabilistic risk assessment or a system-level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specifications. The program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at the plant. The methods and computer program are an integral part of a larger decision process aimed at determining the benefits of technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns and decreasing labor requirements for test and maintenance activities, with no adverse impact on risk. The methodology and the SOCRATES computer program have been used extensively to evaluate several actual technical specifications in case studies demonstrating the methods. Summaries of these applications illustrate the types of results achieved and the usefulness of risk-based evaluation in improving technical specifications.
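
    The arithmetic behind such evaluations can be indicated with a simplified textbook model (not the SOCRATES code itself): a periodically tested standby component has a mean unavailability of roughly the per-demand failure probability plus λT/2, and its contribution to core damage frequency scales with a risk importance measure. All numbers below are hypothetical.

        # Simplified sketch of a risk-based technical-specification evaluation
        # (a textbook standby-failure model, not the SOCRATES code). All values
        # are hypothetical.
        def mean_unavailability(lam, sti_hours, rho=1e-3):
            """Average unavailability of a standby component tested every STI hours:
            per-demand failure probability rho plus lambda*T/2 between tests."""
            return rho + lam * sti_hours / 2.0

        lam = 1e-5          # standby failure rate [1/h], hypothetical
        cdf_base = 2e-5     # baseline core damage frequency [1/yr], hypothetical
        birnbaum = 4e-4     # Birnbaum importance of the component [1/yr], hypothetical

        q_old = mean_unavailability(lam, sti_hours=720)    # monthly test
        q_new = mean_unavailability(lam, sti_hours=2160)   # quarterly test

        delta_cdf = birnbaum * (q_new - q_old)
        print(f"q: {q_old:.2e} -> {q_new:.2e}")
        print(f"delta CDF = {delta_cdf:.2e}/yr ({100*delta_cdf/cdf_base:.2f}% of baseline)")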

  16. A remote sensing computer-assisted learning tool developed using the unified modeling language

    Science.gov (United States)

    Friedrich, J.; Karslioglu, M. O.

    The goal of this work has been to create an easy-to-use and simple-to-make learning tool for remote sensing at an introductory level. Many students struggle to grasp even seemingly basic concepts such as digital images, image processing and image arithmetic. Because professional programs are generally too complex and overwhelming for beginners, and often not tailored to the specific needs of a course regarding functionality, a computer-assisted learning (CAL) program was developed based on the unified modeling language (UML), the present standard for object-oriented (OO) system development. A major advantage of this approach is an easier transition from modeling to coding of such an application, if modern UML tools are used. After introducing the constructed UML model, its implementation is briefly described, followed by a series of learning exercises. They illustrate how the resulting CAL tool supports students taking an introductory course in remote sensing at the authors' institution.
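
    The image arithmetic such a CAL tool teaches reduces to elementwise operations on pixel arrays. A minimal illustration (not code from the tool itself) also shows the classic beginner pitfall of 8-bit wrap-around:

        # Minimal illustration of introductory image arithmetic (not code from
        # the CAL tool itself): band differencing and a ratio index on
        # synthetic 8-bit "satellite" bands.
        import numpy as np

        rng = np.random.default_rng(0)
        red = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
        nir = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)

        diff = nir.astype(np.int16) - red.astype(np.int16)  # widen first to avoid uint8 wrap-around
        ndvi = (nir.astype(float) - red) / (nir.astype(float) + red + 1e-9)

        print("difference:\n", diff)
        print("NDVI-like ratio:\n", np.round(ndvi, 2))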

  17. The use of micro-computers in the simulation of ion beam optics

    International Nuclear Information System (INIS)

    Spaedtke, P.; Ivens, D.

    1989-01-01

    With computer simulation codes, specific problems of ion beam optics can be studied, which is useful both in the design of new systems and in the optimization of existing ones. Several such codes have been developed, unfortunately requiring substantial computer resources. Recent advances in mini- and micro-computers have now made it possible to develop simulation codes which can also be run on these small computers. In this paper, some of these codes are presented and their computing times discussed. (author)

  18. Stimulus specificity of a steady-state visual-evoked potential-based brain-computer interface

    Science.gov (United States)

    Ng, Kian B.; Bradley, Andrew P.; Cunnington, Ross

    2012-06-01

    The mechanisms of neural excitation and inhibition in response to a visual stimulus are well studied. It has been established that changing stimulus properties such as luminance contrast or spatial frequency can alter neuronal activity and thus modulate the visual-evoked response. In this paper, we study the effect that stimulus specificity has on the classification performance of a steady-state visual-evoked potential-based brain-computer interface (SSVEP-BCI). For example, we investigate how closely two visual stimuli can be placed before they compete for neural representation in the cortex and thus influence BCI classification accuracy. We characterize stimulus specificity using the four stimulus parameters commonly encountered in SSVEP-BCI design: temporal frequency, spatial size, number of simultaneously displayed stimuli and their spatial proximity. By varying these quantities and measuring the SSVEP-BCI classification accuracy, we are able to determine the parameters that provide optimal performance. Our results show that superior SSVEP-BCI accuracy is attained when stimuli are placed more than 5° apart, with a size that subtends at least 2° of visual angle, and when using a tagging frequency between the high-alpha and beta bands. These findings may assist in deciding the stimulus parameters for optimal SSVEP-BCI design.

  19. Development of a dose assessment computer code for the NPP severe accident

    International Nuclear Information System (INIS)

    Cheong, Jae Hak

    1993-02-01

    A real-time emergency dose assessment computer code called KEDA (KAIST NPP Emergency Dose Assessment) has been developed for NPP severe accidents. A new mathematical model which can calculate cloud shine has been developed and implemented in the code. KEDA considers specifically Korean conditions (complex topography, Korean-specific thyroid metabolism, continuous washout, etc.) and provides dose-monitoring and automatic decision-making functions. To verify the code results, KEDA has been compared with an NRC officially certified code, RASCAL, for eight hypothetical accident scenarios, and has been shown to provide reasonable results. A qualitative sensitivity analysis has also been performed for six potentially important input parameters, and the trends of the dose vs. downwind distance curves have been analyzed in comparison with the physical phenomena occurring in the real atmosphere. The source term and meteorological conditions turned out to be the most important input parameters. KEDA has also been applied to the Kori site, where a hypothetical accident with semi-real meteorological data has been simulated and analyzed.
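
    The summary does not reproduce KEDA's transport model, but the general shape of such real-time dose estimates can be sketched with a standard Gaussian plume, from which cloud-shine or inhalation doses are then derived with dose conversion factors. All values below are hypothetical:

        # Generic Gaussian-plume sketch of the kind of calculation a real-time
        # dose code performs (illustrative only; not KEDA's actual model or data).
        import math

        def chi_over_q(y, z, u, H, sigma_y, sigma_z):
            """Air concentration per unit release rate [s/m^3] at a receptor with
            crosswind offset y and height z; sigma_y/sigma_z are the plume spreads
            evaluated at the receptor's downwind distance."""
            lateral = math.exp(-y**2 / (2 * sigma_y**2))
            vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                        + math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection
            return lateral * vertical / (2 * math.pi * sigma_y * sigma_z * u)

        chi_q = chi_over_q(y=0.0, z=0.0, u=3.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
        Q = 1e10     # release rate [Bq/s], hypothetical
        dcf = 1e-13  # cloud-shine dose-rate factor [Sv/s per Bq/m^3], hypothetical
        print(f"chi/Q = {chi_q:.2e} s/m^3, dose rate ~ {chi_q * Q * dcf:.2e} Sv/s")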

  20. Using Animation to Support the Teaching of Computer Game Development Techniques

    Science.gov (United States)

    Taylor, Mark John; Pountney, David C.; Baskett, M.

    2008-01-01

    In this paper, we examine the potential use of animation for supporting the teaching of some of the mathematical concepts that underlie computer games development activities, such as vector and matrix algebra. An experiment was conducted with a group of UK undergraduate computing students to compare the perceived usefulness of animated and static…

  1. Towards personalised management of atherosclerosis via computational models in vascular clinics: technology based on patient-specific simulation approach

    Science.gov (United States)

    Di Tomaso, Giulia; Agu, Obiekezie; Pichardo-Almarza, Cesar

    2014-01-01

    The development of a new technology based on patient-specific modelling for personalised healthcare in the case of atherosclerosis is presented. Atherosclerosis is the main cause of death in the world and has become a burden on clinical services as it manifests itself in many diverse forms, such as coronary artery disease, cerebrovascular disease/stroke and peripheral arterial disease. It is also a multifactorial, chronic and systemic process that lasts for a lifetime, putting enormous financial and clinical pressure on national health systems. The postulate of this Letter is that new healthcare technologies using computer simulations can, in the future, be developed into in-silico management and support systems. These new technologies will be based on predictive models (including the integration of observations, theories and predictions across a range of temporal and spatial scales, scientific disciplines, key risk factors and anatomical sub-systems) combined with digital patient data and visualisation tools. Although the problem is extremely complex, a simulation workflow and an exemplar application of this type of technology for clinical use are presented, currently being developed by a multidisciplinary team following the requirements and constraints of the Vascular Service Unit at University College Hospital, London. PMID:26609369

  2. Distributed user interfaces for clinical ubiquitous computing applications.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Berglund, Erik; Eriksson, Henrik

    2005-08-01

    Ubiquitous computing with multiple interaction devices requires new interface models that support user-specific modifications to applications and facilitate the fast development of active workspaces. We have developed NOSTOS, a computer-augmented work environment for clinical personnel, to explore new user interface paradigms for ubiquitous computing. NOSTOS uses several devices, such as digital pens, an active desk, and walk-up displays, that allow the system to track documents and activities in the workplace. We present the distributed user interface (DUI) model, which allows standalone applications to distribute their user interface components to several devices dynamically at run-time. This mechanism permits clinicians to develop their own user interfaces and forms for clinical information systems to match their specific needs. We discuss the underlying technical concepts of DUIs and show how service discovery, component distribution, events and layout management are dealt with in the NOSTOS system. Our results suggest that DUIs, and similar network-based user interfaces, will be a prerequisite for future mobile user interfaces and essential for developing clinical multi-device environments.
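
    A toy sketch of the distribution mechanism described (illustrative only; not the NOSTOS implementation) might look as follows: applications publish UI components, and a registry forwards each one to a discovered device whose capabilities match.

        # Toy sketch of a distributed-user-interface mechanism in the spirit of
        # the DUI model described (illustrative; not the NOSTOS implementation).
        from dataclasses import dataclass, field

        @dataclass
        class Device:
            name: str
            capabilities: set
            components: list = field(default_factory=list)

        class DUIRegistry:
            def __init__(self):
                self.devices = []

            def discover(self, device):              # service discovery
                self.devices.append(device)

            def distribute(self, component, needs):  # component distribution
                for dev in self.devices:
                    if needs <= dev.capabilities:
                        dev.components.append(component)
                        return dev.name
                raise LookupError(f"no device offers {needs}")

        registry = DUIRegistry()
        registry.discover(Device("wall-display", {"large-screen", "touch"}))
        registry.discover(Device("digital-pen", {"ink"}))

        print(registry.distribute("lab-results-form", needs={"large-screen"}))
        print(registry.distribute("signature-field", needs={"ink"}))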

  3. Development and evaluation of a computer-based medical work assessment programme

    Directory of Open Access Journals (Sweden)

    Spallek Michael

    2008-12-01

    Background: There are several ways to conduct a job task analysis in medical work environments, including pencil-and-paper observations, interviews and questionnaires. However, these methods entail problems such as high inter-individual deviations and risks of misjudgement; computer-based observation helps to reduce them. The aim of this paper is to give an overview of the development process of a computer-based job task analysis instrument for real-time observations to quantify the job tasks performed by physicians working in different medical settings. In addition, reliability and validity data for this instrument are presented. Methods: The instrument was developed in consecutive steps. First, lists comprising tasks performed by physicians in different care settings were compiled and classified. Afterwards, the content validity of the task lists was verified. After establishing the final task categories, computer software was programmed and implemented on a mobile personal computer. Finally, inter-observer reliability was evaluated: two trained observers simultaneously recorded the tasks of the same physician. Results: Content validity of the task lists was confirmed by observations and by experienced specialists from each medical area. The development process of the job task analysis instrument was completed successfully, and simultaneous records showed adequate inter-rater reliability. Conclusion: Initial results support the validity and reliability of the developed method for assessing physicians' working routines as well as organizational context factors. Based on results obtained with this method, possible improvements in health professionals' work organisation can be identified.

  4. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently set up a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithm improvement via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk, the ATLAS Computing Agora (ACA) web platform will be presented, as well as some of the specific material developed for the projects.

  5. Results from Development of Model Specifications for Multifamily Energy Retrofits

    Energy Technology Data Exchange (ETDEWEB)

    Brozyna, K.

    2012-08-01

    Specifications, modeled after the CSI MasterFormat, provide trade contractors and builders with requirements and recommendations on specific building materials, components and industry practices that comply with the expectations and intent of the requirements of the various funding programs associated with a project. The goal is to create a greater level of consistency in the execution of energy-efficiency retrofit measures across the multiple regions in which a developer may work. IBACOS and Mercy Housing developed sample model specifications based on a common building construction type that Mercy Housing encounters.

  6. Results From Development of Model Specifications for Multifamily Energy Retrofits

    Energy Technology Data Exchange (ETDEWEB)

    Brozyna, Kevin [IBACOS, Inc., Pittsburgh, PA (United States)

    2012-08-01

    Specifications, modeled after the CSI MasterFormat, provide trade contractors and builders with requirements and recommendations on specific building materials, components and industry practices that comply with the expectations and intent of the requirements of the various funding programs associated with a project. The goal is to create a greater level of consistency in the execution of energy-efficiency retrofit measures across the multiple regions in which a developer may work. IBACOS and Mercy Housing developed sample model specifications based on a common building construction type that Mercy Housing encounters.

  7. Computer Sciences Applied to Management at Open University of Catalonia: Development of Competences of Teamworks

    Science.gov (United States)

    Pisa, Carlos Cabañero; López, Enric Serradell

    Teamwork is considered one of the most important professional skills in today's business environment. More specifically, collaborative work between professionals and information technology managers from various functional areas is a strategic key to competitive business. Several university-level programs focus on developing these skills. This article presents the case of the course Computer Science Applied to Management (hereafter CSAM), which was designed with the objective of developing the ability to work cooperatively in interdisciplinary teams. Its design and development addressed the key elements of effectiveness identified in the literature, most notably the establishment of shared objectives and a feedback system, the management of team harmony, and the team's level of autonomy, independence, diversity and supervision. The final result is a course in which, through a virtual working platform, interdisciplinary teams solve a problem raised by a case study.

  8. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Energy Technology Data Exchange (ETDEWEB)

    Brower, Richard [Boston U.; Christ, Norman [Columbia U.; DeTar, Carleton [Utah U.; Edwards, Robert [Jefferson Lab; Mackenzie, Paul [Fermilab

    2017-10-30

    In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community, with significant collaborators abroad, are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  9. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Science.gov (United States)

    Brower, Richard; Christ, Norman; DeTar, Carleton; Edwards, Robert; Mackenzie, Paul

    2018-03-01

    In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community, with significant collaborators abroad, are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  10. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Directory of Open Access Journals (Sweden)

    Brower Richard

    2018-01-01

    In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community, with significant collaborators abroad, are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  11. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    Science.gov (United States)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  12. A Brain–Computer Interface for Potential Nonverbal Facial Communication Based on EEG Signals Related to Specific Emotions

    Directory of Open Access Journals (Sweden)

    Koji Kashihara

    2014-08-01

    Unlike assistive technology for verbal communication, the brain–machine or brain–computer interface (BMI/BCI) has not been established as a nonverbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based nonverbal communication tools. To this end, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing, and then identified the cortical activities involved. The event-related potentials triggered by conditioned neutral faces, originating from the posterior temporal lobe, changed in a statistically significant manner during late face processing (600–700 ms after stimulus) rather than in early face processing activities such as the P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus. This study also developed an efficient method for detecting implicit negative emotional responses to specific faces using EEG signals.

  13. High Performance Computing - Power Application Programming Interface Specification.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H.,; Kelly, Suzanne M.; Pedretti, Kevin; Grant, Ryan; Olivier, Stephen Lecler; Levenhagen, Michael J.; DeBonis, David

    2014-08-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.
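
    The specification itself defines the authoritative interface; the following is only a hypothetical illustration of what a portable, hierarchical measurement/control layer might look like, with all names invented here:

        # Hypothetical illustration of a portable power measurement/control layer.
        # All names are invented for this sketch; consult the Power API
        # specification for the real interface.
        class PowerObject:
            """A node in the system hierarchy (platform -> cabinet -> node -> socket)."""
            def __init__(self, name, watts=0.0, cap=None, children=()):
                self.name, self._watts, self.cap = name, watts, cap
                self.children = list(children)

            def measure(self):
                """Current draw in watts: own draw plus that of all children."""
                return self._watts + sum(c.measure() for c in self.children)

            def set_cap(self, watts):
                """Request a power cap; a real implementation would push this to firmware."""
                self.cap = watts

        node = PowerObject("node0", children=[
            PowerObject("socket0", watts=95.0),
            PowerObject("socket1", watts=88.0),
        ])
        print(f"{node.name}: {node.measure():.0f} W")
        node.set_cap(150.0)
        print(f"cap set to {node.cap:.0f} W")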

  14. Ethical Issues in Brain-Computer Interface Research, Development, and Dissemination

    NARCIS (Netherlands)

    Vlek, Rutger; Steines, David; Szibbo, Dyana; Kübler, Andrea; Schneider, Mary-Jane; Haselager, Pim; Nijboer, Femke

    The steadily growing field of brain-computer interfacing (BCI) may develop useful technologies, with a potential impact not only on individuals, but also on society as a whole. At the same time, the development of BCI presents significant ethical and legal challenges. In a workshop during the 4th

  15. Ethical Issues in Brain-Computer Interface Research, Development, and Dissemination

    NARCIS (Netherlands)

    Vlek, R.J.; Steines, D.; Szibbo, D.; Kübler, A.; Schneider, M.J.; Haselager, W.F.G.; Nijboer, F.

    2012-01-01

    The steadily growing field of brain–computer interfacing (BCI) may develop useful technologies, with a potential impact not only on individuals, but also on society as a whole. At the same time, the development of BCI presents significant ethical and legal challenges. In a workshop during the 4th

  16. STICK: Spike Time Interval Computational Kernel, a Framework for General Purpose Computation Using Neurons, Precise Timing, Delays, and Synchrony.

    Science.gov (United States)

    Lagorce, Xavier; Benosman, Ryad

    2015-11-01

    There has been significant research over the past two decades in developing new platforms for spiking neural computation. Current neural computers are primarily developed to mimic biology. They use neural networks which can be trained to perform specific tasks, mainly to solve pattern recognition problems. These machines can do more than simulate biology; they allow us to rethink our current paradigm of computation. The ultimate goal is to develop brain-inspired general-purpose computation architectures that can breach the current bottleneck introduced by the von Neumann architecture. This work proposes a new framework for such a machine. We show that the use of neuron-like units with precise timing representation, synaptic diversity, and temporal delays allows us to set up a complete, scalable, compact computation framework. The framework provides both linear and nonlinear operations, allowing us to represent and solve any function. We demonstrate its usability on real use cases, from simple differential equations to sets of nonlinear differential equations leading to chaotic attractors.
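
    The core idea of representing a value as the precise time interval between two spikes can be shown in a few lines (an illustrative reduction, not the STICK framework itself):

        # Illustrative reduction of the interval-coding idea (not the STICK
        # framework itself): a value x in [0, 1] is represented by the interval
        # between two spikes, and a fixed synaptic delay shifts that interval,
        # i.e. adds a constant to the represented value.
        T_MIN, T_COD = 10.0, 100.0   # ms; interval = T_MIN + x * T_COD

        def encode(x):
            """Return the two spike times representing x."""
            return (0.0, T_MIN + x * T_COD)

        def decode(t0, t1):
            return (t1 - t0 - T_MIN) / T_COD

        t0, t1 = encode(0.42)
        delay = 0.1 * T_COD          # a synaptic delay applied to the second spike
        print(decode(t0, t1))            # 0.42
        print(decode(t0, t1 + delay))    # 0.52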

  17. Development of DUST: A computer code that calculates release rates from a LLW disposal unit

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1992-01-01

    Performance assessment of a Low-Level Waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the disposal unit source term). The major physical processes that influence the source term are water flow, container degradation, waste form leaching, and radionuclide transport. A computer code, DUST (Disposal Unit Source Term), has been developed which incorporates these processes in a unified manner. The DUST code improves upon existing codes in that it can model multiple container failure times, multiple waste form release properties, and radionuclide-specific transport properties. Verification studies performed on the code are discussed.
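
    A strongly simplified sketch of how such processes combine into a source term (illustrative only; not the DUST code): containers fail at a given time, the waste form then leaches a fixed fraction per year, and radioactive decay depletes the remaining inventory.

        # Strongly simplified disposal-unit source-term sketch (illustrative
        # only; not the DUST code): container failure + fractional leaching
        # + radioactive decay. All parameter values are hypothetical.
        import math

        def release_rate(t_years, inventory0, half_life, fail_time, leach_frac):
            """Activity released per year [Bq/yr] at time t."""
            if t_years < fail_time:
                return 0.0                         # container still intact
            lam = math.log(2) / half_life
            leached_for = t_years - fail_time
            remaining = (inventory0 * math.exp(-lam * t_years)
                         * math.exp(-leach_frac * leached_for))
            return leach_frac * remaining

        for t in (10, 50, 100, 300):
            r = release_rate(t, inventory0=1e12, half_life=29.0,  # Sr-90-like
                             fail_time=40.0, leach_frac=0.01)
            print(f"t={t:3d} yr  release={r:.3e} Bq/yr")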

  18. Technical specifications requirements: Automated reasoning applications

    International Nuclear Information System (INIS)

    Lidsky, L.M.; Dobrzeniecki, A.B.

    1990-03-01

    Several software systems were developed and tested to determine what advantages could be gained from explicitly translating complicated regulatory requirements into computerized relationships. The Technical Specifications for US nuclear power plants were chosen as the test-bed application domain, and two analysis systems were developed to monitor plant compliance with operational limits and to track and schedule the equipment test and maintenance activities mandated by Technical Specifications. Choosing PROLOG as the computer language to represent these regulatory requirements resulted in a natural match between the semantic structure of the written specifications and the corresponding coded rules. Additional research results affirmed the utility of declarative programming styles, explicit management of problem complexity, and attention to the robustness and flexibility of the overall software systems. 5 refs., 2 figs
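
    The study encoded its rules in PROLOG; to keep the examples in this document in one language, the same declarative flavour is only suggested below in Python, with each (invented) limiting condition for operation held as data rather than control flow:

        # The study used PROLOG; this Python sketch only suggests the
        # declarative flavour, with each (invented) limiting condition for
        # operation (LCO) held as data rather than control flow.
        LCOS = [
            {"id": "LCO-3.5.1", "requires": {"hpci": "operable"},
             "modes": {1, 2, 3}, "allowed_outage_h": 72},
            {"id": "LCO-3.8.1",
             "requires": {"diesel_a": "operable", "diesel_b": "operable"},
             "modes": {1, 2, 3, 4}, "allowed_outage_h": 168},
        ]

        def violations(plant_state, mode):
            """Yield the LCOs whose equipment requirements are not met in this mode."""
            for lco in LCOS:
                if mode in lco["modes"]:
                    for equip, status in lco["requires"].items():
                        if plant_state.get(equip) != status:
                            yield lco["id"], equip, lco["allowed_outage_h"]

        state = {"hpci": "operable", "diesel_a": "inoperable", "diesel_b": "operable"}
        for lco_id, equip, aot in violations(state, mode=1):
            print(f"{lco_id}: {equip} inoperable, action required within {aot} h")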

  19. Physical computation and cognitive science

    CERN Document Server

    Fresco, Nir

    2014-01-01

    This book presents a study of digital computation in contemporary cognitive science. Digital computation is a highly ambiguous concept, as there is no common core definition for it in cognitive science. Since this concept plays a central role in cognitive theory, an adequate cognitive explanation requires an explicit account of digital computation. More specifically, it requires an account of how digital computation is implemented in physical systems. The main challenge is to deliver an account encompassing the multiple types of existing models of computation without ending up in pancomputationalism, that is, the view that every physical system is a digital computing system. This book shows that only two accounts, among the ones examined by the author, are adequate for explaining physical computation. One of them is the instructional information processing account, which is developed here for the first time.   “This book provides a thorough and timely analysis of differing accounts of computation while adv...

  20. Prediction of infarction development after endovascular stroke therapy with dual-energy computed tomography

    International Nuclear Information System (INIS)

    Djurdjevic, Tanja; Gizewski, Elke Ruth; Grams, Astrid Ellen; Rehwald, Rafael; Glodny, Bernhard; Knoflach, Michael; Matosevic, Benjamin; Kiechl, Stefan

    2017-01-01

    After intraarterial recanalisation (IAR), haemorrhage and blood-brain barrier (BBB) disruption can be distinguished using dual-energy computed tomography (DECT). The aim of the present study was to investigate whether future infarction development can be predicted from DECT. DECT scans of 20 patients showing 45 BBB-disrupted areas after IAR were assessed and compared with follow-up examinations. Receiver operating characteristic (ROC) analyses using densities from the iodine map (IM) and virtual non-contrast (VNC) series were performed. Future infarction areas are denser than future non-infarction areas on the IM series (23.44 ± 24.86 vs. 5.77 ± 2.77; p < 0.0001) and more hypodense on the VNC series (29.71 ± 3.33 vs. 35.33 ± 3.50; p < 0.0001). ROC analyses for the IM series showed an area under the curve (AUC) of 0.99 (cut-off: <9.97 HU; p < 0.05; sensitivity 91.18 %; specificity 100.00 %; accuracy 0.93) for the prediction of future infarctions. The AUC for the prediction of haemorrhagic infarctions was 0.78 (cut-off >17.13 HU; p < 0.05; sensitivity 90.00 %; specificity 62.86 %; accuracy 0.69). The VNC series allowed prediction of infarction volume. Future infarction development after IAR can be reliably predicted with the IM series; the prediction of haemorrhages and of infarction size is less reliable. (orig.)

  1. Prediction of infarction development after endovascular stroke therapy with dual-energy computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Djurdjevic, Tanja; Gizewski, Elke Ruth; Grams, Astrid Ellen [Medical University of Innsbruck, Department of Neuroradiology, Innsbruck (Austria); Rehwald, Rafael; Glodny, Bernhard [Medical University of Innsbruck, Department of Radiology, Innsbruck (Austria); Knoflach, Michael; Matosevic, Benjamin; Kiechl, Stefan [Medical University of Innsbruck, Department of Neurology, Innsbruck (Austria)

    2017-03-15

    After intraarterial recanalisation (IAR), haemorrhage and blood-brain barrier (BBB) disruption can be distinguished using dual-energy computed tomography (DECT). The aim of the present study was to investigate whether future infarction development can be predicted from DECT. DECT scans of 20 patients showing 45 BBB-disrupted areas after IAR were assessed and compared with follow-up examinations. Receiver operating characteristic (ROC) analyses using densities from the iodine map (IM) and virtual non-contrast (VNC) series were performed. Future infarction areas are denser than future non-infarction areas on the IM series (23.44 ± 24.86 vs. 5.77 ± 2.77; p < 0.0001) and more hypodense on the VNC series (29.71 ± 3.33 vs. 35.33 ± 3.50; p < 0.0001). ROC analyses for the IM series showed an area under the curve (AUC) of 0.99 (cut-off: <9.97 HU; p < 0.05; sensitivity 91.18 %; specificity 100.00 %; accuracy 0.93) for the prediction of future infarctions. The AUC for the prediction of haemorrhagic infarctions was 0.78 (cut-off >17.13 HU; p < 0.05; sensitivity 90.00 %; specificity 62.86 %; accuracy 0.69). The VNC series allowed prediction of infarction volume. Future infarction development after IAR can be reliably predicted with the IM series; the prediction of haemorrhages and of infarction size is less reliable. (orig.)

  2. Recent development of computational resources for new antibiotics discovery

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Blin, Kai; Lee, Sang Yup

    2017-01-01

    Understanding a complex working mechanism of biosynthetic gene clusters (BGCs) encoding secondary metabolites is a key to discovery of new antibiotics. Computational resources continue to be developed in order to better process increasing volumes of genome and chemistry data, and thereby better...

  3. Young Children's Computer Skills Development from Kindergarten to Third Grade

    Science.gov (United States)

    Sackes, Mesut; Trundle, Kathy Cabe; Bell, Randy L.

    2011-01-01

    This investigation explores young children's computer skills development from kindergarten to third grade using the Early Childhood Longitudinal Study-Kindergarten (ECLS-K) dataset. The sample size of the study was 8642 children. Latent growth curve modeling analysis was used as an analytical tool to examine the development of children's computer…

  4. PHYSICAL EDUCATION AND INDIVIDUAL CHARACTERISTICS OF THE AGE SPECIFIC DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Е. М. Revenko

    2017-01-01

    The aim of this paper is to provide scientific substantiation for the importance of individual characteristics of age-specific youth development, with a view to more rational modelling of students' physical education. Methodology and research methods. Experimental data were collected by evaluating the motor abilities and general intelligence of students. Motor abilities were studied by measuring strength (dead-lift dynamometry), strength endurance (pull-ups), speed-strength abilities (standing jump), speed (running 30, 60 or 100 m, depending on age), and aerobic endurance (running 1000 or 3000 m, depending on age). The dynamics of integral physical preparedness (DIPP) of each student was calculated as the arithmetic mean of the growth rates of the individual motor abilities. General intelligence (GI) of pupils in the 8th, 10th and 11th grades and of 1st- to 3rd-year students was assessed with the R. Amthauer test in the adaptation of L. A. Yazykova, while pupils of the 6th grade were assessed with the Group Intelligence Test (GIT). Results. Discrepancies in the development of the mental and motor spheres of the maturing personality, interpreted as individual characteristics of age-specific development, were revealed experimentally. Individual psychological differences leading to different susceptibility to the development of motor and intellectual abilities, appearing in adolescence and early adolescence, are analysed. The leading role of activity in the formation of individual characteristics of age-specific development is substantiated. It is concluded that students who differ in individual characteristics of age-specific development should be given requirements and motor tasks differentiated in complexity during physical training. Scientific novelty. For the first time

  5. Subject-specific computational modeling of DBS in the PPTg area

    Directory of Open Access Journals (Sweden)

    Laura M. Zitella

    2015-07-01

    Deep brain stimulation (DBS) in the pedunculopontine tegmental nucleus (PPTg) has been proposed to alleviate medically intractable gait difficulties associated with Parkinson's disease. Clinical trials have shown somewhat variable outcomes, stemming in part from surgical targeting variability, modulation of fiber pathways implicated in side effects, and a general lack of mechanistic understanding of DBS in this brain region. Subject-specific computational models of DBS are a promising tool to investigate the underlying therapy and side effects. In this study, a parkinsonian rhesus macaque was implanted unilaterally with an 8-contact DBS lead in the PPTg region. Fiber tracts adjacent to PPTg, including the oculomotor nerve, central tegmental tract, and superior cerebellar peduncle, were reconstructed from a combination of pre-implant 7T MRI, post-implant CT, and post-mortem histology. These structures were populated with axon models and coupled with a finite element model simulating the voltage distribution in the surrounding neural tissue during stimulation. This study introduces two empirical approaches to evaluate model parameters. First, incremental monopolar cathodic stimulation (20 Hz, 90 µs pulse width) was evaluated for each electrode, during which a right eyelid flutter was observed at the proximal four contacts (−1.0 to −1.4 mA). These current amplitudes followed closely the model-predicted activation of the oculomotor nerve when assuming an anisotropic conduction medium. Second, PET imaging was collected OFF-DBS and twice during DBS (two different contacts), which supported the model-predicted activation of the central tegmental tract and superior cerebellar peduncle. Together, subject-specific models provide a framework to more precisely predict pathways modulated by DBS.

  6. Development of industrial x-ray computed tomography and its application to refractories

    International Nuclear Information System (INIS)

    Aiba, Yoshiro; Oki, Kazuo; Nakamura, Shigeo; Fujii, Masashi.

    1985-01-01

    An industrial X-ray computed tomography system was developed under the influence of the rapid spread of X-ray CT scanners in the medical field and improvements in the equipment. Current nondestructive testing machines for refractories use the ultrasonic inspection method or the X-ray fluoroscopic method, but these machines cannot produce a tomogram or carry out quantitative evaluation. Using the industrial X-ray computed tomography system, submerged nozzles for continuous casting of steel were analyzed with interesting results. The features of industrial X-ray computed tomography applied to refractory nozzles are as follows: (1) it promptly detects interior defects; (2) it can measure dimensions and shapes; (3) it can numerically express the distribution of density. Accordingly, it is expected that industrial X-ray computed tomography will be widely used in the development and quality control of refractories and advanced ceramic materials. (author)

  7. The Development of Educational and/or Training Computer Games for Students with Disabilities

    Science.gov (United States)

    Kwon, Jungmin

    2012-01-01

    Computer and video games have much in common with the strategies used in special education. Free resources for game development are becoming more widely available, so lay computer users, such as teachers and other practitioners, now have the capacity to develop games using a low budget and a little self-teaching. This article provides a guideline…

  8. Inference of cancer-specific gene regulatory networks using soft computing rules.

    Science.gov (United States)

    Wang, Xiaosheng; Gotoh, Osamu

    2010-03-24

    Perturbations of gene regulatory networks are essentially responsible for oncogenesis. Therefore, inferring gene regulatory networks is a key step toward overcoming cancer. In this work, we propose a method for inferring directed gene regulatory networks based on soft computing rules, which can identify important cause-effect regulatory relations of gene expression. First, we identify important genes associated with a specific cancer (colon cancer) using a supervised learning approach. Next, we reconstruct the gene regulatory networks by inferring the regulatory relations among the identified genes, and their regulated relations by other genes within the genome. We obtain two meaningful findings. One is that upregulated genes are regulated by more genes than downregulated ones, while downregulated genes regulate more genes than upregulated ones. The other is that tumor suppressors suppress tumor activators and activate other tumor suppressors strongly, while tumor activators activate other tumor activators and suppress tumor suppressors weakly, indicating the robustness of biological systems. These findings provide valuable insights into the pathogenesis of cancer.
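
    The paper's soft computing rules are not reproduced in this summary; purely to illustrate the general inference pattern, one can score a candidate directed edge by a lagged soft measure of association on toy data (invented here):

        # Purely illustrative inference pattern (not the paper's soft-computing
        # rules): score a candidate directed edge A -> B by how consistently
        # A's expression change precedes B's across samples. Toy data only.
        import numpy as np

        rng = np.random.default_rng(1)
        a = rng.normal(size=50)                                # toy regulator expression
        b = 0.8 * np.roll(a, 1) + 0.2 * rng.normal(size=50)    # toy lagged target

        def edge_score(reg, tgt, lag=1):
            """Lagged correlation as a soft score in [-1, 1]; the sign suggests
            activation (+) versus suppression (-)."""
            return float(np.corrcoef(reg[:-lag], tgt[lag:])[0, 1])

        s = edge_score(a, b)
        print(f"score={s:.2f} ->", "activates" if s > 0.5 else
              "suppresses" if s < -0.5 else "no call")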

  9. Complex Osteotomies of Tibial Plateau Malunions Using Computer-Assisted Planning and Patient-Specific Surgical Guides.

    Science.gov (United States)

    Fürnstahl, Philipp; Vlachopoulos, Lazaros; Schweizer, Andreas; Fucentese, Sandro F; Koch, Peter P

    2015-08-01

    The accurate reduction of tibial plateau malunions can be challenging without guidance. In this work, we report on a novel technique that combines 3-dimensional computer-assisted planning with patient-specific surgical guides for improving the reliability and accuracy of complex intraarticular corrective osteotomies. Preoperative planning based on 3-dimensional bone models was performed to simulate fragment mobilization and reduction in 3 cases. Surgical implementation of the preoperative plan using patient-specific cutting and reduction guides was evaluated; benefits and limitations of the approach were identified and discussed. The preliminary results are encouraging and show that complex intraarticular corrective osteotomies can be accurately performed with this technique. For selected patients with complex malunions around the tibial plateau, this method might be an attractive option, with the potential to facilitate achieving the most accurate correction possible.

  10. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-01-01

    Visualization of simulation results in a post-processing step is thereby transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation.

  11. Sex-specific effects of low-dose gestational estradiol-17β exposure on bone development in porcine offspring

    International Nuclear Information System (INIS)

    Flöter, Veronika L.; Galateanu, Gabriela; Fürst, Rainer W.; Seidlová-Wuttke, Dana; Wuttke, Wolfgang; Möstl, Erich; Hildebrandt, Thomas B.

    2016-01-01

    Highlights: • Sex-specific effects and non-monotonic dose responses were demonstrated after low-dose in utero E2 treatment in offspring. • Alterations in bone parameters were found in prepubertal male but not female offspring. • In postpubertal female offspring, cortical and total cross-sectional area were higher at the femoral midpoint. • In utero E2 treatment did neither significantly affect hormone concentrations nor puberty onset in offspring. • The results substantiate the high sensitivity of developing organisms to exogenous estrogens. - Abstract: Estrogens are important for the bone development and health. Exposure to endocrine disrupting chemicals during the early development has been shown to affect the bone phenotype later in life. Several studies have been performed in rodents, while in larger animals that are important to bridge the gap to humans there is a paucity of data. To this end, the pig as large animal model was used in the present study to assess the influence of gestational estradiol-17β (E2) exposure on the bone development of the prepubertal and adult offspring. Two low doses (0.05 and 10 μg E2/kg body weight) referring to the ‘acceptable daily intake’ (ADI) and the ‘no observed effect level’ (NOEL) as stated for humans, and a high-dose (1000 μg E2/kg body weight), respectively, were fed to the sows every day from insemination until delivery. In the male prepubertal offspring, the ADI dose group had a lower strength strain index (p = 0.002) at the proximal tibia compared to controls, which was determined by peripheral quantitative computed tomography. Prepubertal females were not significantly affected. However, there was a higher cortical cross-sectional area (CSA) (p = 0.03) and total CSA (p = 0.02) at the femur midpoint in the adult female offspring of the NOEL dose group as measured by computed tomography. These effects were independent from plasma hormone concentrations (leptin, IGF1, estrogens), which remained

  12. Computational biomechanics for medicine from algorithms to models and applications

    CERN Document Server

    Joldes, Grand; Nielsen, Poul; Doyle, Barry; Miller, Karol

    2017-01-01

    This volume comprises the latest developments in both fundamental science and patient-specific applications, discussing topics such as: cellular mechanics; injury biomechanics; biomechanics of heart and vascular system; medical image analysis; and both patient-specific fluid dynamics and solid mechanics simulations. With contributions from researchers world-wide, the Computational Biomechanics for Medicine series of titles provides an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements.

  13. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design

    International Nuclear Information System (INIS)

    Menges, Achim

    2012-01-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies. (paper)
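
    The exploratory evolutionary computation mentioned above can be indicated with a toy loop (invented fitness and parameters; not the author's system): candidate envelope morphologies are parameter vectors driven by selection and mutation toward a design objective.

        # Toy exploratory-evolution sketch (invented fitness and parameters;
        # not the author's system): envelope candidates as parameter vectors
        # under selection and mutation.
        import random

        random.seed(3)

        def fitness(p):
            depth, glazing, overhang = p
            daylight = glazing * (1.0 - 0.5 * overhang)       # toy daylight proxy
            heat = glazing * (1.0 - overhang) + 0.3 * depth   # toy solar-gain proxy
            return daylight - 0.8 * heat                      # objective to maximise

        pop = [[random.random() for _ in range(3)] for _ in range(30)]
        for gen in range(40):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:10]                                # truncation selection
            pop = [[min(1.0, max(0.0, g + random.gauss(0, 0.05)))
                    for g in random.choice(parents)]          # mutated offspring
                   for _ in range(30)]

        best = max(pop, key=fitness)
        print([round(g, 2) for g in best], round(fitness(best), 3))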

  14. Experiment-specific analyses in support of code development

    International Nuclear Information System (INIS)

    Ott, L.J.

    1990-01-01

    Experiment-specific models have been developed since 1986 by Oak Ridge National Laboratory Boiling Water Reactor (BWR) severe accident analysis programs for the purpose of BWR experimental planning and optimum interpretation of experimental results. These experiment-specific models have been applied to large integral tests (ergo, experiments) which start from an initial undamaged core state. The tests performed to date in BWR geometry have had significantly different-from-prototypic boundary and experimental conditions because of either normal facility limitations or specific experimental constraints. These experiments (ACRR: DF-4, NRU: FLHT-6, and CORA) were designed to obtain specific phenomenological information such as the degradation and interaction of prototypic components and the effects on melt progression of control-blade materials and channel boxes. Applications of ORNL models specific to the ACRR DF-4 and KfK CORA-16 experiments are discussed and significant findings from the experimental analyses are presented. 32 refs., 16 figs

  15. A study on decision-making framework for developing risk-informed technical specifications

    International Nuclear Information System (INIS)

    Kim, Beom Seock

    2002-02-01

    The utility and nuclear research institutes in Korea have conducted research on improving inefficient requirements in technical specifications using the results of probabilistic risk assessments and associated risk information. However, guidance for reviewing the improved technical specifications has not been developed. The objective of this study is therefore to develop a decision-making framework for investigating and reviewing the documents associated with changes to technical specifications. This work is intended to help the regulatory agency review improved technical specifications and decide whether a proposed remedy is acceptable. The contents of this study include: 1. surveys of technical specification regulations in foreign countries as well as in Korea; 2. surveys of state-of-the-art methodologies for risk-informed technical specifications and their use in Korea; 3. development of a decision-making framework from both the licensee and the regulatory agency positions; 4. development and application of a decision-making framework using influence diagrams. The decision-making framework for RITS using influence diagrams is developed and applied to an example problem in this study. This work should contribute to developing risk-informed regulation guidance for improving the quality of the current technical specifications.

  16. CONTRIBUTIONS FOR DEVELOPING OF A COMPUTER AIDED LEARNING ENVIRONMENT OF DESCRIPTIVE GEOMETRY

    Directory of Open Access Journals (Sweden)

    Antonescu Ion

    2009-07-01

    The paper presents the authors' contributions to the development of a computer code for teaching descriptive geometry using computer-aided learning techniques. The program was implemented using the programming interface and 3D modeling capabilities of the AutoCAD system.

  17. Towards Domain-specific Flow-based Languages

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert; Sarjoughian, Hessam S.

    2018-01-01

    Due to the significant growth of the demand for data-intensive computing, in addition to the emergence of new parallel and distributed computing technologies, scientists and domain experts are leveraging languages specialized for their problem domain, i.e., domain-specific languages, to help them describe their problems and solutions, instead of using general-purpose programming languages. The goal of these languages is to improve the productivity and efficiency of the development and simulation of concurrent scientific models and systems. Moreover, they help to expose parallelism and to specify the concurrency within a component or across different independent components. In this paper, we introduce the concept of domain-specific flow-based languages, which allows domain experts to use flow-based languages adapted to a particular problem domain. Flow-based programming is used to support concurrency, while…
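
    The flow-based style itself is easy to indicate (a generic sketch, not the authors' language): components are independent processes that communicate only through ports connected by queues.

        # Generic flow-based-programming sketch (not the authors' language):
        # components run independently and communicate only via queues ("ports").
        import threading, queue

        def producer(out_port):
            for i in range(5):
                out_port.put(i)
            out_port.put(None)                 # end-of-stream marker

        def doubler(in_port, out_port):
            while (item := in_port.get()) is not None:
                out_port.put(item * 2)
            out_port.put(None)

        def printer(in_port):
            while (item := in_port.get()) is not None:
                print(item)

        a, b = queue.Queue(), queue.Queue()    # connections in the flow graph
        for target, args in ((producer, (a,)), (doubler, (a, b)), (printer, (b,))):
            threading.Thread(target=target, args=args).start()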

  18. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    Science.gov (United States)

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

    Creation of computer monitoring systems (CMS) for operating rooms is one of the most important areas of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and have used it effectively for more than 2 years. The system permits comprehensive monitoring in cardiosurgical operations by processing in real time the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patient safety during surgery. The possibility of assessing brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesia chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.

  19. The development of a computer technique for the investigation of reactor lattice parameters

    International Nuclear Information System (INIS)

    Joubert, W.R.

    1982-01-01

    An integrated computer technique was developed whereby all the computer programmes needed to calculate reactor lattice parameters from basic neutron data could be combined in one system. The theory behind the computer programmes is explained in detail. Results are given and compared with experimental values as well as with those calculated using a standard system.

  20. Developing Activities for Teaching Cloud Computing and Virtualization

    Directory of Open Access Journals (Sweden)

    E. Erturk

    2014-10-01

    Cloud computing and virtualization are new but indispensable components of computer engineering and information systems curricula for universities and higher education institutions. Learning about these topics is important for students preparing to work in the IT industry. In many companies, information technology operates under tight financial constraints. Virtualization (for example, storage, desktop, and server virtualization) reduces overall IT costs through the consolidation of systems. It also results in reduced loads and energy savings in terms of the power and cooling infrastructure. Therefore it is important to investigate the practical aspects of this topic, both for industry practice and for teaching purposes. This paper demonstrates some activities undertaken recently by students at the Eastern Institute of Technology New Zealand and concludes with general recommendations for IT educators, software developers, and other IT professionals.

  1. Computer system organization the B5700/B6700 series

    CERN Document Server

    Organick, Elliott I

    1973-01-01

    Computer System Organization: The B5700/B6700 Series focuses on the organization of the B5700/B6700 Series developed by Burroughs Corp. More specifically, it examines how computer systems can (or should) be organized to support, and hence make more efficient, the running of computer programs that evolve with characteristically similar information structures.Comprised of nine chapters, this book begins with a background on the development of the B5700/B6700 operating systems, paying particular attention to their hardware/software architecture. The discussion then turns to the block-structured p

  2. Approximator: Predicting Interruptibility in Software Development with Commodity Computers

    DEFF Research Database (Denmark)

    Tell, Paolo; Jalaliniya, Shahram; Andersen, Kristian S. M.

    2015-01-01

    Assessing the presence and availability of a remote colleague is key to coordination in global software development, but is not easily done using existing computer-mediated channels. Previous research has shown that automated estimation of interruptibility is feasible and can achieve a precision… These early but promising results represent a starting point for designing tools with support for interruptibility, capable of improving distributed awareness and cooperation, to be used in global software development.

  3. Development of a model of machine hand eye coordination and program specifications for a topological machine vision system

    Science.gov (United States)

    1972-01-01

    A unified approach to computer vision and manipulation is developed, called choreographic vision. In the model, objects to be viewed by a projected robot in the Viking missions to Mars are seen as objects to be manipulated within choreographic contexts controlled by a multimoded remote supervisory control system on Earth. A new theory of context relations is introduced as a basis for choreographic programming languages. A topological vision model is developed for recognizing objects by shape and contour. This model is integrated with a projected vision system consisting of a multiaperture image dissector TV camera and a ranging laser system. System program specifications integrate eye-hand coordination and topological vision functions, and an aerospace multiprocessor implementation is described.

  4. Computer Game-based Learning: Applied Game Development Made Simpler

    NARCIS (Netherlands)

    Nyamsuren, Enkhbold

    2018-01-01

    The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish

  5. Recent developments and applications in mathematics and computer science

    International Nuclear Information System (INIS)

    Churchhouse, R.F.; Tahir Shah, K.; Zanella, P.

    1991-01-01

    The book contains 8 invited lectures and 4 short seminars presented at the College on Recent Developments and Applications in Mathematics and Computer Science held in Trieste from 7 May to 1 June 1990. A separate abstract was prepared for each paper. Refs, figs and tabs

  6. Crossing the chasm: how to develop weather and climate models for next generation computers?

    Science.gov (United States)

    Lawrence, Bryan N.; Rezny, Michael; Budich, Reinhard; Bauer, Peter; Behrens, Jörg; Carter, Mick; Deconinck, Willem; Ford, Rupert; Maynard, Christopher; Mullerworth, Steven; Osuna, Carlos; Porter, Andrew; Serradell, Kim; Valcke, Sophie; Wedi, Nils; Wilson, Simon

    2018-05-01

    Weather and climate models are complex pieces of software which include many individual components, each of which is evolving under pressure to exploit advances in computing to enhance some combination of a range of possible improvements (higher spatio-temporal resolution, increased fidelity in terms of resolved processes, more quantification of uncertainty, etc.). However, after many years of a relatively stable computing environment with little choice in processing architecture or programming paradigm (basically X86 processors using MPI for parallelism), the existing menu of processor choices includes significant diversity, and more is on the horizon. This computational diversity, coupled with ever increasing software complexity, leads to the very real possibility that weather and climate modelling will arrive at a chasm which will separate scientific aspiration from our ability to develop and/or rapidly adapt codes to the available hardware. In this paper we review the hardware and software trends which are leading us towards this chasm, before describing current progress in addressing some of the tools which we may be able to use to bridge the chasm. This brief introduction to current tools and plans is followed by a discussion outlining the scientific requirements for quality model codes which have satisfactory performance and portability, while simultaneously supporting productive scientific evolution. We assert that the existing method of incremental model improvements employing small steps which adjust to the changing hardware environment is likely to be inadequate for crossing the chasm between aspiration and hardware at a satisfactory pace, in part because institutions cannot have all the relevant expertise in house. Instead, we outline a methodology based on large community efforts in engineering and standardisation, which will depend on identifying a taxonomy of key activities - perhaps based on existing efforts to develop domain-specific languages

  7. Towards playful learning and computational thinking — Developing the educational robot BRICKO

    DEFF Research Database (Denmark)

    Pedersen, B. K. M. K.; Andersen, K. E.; Jørgensen, A.

    2018-01-01

    Educational Robotics has proven a feasible way of supporting and exemplifying Computational Thinking. With this paper, we describe the user-centered iterative and incremental development of a new educational robotic system, BRICKO, to support tangible, social and playful interaction while educating...... children in 1st–3rd grade in Computational Thinking. We develop the system through seven main iterations including a total of 108 participant pupils and their teachers. The methodology is a mixture of observation and interviews using Wizard of OZ testing with the early pilot prototypes as well as usability...... categories of command-bricks. We discuss the methodologies used for assuring a playful and social educational robotic system and conclude that we achieved a useful prototype for supporting education in Computational Thinking....

  8. Methods for the development of large computer codes under LTSS

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1977-06-01

    TRAC is a large computer code being developed by Group Q-6 for the analysis of the transient thermal hydraulic behavior of light-water nuclear reactors. A system designed to assist the development of TRAC is described. The system consists of a central HYDRA dataset, R6LIB, containing files used in the development of TRAC, and a file maintenance program, HORSE, which facilitates the use of this dataset

  9. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  10. The Computational Development of Reinforcement Learning during Adolescence.

    Directory of Open Access Journals (Sweden)

    Stefano Palminteri

    2016-06-01

    Full Text Available Adolescence is a period of life characterised by changes in learning and decision-making. Learning and decision-making do not rely on a unitary system, but instead require the coordination of different cognitive processes that can be mathematically formalised as dissociable computational modules. Here, we aimed to trace the developmental time-course of the computational modules responsible for learning from reward or punishment, and learning from counterfactual feedback. Adolescents and adults carried out a novel reinforcement learning paradigm in which participants learned the association between cues and probabilistic outcomes, where the outcomes differed in valence (reward versus punishment) and feedback was either partial or complete (either the outcome of the chosen option only, or the outcomes of both the chosen and unchosen options, were displayed). Computational strategies changed during development: whereas adolescents' behaviour was better explained by a basic reinforcement learning algorithm, adults' behaviour integrated increasingly complex computational features, namely a counterfactual learning module (enabling enhanced performance in the presence of complete feedback) and a value contextualisation module (enabling symmetrical reward and punishment learning). Unlike adults, adolescent performance did not benefit from counterfactual (complete) feedback. In addition, while adults learned symmetrically from both reward and punishment, adolescents learned from reward but were less likely to learn from punishment. This tendency to rely on rewards and not to consider alternative consequences of actions might contribute to our understanding of decision-making in adolescence.
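
    A minimal sketch of the two model classes contrasted above, assuming illustrative parameter values rather than the authors' fitted ones: a basic delta-rule learner updates only the chosen option, while the counterfactual module that characterised adult behaviour also updates the unchosen option when complete feedback is shown.

    import numpy as np

    def simulate_learner(outcomes, counterfactual=False, alpha=0.3, beta=5.0, seed=0):
        # outcomes[t, a]: outcome (e.g. +1 reward / -1 punishment) of option a
        # at trial t. With counterfactual=True, complete feedback also drives
        # an update of the unchosen option.
        rng = np.random.default_rng(seed)
        q = np.zeros(2)                                       # option values
        choices = []
        for t in range(len(outcomes)):
            p1 = 1.0 / (1.0 + np.exp(-beta * (q[1] - q[0])))  # softmax, 2 options
            c = int(rng.random() < p1)
            choices.append(c)
            q[c] += alpha * (outcomes[t, c] - q[c])           # factual update
            if counterfactual:
                u = 1 - c
                q[u] += alpha * (outcomes[t, u] - q[u])       # counterfactual update
        return np.array(choices), q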

  11. Special issue of Higher-Order and Symbolic Computation

    DEFF Research Database (Denmark)

    Danvy, Olivier

    , they should have a large range of applicability for a large class of specifications or programs. Only general ideas could become the basis for an automatic system for program development. Bob’s APTS system is indeed the incarnation of most of the techniques he proposed (cf. Leonard and Heitmeyer...... specification, expressed in SCR notation, into C. Two translation strategies are discussed in the paper. Both were implemented using Bob Paige’s APTS program-transformation system. “Computational Divided Differencing and Divided-Difference Arithmetics” uses an approach conceptually similar to the Computational...

  12. Computational fluid dynamics (CFD) insights into agitation stress methods in biopharmaceutical development.

    Science.gov (United States)

    Bai, Ge; Bee, Jared S; Biddlecombe, James G; Chen, Quanmin; Leach, W Thomas

    2012-02-28

    Agitation of small amounts of liquid is performed routinely in biopharmaceutical process, formulation, and packaging development. Protein degradation commonly results from agitation, but the specific stress responsible or the degradation mechanism is usually not well understood. Characterization of the agitation stress methods is critical to identifying protein degradation mechanisms or specific sensitivities. In this study, computational fluid dynamics (CFD) was used to model agitation of 1 mL of fluid by four types of common laboratory agitation instruments, including a rotator, orbital shaker, magnetic stirrer and vortex mixer. Fluid stresses in the bulk liquid and near interfaces were identified, quantified and compared. The vortex mixer provides the most intense stresses overall, while the stir bar system presented locally intense shear proximal to the hydrophobic stir bar surface. The rotator provides gentler fluid stresses, but the air-water interfacial area and surface stresses are relatively high given its low rotational frequency. The orbital shaker provides intermediate-level stresses but with the advantage of a large stable platform for consistent vial-to-vial homogeneity. Selection of experimental agitation methods with targeted types and intensities of stresses can facilitate better understanding of protein degradation mechanisms and predictability for "real world" applications. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences

    Science.gov (United States)

    Slotnick, Jeffrey; Khodadoust, Abdollah; Alonso, Juan; Darmofal, David; Gropp, William; Lurie, Elizabeth; Mavriplis, Dimitri

    2014-01-01

    This report documents the results of a study to address the long range, strategic planning required by NASA's Revolutionary Computational Aerosciences (RCA) program in the area of computational fluid dynamics (CFD), including future software and hardware requirements for High Performance Computing (HPC). Specifically, the "Vision 2030" CFD study is to provide a knowledge-based forecast of the future computational capabilities required for turbulent, transitional, and reacting flow simulations across a broad Mach number regime, and to lay the foundation for the development of a future framework and/or environment where physics-based, accurate predictions of complex turbulent flows, including flow separation, can be accomplished routinely and efficiently in cooperation with other physics-based simulations to enable multi-physics analysis and design. Specific technical requirements from the aerospace industrial and scientific communities were obtained to determine critical capability gaps, anticipated technical challenges, and impediments to achieving the target CFD capability in 2030. A preliminary development plan and roadmap were created to help focus investments in technology development to help achieve the CFD vision in 2030.

  14. CERN School of Computing

    CERN Multimedia

    2007-01-01

    The 2007 CERN School of Computing, organised by CERN in collaboration with the University of Split (FESB) will be held from 20 to 31 August 2007 in Dubrovnik, Croatia. It is aimed at postgraduate students and research workers with a few years' experience in scientific physics, computing or related fields. Special themes this year are: GRID Technologies: The Grid track delivers unique theoretical and hands-on education on some of the most advanced GRID topics; Software Technologies: The Software track addresses some of the most relevant modern techniques and tools for large scale distributed software development and handling as well as for computer security; Physics Computing: The Physics Computing track focuses on informatics topics specific to the HEP community. After setting-the-scene lectures, it addresses data acquisition and ROOT. Grants from the European Union Framework Programme 6 (FP6) are available to participants to cover part or all of the cost of the School. More information can be found at...

  15. Development of a computer-based pulsed NMR thermometer

    International Nuclear Information System (INIS)

    Hobeika, Alexandre; Haard, T.M.; Hoskinson, E.M.; Packard, R.E.

    2003-01-01

    We have designed a fully computer-controlled pulsed NMR system, using the National Instruments PCI-6115 data acquisition board. We use it for millikelvin thermometry and have developed a special control program, written in LabVIEW, for this purpose. It can perform measurements of temperature via the susceptibility or the τ1 dependence. This system requires little hardware, which makes it very versatile, easily reproducible and customizable
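
    The record does not give the calibration details, but both quoted temperature routes follow textbook relations: the Curie law for the susceptibility (χ = C/T) and the Korringa relation for the spin-lattice relaxation time in a metal (τ1·T = κ). A sketch, assuming calibration constants obtained from a fixed-point measurement:

    def temperature_from_susceptibility(chi, curie_constant):
        # Curie law: chi = C / T  =>  T = C / chi
        return curie_constant / chi

    def temperature_from_tau1(tau1, korringa_constant):
        # Korringa relation (metals): tau1 * T = kappa  =>  T = kappa / tau1
        return korringa_constant / tau1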

  16. Isolation of developing secondary xylem specific cellulose synthase ...

    Indian Academy of Sciences (India)

    The present study aimed at identifying developing secondary xylem specific cellulose synthase genes from .... the First strand cDNA synthesis kit (Fermentas, Pittsburgh,. USA). .... ing height of the rooted cutting, girth of the stem, leaf area.

  17. Development of computational algorithms for quantification of pulmonary structures

    International Nuclear Information System (INIS)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A.; Pina, Diana R.

    2012-01-01

    High-resolution computed tomography (HRCT) has become the imaging diagnostic exam most commonly used for the evaluation of the sequelae of Paracoccidioidomycosis. Subjective evaluation of the radiological abnormalities found on HRCT images does not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. Thus, this research proposes the development of algorithms in the MATLAB® computing environment that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists of selecting a region of interest (ROI) and, by the use of masks, density filters and morphological operators, obtaining a quantification of the injured area relative to the area of healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, showing a coincidence of 80% for emphysema and 58% for fibrosis. (author)
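
    A sketch of the kind of density-threshold quantification the abstract describes; the -950 HU cutoff is a common emphysema index from the literature, not necessarily the authors' value, and all names are illustrative:

    import numpy as np
    from scipy import ndimage

    def emphysema_fraction(hrct_hu, lung_mask, threshold_hu=-950):
        # Density filter: voxels inside the lung ROI below the threshold.
        low_density = (hrct_hu < threshold_hu) & lung_mask
        # Morphological opening suppresses isolated noise voxels.
        low_density = ndimage.binary_opening(low_density, structure=np.ones((3, 3)))
        # Injured area relative to the lung area within the ROI.
        return low_density.sum() / lung_mask.sum()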

  18. Development of a scanning proton microprobe - computer-control, elemental mapping and applications

    International Nuclear Information System (INIS)

    Loevestam, Goeran.

    1989-08-01

    A scanning proton microprobe set-up has been developed at the Pelletron accelerator in Lund. A magnetic beam scanning system and a computer-control system for beam scanning and data acquisition are described. The computer system consists of a VMEbus front-end computer and a μVax-II host-computer, interfaced by means of a high-speed data link. The VMEbus computer controls data acquisition, beam charge monitoring and beam scanning while the more sophisticated work of elemental mapping and spectrum evaluations is left to the μVax-II. The calibration of the set-up is described as well as several applications. Elemental micro patterns in tree rings and bark have been investigated by means of elemental mapping and quantitative analysis. Large variations of elemental concentrations have been found for several elements within a single tree ring. An external beam set-up has been developed in addition to the proton microprobe set-up. The external beam has been used for the analysis of antique papyrus documents. Using a scanning sample procedure and particle induced X-ray emission (PIXE) analysis, damaged and missing characters of the text could be made visible by means of multivariate statistical data evaluation and elemental mapping. Also aspects of elemental mapping by means of scanning μPIXE analysis are discussed. Spectrum background, target thickness variations and pile-up are shown to influence the structure of elemental maps considerably. In addition, a semi-quantification procedure has been developed. (author)

  19. Nuclear computational science a century in review

    CERN Document Server

    Azmy, Yousry

    2010-01-01

    Nuclear engineering has undergone extensive progress over the years. In the past century, colossal developments have been made and with specific reference to the mathematical theory and computational science underlying this discipline, advances in areas such as high-order discretization methods, Krylov Methods and Iteration Acceleration have steadily grown. Nuclear Computational Science: A Century in Review addresses these topics and many more; topics which hold special ties to the first half of the century, and topics focused around the unique combination of nuclear engineering, computational

  20. Factors Influencing Cloud-Computing Technology Adoption in Developing Countries

    Science.gov (United States)

    Hailu, Alemayehu

    2012-01-01

    Adoption of new technology has complicating components, both in the selection and in the decision-making criteria and process. Although new technology such as cloud computing provides great benefits, especially to developing countries, it has challenges that may complicate the selection decision and subsequent adoption process. This study…

  1. Brain systems for probabilistic and dynamic prediction: computational specificity and integration.

    Directory of Open Access Journals (Sweden)

    Jill X O'Reilly

    2013-09-01

    Full Text Available A computational approach to functional specialization suggests that brain systems can be characterized in terms of the types of computations they perform, rather than their sensory or behavioral domains. We contrasted the neural systems associated with two computationally distinct forms of predictive model: a reinforcement-learning model of the environment obtained through experience with discrete events, and continuous dynamic forward modeling. By manipulating the precision with which each type of prediction could be used, we caused participants to shift computational strategies within a single spatial prediction task. Hence (using fMRI) we showed that activity in two brain systems (typically associated with reward learning and motor control) could be dissociated in terms of the forms of computations that were performed there, even when both systems were used to make parallel predictions of the same event. A region in parietal cortex, which was sensitive to the divergence between the predictions of the models and anatomically connected to both computational networks, is proposed to mediate integration of the two predictive modes to produce a single behavioral output.
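
    A toy illustration of the two computationally distinct predictors, entirely schematic (the paper's task was spatial prediction under fMRI; these functions only mark the contrast):

    def rl_prediction(past_outcomes, alpha=0.2):
        # Probabilistic mode: a delta-rule estimate learned from repeated
        # discrete events.
        estimate = 0.0
        for outcome in past_outcomes:
            estimate += alpha * (outcome - estimate)
        return estimate

    def forward_model_prediction(position, velocity, dt):
        # Dynamic mode: extrapolate a continuously observed state with an
        # internal model of the dynamics.
        return position + velocity * dt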

  2. User perspectives on computer applications

    International Nuclear Information System (INIS)

    Trammell, H.E.

    1979-04-01

    Experiences of a technical group that uses the services of computer centers are recounted. An orientation on the ORNL Engineering Technology Division and its missions is given to provide background on the diversified efforts undertaken by the Division and its opportunities to benefit from computer technology. Specific ways in which computers are used within the Division are described; these include facility control, data acquisition, data analysis, theory applications, code development, information processing, cost control, management of purchase requisitions, maintenance of personnel information, and control of technical publications. Problem areas found to need improvement are the overloading of computers during normal working hours, lack of code transportability, delay in obtaining routine programming, delay in key punching services, bewilderment in the use of large computer centers, complexity of job control language, and uncertain quality of software. 20 figures

  3. International Developments in Computer Science.

    Science.gov (United States)

    1982-06-01

    [Abstract garbled in extraction. Recoverable fragments give background on China's scientific research and computer science before 1978, mention a companion directory publication, and list national computing periodicals: Brazilian titles in Portuguese, including Data News (a bimonthly industry newsletter); and Spanish-language titles, including Delta (publication of a local users group) and Sistemas (publication of the System Engineers of Colombia), followed by a section on Cuba.]

  4. Development of a lion-specific interferon-gamma assay.

    Science.gov (United States)

    Maas, M; van Kooten, P J S; Schreuder, J; Morar, D; Tijhaar, E; Michel, A L; Rutten, V P M G

    2012-10-15

    The ongoing spread of bovine tuberculosis (BTB) in African free-ranging lion populations, for example in the Kruger National Park, raises the need for diagnostic assays for BTB in lions. These, in addition, would be highly relevant for zoological gardens worldwide that want to determine the BTB status of their lions, e.g. for translocations. The present study concerns the development of a lion-specific IFN-γ assay, following the production and characterization of monoclonal antibodies specific for lion interferon-gamma (IFN-γ). Recombinant lion IFN-γ (rLIFN-γ) was produced in mammalian cells and used to immunize mice to establish hybridoma cell lines producing monoclonal antibodies. These were used to develop a sensitive, lion IFN-γ-specific capture ELISA, able to detect rLIFN-γ to the level of 160 pg/ml. Recognition of native lion IFN-γ was shown in an initial assessment of supernatants of mitogen stimulated whole blood cultures of 11 known BTB-negative lions. In conclusion, the capture ELISA shows potential as a diagnostic assay for bovine tuberculosis in lions. Preliminary results also indicate the possible use of the test for other (feline) species. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. Design, Development, and Evaluation of a Mobile Learning Application for Computing Education

    Science.gov (United States)

    Oyelere, Solomon Sunday; Suhonen, Jarkko; Wajiga, Greg M.; Sutinen, Erkki

    2018-01-01

    The study focused on the application of the design science research approach in the course of developing a mobile learning application, MobileEdu, for computing education in the Nigerian higher education context. MobileEdu facilitates the learning of computer science courses on mobile devices. The application supports ubiquitous, collaborative,…

  6. Future computing needs for Fermilab

    International Nuclear Information System (INIS)

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-mini computers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for mini computers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should be appointed by and report to the director. This group should meet on a regularly scheduled basis and be charged with continually reviewing all aspects of the laboratory computing environment

  7. Green Cloud Computing: A Literature Survey

    Directory of Open Access Journals (Sweden)

    Laura-Diana Radu

    2017-11-01

    Full Text Available Cloud computing is a dynamic field of information and communication technologies (ICTs), introducing new challenges for environmental protection. Cloud computing technologies have a variety of application domains, since they offer scalability, are reliable and trustworthy, and offer high performance at relatively low cost. The cloud computing revolution is redesigning modern networking, and offering promising environmental protection prospects as well as economic and technological advantages. These technologies have the potential to improve energy efficiency and to reduce carbon footprints and (e-)waste. These features can transform cloud computing into green cloud computing. In this survey, we review the main achievements of green cloud computing. First, an overview of cloud computing is given. Then, recent studies and developments are summarized, and environmental issues are specifically addressed. Finally, future research directions and open problems regarding green cloud computing are presented. This survey is intended to serve as up-to-date guidance for research with respect to green cloud computing.

  8. Enabling Customization through Web Development: An Iterative Study of the Dell Computer Corporation Website

    Science.gov (United States)

    Liu, Chang; Mackie, Brian G.

    2008-01-01

    Throughout the last decade, companies have increased their investment in electronic commerce (EC) by developing and implementing Web-based applications on the Internet. This paper describes a class project to develop a customized computer website which is similar to Dell Computer Corporation's (Dell) website. The objective of this project is to…

  9. Development of a proton Computed Tomography Detector System

    Energy Technology Data Exchange (ETDEWEB)

    Naimuddin, Md. [Delhi U.; Coutrakon, G. [Northern Illinois U.; Blazey, G. [Northern Illinois U.; Boi, S. [Northern Illinois U.; Dyshkant, A. [Northern Illinois U.; Erdelyi, B. [Northern Illinois U.; Hedin, D. [Northern Illinois U.; Johnson, E. [Northern Illinois U.; Krider, J. [Northern Illinois U.; Rukalin, V. [Northern Illinois U.; Uzunyan, S. A. [Northern Illinois U.; Zutshi, V. [Northern Illinois U.; Fordt, R. [Fermilab; Sellberg, G. [Fermilab; Rauch, J. E. [Fermilab; Roman, M. [Fermilab; Rubinov, P. [Fermilab; Wilson, P. [Fermilab

    2016-02-04

    Computed tomography is one of the most promising new methods to image abnormal tissues inside the human body. Tomography is also used to position the patient accurately before radiation therapy. Hadron therapy for treating cancer has become one of the most advantageous and safe options. In order to fully utilize the advantages of hadron therapy, there is a necessity of performing radiography with hadrons as well. In this paper we present the development of a proton computed tomography system. Our second-generation proton tomography system consists of two upstream and two downstream trackers made up of fibers as active material and a range detector consisting of plastic scintillators. We present details of the detector system, readout electronics, and data acquisition system as well as the commissioning of the entire system. We also present preliminary results from the test beam of the range detector.

  10. Ready To Buy a Computer?

    Science.gov (United States)

    Rourke, Martha; Rourke, Patrick

    1974-01-01

    The school district business manager can make sound, cost-conscious decisions in the purchase of computer equipment by developing a list of cost-justified applications for automation, considering the software, writing performance specifications for bidding or negotiating a contract, and choosing the vendor wisely prior to the purchase; and by…

  11. Montessori Transformation at Computer Associates.

    Science.gov (United States)

    Mars, Lisa

    2002-01-01

    Describes the growth of the all-day Montessori program for children ages 6 weeks to 6 years at Computer Associates' corporate headquarters and multiple sites worldwide. Focuses on placement of AMI Montessori-trained teachers, refurbishing of the child development centers to fit Montessori specifications, and the Nido--the children's community--and…

  12. Development of a New Data Tool for Computing Launch and Landing Availability with Respect to Surface Weather

    Science.gov (United States)

    Burns, K. Lee; Altino, Karen

    2008-01-01

    The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites, to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool, and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun for a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the currently existing tool.
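
    The availability computation described here reduces to a climatological exceedance probability; a sketch under assumed inputs (parameter names and limits are hypothetical, not actual Shuttle or Constellation constraints):

    import numpy as np

    def launch_availability(observations, limits):
        # observations: parameter name -> array of historical values, one per
        # (hourly) observation; limits: parameter name -> maximum allowed value.
        n = len(next(iter(observations.values())))
        go = np.ones(n, dtype=bool)
        for name, limit in limits.items():
            go &= observations[name] <= limit   # constraint not exceeded
        return go.mean()                        # fraction of "go" observations

    # Grouping the same computation by month and hour of day recovers the
    # seasonal and diurnal variability the APRA model accounts for.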

  13. Etiological factors for developing carpal tunnel syndrome in people who work with computers

    Directory of Open Access Journals (Sweden)

    Magdalena Lewańska

    2013-02-01

    Full Text Available Background: Carpal tunnel syndrome (CTS) is the most frequent mononeuropathy of upper extremities. Since the early 1990s it has been suggested that intensive work with computers can result in CTS development; however, this relationship has not as yet been proved. The aim of the study was to evaluate occupational and non-occupational risk factors for developing CTS in the population of computer users. Material and Methods: The study group comprised 60 patients (58 women and 2 men; mean age: 53.8±6.35 years) working with computers and suspected of occupational CTS. A survey as well as both median and ulnar nerve conduction studies (NCS) were performed in all the subjects. Results: The patients worked with computers for 6.43±1.71 h per day. The mean latency between the beginning of employment and the occurrence of first CTS symptoms was 12.09±5.94 years. All patients met the clinical and electrophysiological diagnostic criteria of CTS. In the majority of patients the etiological factors for developing CTS were non-occupational: obesity, hypothyroidism, oophorectomy, past hysterectomy, hormonal replacement therapy or oral contraceptives, recent menopause, diabetes, tendovaginitis. In 7 computer users etiological factors were not identified. Conclusion: The results of our study show that CTS is usually generated by different causes not related to using computers at work. Med Pr 2013;64(1):37–45

  14. Universal Quantum Computing with Arbitrary Continuous-Variable Encoding.

    Science.gov (United States)

    Lau, Hoi-Kwan; Plenio, Martin B

    2016-09-02

    Implementing a qubit quantum computer in continuous-variable systems conventionally requires the engineering of specific interactions according to the encoding basis states. In this work, we present a unified formalism to conduct universal quantum computation with a fixed set of operations but arbitrary encoding. By storing a qubit in the parity of two or four qumodes, all computing processes can be implemented by basis state preparations, continuous-variable exponential-swap operations, and swap tests. Our formalism inherits the advantages that the quantum information is decoupled from collective noise, and logical qubits with different encodings can be brought to interact without decoding. We also propose a possible implementation of the required operations by using interactions that are available in a variety of continuous-variable systems. Our work separates the "hardware" problem of engineering quantum-computing-universal interactions, from the "software" problem of designing encodings for specific purposes. The development of quantum computer architecture could hence be simplified.
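
    The swap test named above estimates state overlap with a fixed, encoding-independent circuit: the ancilla reads 0 with probability (1 + |⟨ψ|φ⟩|²)/2, a standard result that a few lines of numpy can check:

    import numpy as np

    def swap_test_p0(psi, phi):
        # P(ancilla = 0) = (1 + |<psi|phi>|^2) / 2 for normalized pure states.
        return 0.5 * (1.0 + abs(np.vdot(psi, phi)) ** 2)

    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    minus = np.array([1.0, -1.0]) / np.sqrt(2)
    print(swap_test_p0(plus, plus))    # 1.0 for identical states
    print(swap_test_p0(plus, minus))   # 0.5 for orthogonal states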

  15. Computational assessment of effective dose and patient specific doses for kilovoltage stereotactic radiosurgery of wet age-related macular degeneration

    Science.gov (United States)

    Hanlon, Justin Mitchell

    Age-related macular degeneration (AMD) is a leading cause of vision loss and a major health problem for people over the age of 50 in industrialized nations. The current standard of care, ranibizumab, is used to help slow and in some cases stabilize the process of AMD, but requires frequent invasive injections into the eye. Interest continues for stereotactic radiosurgery (SRS), an option that provides a non-invasive treatment for the wet form of AMD, through the development of the IRay(TM) (Oraya Therapeutics, Inc., Newark, CA). The goal of this modality is to destroy choroidal neovascularization beneath the pigment epithelium via delivery of three 100 kVp photon beams entering through the sclera and overlapping on the macula delivering up to 24 Gy of therapeutic dose over a span of approximately 5 minutes. The divergent x-ray beams targeting the fovea are robotically positioned and the eye is gently immobilized by a suction-enabled contact lens. Device development requires assessment of patient effective dose, reference patient mean absorbed doses to radiosensitive tissues, and patient specific doses to the lens and optic nerve. A series of head phantoms, including both reference and patient specific, was derived from CT data and employed in conjunction with the MCNPX 2.5.0 radiation transport code to simulate treatment and evaluate absorbed doses to potential tissues-at-risk. The reference phantoms were used to evaluate effective dose and mean absorbed doses to several radiosensitive tissues. The optic nerve was modeled with changeable positions based on individual patient variability seen in a review of head CT scans gathered. Patient specific phantoms were used to determine the effect of varying anatomy and gaze. The results showed that absorbed doses to the non-targeted tissues were below the threshold levels for serious complications; specifically the development of radiogenic cataracts and radiation induced optic neuropathy (RON). The effective dose

  16. An Examination of Job-Specific Communication in the Computer Industry.

    Science.gov (United States)

    Kidwell, Michael E.

    Enhancing the awareness of (1) the organization of "typical" computer projects, (2) the communication that emerges from those structures, and (3) the problems that technical communications inherently hold is the purpose of this paper. It begins by presenting the organization of the working group of a computer project (a college that is going to…

  17. New polymorphous computing fabric

    International Nuclear Information System (INIS)

    Wolinski, Christophe; Gokhale, Maya; McCabe, Kevin P.

    2002-01-01

    This paper introduces a new polymorphous computing Fabric well suited to DSP and image processing and describes its implementation on a Configurable System on a Chip (CSOC). The architecture is highly parameterized and enables customization of the synthesized Fabric to achieve high performance for a specific class of application. For this reason it can be considered to be a generic model for hardware accelerator synthesis from a high level specification. Another important innovation is that the Fabric uses a global memory concept, which gives the host processor random access to all the variables and instructions on the Fabric. The Fabric supports different computing models including MIMD, SPMD and systolic flow and permits dynamic reconfiguration. We present a specific implementation of a bank of FIR filters on a Fabric composed of 52 cells on the Altera Excalibur ARM running at 33 MHz. The theoretical performance of this Fabric is 1.8 GMAC/s. For the FIR application we obtain 1.6 GMAC/s real performance. Some automatic tools have been developed, such as a host access utility and an assembler.
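
    A back-of-envelope check of the quoted figures, assuming one multiply-accumulate per cell per cycle (the abstract does not state the per-cell issue rate), together with a reference implementation of the benchmarked computation:

    import numpy as np

    cells, f_clk = 52, 33e6
    print(cells * f_clk / 1e9, "GMAC/s")   # ~1.72, consistent with the quoted
                                           # 1.8 theoretical and 1.6 achieved

    def fir_bank(x, coefficient_bank):
        # Bank of FIR filters (equal-length coefficient sets assumed): one
        # output stream per filter, the computation the Fabric runs in hardware.
        return np.stack([np.convolve(x, h, mode="valid") for h in coefficient_bank])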

  18. Scalable Computational Chemistry: New Developments and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri [Iowa State Univ., Ames, IA (United States)

    2002-01-01

    The computational part of the thesis is the investigation of titanium chloride (II) as a potential catalyst for the bis-silylation reaction of ethylene with hexachlorodisilane at different levels of theory. Bis-silylation is an important reaction for producing bis(silyl) compounds and new C-Si bonds, which can serve as monomers for silicon-containing polymers and silicon carbides. Ab initio calculations on the steps involved in a proposed mechanism are presented. This choice of reactants allows the authors to study this reaction at reliable levels of theory without compromising accuracy. The calculations indicate that this is a highly exothermic barrierless reaction. The TiCl2 catalyst removes a 50 kcal/mol activation energy barrier required for the reaction without the catalyst. The first step is interaction of TiCl2 with ethylene to form an intermediate that is 60 kcal/mol below the energy of the reactants. This is the driving force for the entire reaction. Dynamic correlation plays a significant role because RHF calculations indicate that the net barrier for the catalyzed reaction is 50 kcal/mol. They conclude that divalent Ti has the potential to become an important industrial catalyst for silylation reactions. In the programming part of the thesis, parallelization of different quantum chemistry methods is presented. The parallelization of code is becoming an important aspect of quantum chemistry code development. Two trends contribute to it: the overall desire to study large chemical systems and the desire to employ highly correlated methods, which are usually computationally and memory expensive. In the presented distributed-data algorithms, computation is parallelized and the largest arrays are evenly distributed among CPUs. First, the parallelization of the Hartree-Fock self-consistent field (SCF) method is considered. The SCF method is the most common starting point for more accurate calculations. The Fock build (a sub-step of SCF) from AO integrals is
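
    A minimal sketch of the distributed-data pattern described, with random placeholders standing in for the AO-integral batches of a real Fock build (requires mpi4py; run under mpirun):

    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n, n_batches = 64, 100
    local = np.zeros((n, n))
    for batch in range(rank, n_batches, size):        # round-robin work split
        rng = np.random.default_rng(batch)
        local += 1e-3 * rng.standard_normal((n, n))   # placeholder contribution

    fock = np.empty_like(local)
    comm.Allreduce(local, fock, op=MPI.SUM)           # all ranks get the sum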

  19. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework
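
    One way to make the paper's definitions concrete: a maturity assessment framework pairs criteria with required levels, and review is the application of that framework to a simulation's assessed evidence. Criterion names and the 0-3 scale below are illustrative, not taken from the NRC example:

    from dataclasses import dataclass

    @dataclass
    class Criterion:
        name: str    # e.g. "code verification", "validation evidence"
        level: int   # assessed maturity, say 0 (none) .. 3 (rigorous)

    def review(assessed, framework):
        # Review passes when every criterion meets the framework's threshold.
        levels = {c.name: c.level for c in assessed}
        return all(levels.get(name, 0) >= need for name, need in framework.items())

    assessed = [Criterion("code verification", 2), Criterion("validation evidence", 1)]
    print(review(assessed, {"code verification": 2, "validation evidence": 2}))  # False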

  20. Developing Oral and Written Communication Skills in Undergraduate Computer Science and Information Systems Curriculum

    Science.gov (United States)

    Kortsarts, Yana; Fischbach, Adam; Rufinus, Jeff; Utell, Janine M.; Yoon, Suk-Chung

    2010-01-01

    Developing and applying oral and written communication skills in the undergraduate computer science and computer information systems curriculum--one of the ABET accreditation requirements - is a very challenging and, at the same time, a rewarding task that provides various opportunities to enrich the undergraduate computer science and computer…

  1. Development of the computer code system for the analyses of PWR core

    International Nuclear Information System (INIS)

    Tsujimoto, Iwao; Naito, Yoshitaka.

    1992-11-01

    This report is one of the materials for the work titled 'Development of the computer code system for the analyses of PWR core phenomena', which is performed under contracts between Shikoku Electric Power Company and JAERI. In this report, the numerical method adopted in our computer code system are described, that is, 'The basic course and the summary of the analysing method', 'Numerical method for solving the Boltzmann equation', 'Numerical method for solving the thermo-hydraulic equations' and 'Description on the computer code system'. (author)

  2. Inference of Cancer-specific Gene Regulatory Networks Using Soft Computing Rules

    Directory of Open Access Journals (Sweden)

    Xiaosheng Wang

    2010-03-01

    Full Text Available Perturbations of gene regulatory networks are essentially responsible for oncogenesis. Therefore, inferring the gene regulatory networks is a key step to overcoming cancer. In this work, we propose a method for inferring directed gene regulatory networks based on soft computing rules, which can identify important cause-effect regulatory relations of gene expression. First, we identify important genes associated with a specific cancer (colon cancer) using a supervised learning approach. Next, we reconstruct the gene regulatory networks by inferring the regulatory relations among the identified genes, and their regulated relations by other genes within the genome. We obtain two meaningful findings. One is that upregulated genes are regulated by more genes than downregulated ones, while downregulated genes regulate more genes than upregulated ones. The other one is that tumor suppressors suppress tumor activators and activate other tumor suppressors strongly, while tumor activators activate other tumor activators and suppress tumor suppressors weakly, indicating the robustness of biological systems. These findings provide valuable insights into the pathogenesis of cancer.
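
    A deliberately simplified stand-in for the rule-based inference step (the paper's soft-computing rules are richer; the discretization and support threshold here are illustrative):

    import numpy as np

    def infer_edges(expression, names, min_support=0.8):
        # Discretize each gene to up/down relative to its own median
        # (rows = genes, columns = samples), then keep a directed edge
        # a -> b when "a up => b up" holds in at least min_support of samples.
        states = expression > np.median(expression, axis=1, keepdims=True)
        edges = []
        for a in range(len(names)):
            for b in range(len(names)):
                if a != b and states[a].any():
                    support = states[b][states[a]].mean()
                    if support >= min_support:
                        edges.append((names[a], names[b], support))
        return edges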

  3. Computational dissection of human episodic memory reveals mental process-specific genetic profiles.

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J-F

    2015-09-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory.

  4. Computational dissection of human episodic memory reveals mental process-specific genetic profiles

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G.; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J.-F.

    2015-01-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory. PMID:26261317

  5. Feasibility Study of a Generalized Framework for Developing Computer-Aided Detection Systems-a New Paradigm.

    Science.gov (United States)

    Nemoto, Mitsutaka; Hayashi, Naoto; Hanaoka, Shouhei; Nomura, Yukihiro; Miki, Soichiro; Yoshikawa, Takeharu

    2017-10-01

    We propose a generalized framework for developing computer-aided detection (CADe) systems whose characteristics depend only on those of the training dataset. The purpose of this study is to show the feasibility of the framework. Two different CADe systems were experimentally developed by a prototype of the framework, but with different training datasets. The CADe systems include four components: preprocessing, candidate area extraction, candidate detection, and candidate classification. Four pretrained algorithms with dedicated optimization/setting methods corresponding to the respective components were prepared in advance. The pretrained algorithms were sequentially trained in the order of processing of the components. In this study, two different datasets, brain MRA with cerebral aneurysms and chest CT with lung nodules, were collected to develop two different types of CADe systems in the framework. The performances of the developed CADe systems were evaluated by threefold cross-validation. The CADe systems for detecting cerebral aneurysms in brain MRAs and for detecting lung nodules in chest CTs were successfully developed using the respective datasets. The framework was shown to be feasible by the successful development of the two different types of CADe systems. The feasibility of this framework shows promise for a new paradigm in the development of CADe systems: development of CADe systems without any lesion-specific algorithm design.
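
    The four-component chain lends itself to a generic pipeline whose behaviour depends only on what each pretrained stage learned from the training dataset; a schematic sketch (the stage signatures are assumptions, not the authors' interfaces):

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class CADePipeline:
        preprocess: Callable               # e.g. intensity normalization
        extract_candidate_area: Callable   # e.g. organ segmentation
        detect_candidates: Callable        # e.g. blob detection in the area
        classify_candidates: Callable      # true lesion vs. false positive

        def run(self, volume):
            image = self.preprocess(volume)
            area = self.extract_candidate_area(image)
            candidates = self.detect_candidates(image, area)
            return [c for c in candidates if self.classify_candidates(image, c)]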

  6. Computer Program Development Specification for Tactical Interface System.

    Science.gov (United States)

    1981-07-31

    [Abstract garbled in extraction. Recoverable fragments describe a terminal configuration (one CRT and twelve VT100 video terminals, six LA120 hardcopy terminals, a card reader, a vector graphics terminal, and a disk); loading of data into the TIS parameter tables in the TISGBL common area; an ICEHANDL routine that sends test interface data to PSS in one of two modes, one of them periodic; a STOP command that causes the TIS software to exit; and a list of authorized TCL commands. Document reference: SDSS-MMP-BI, 31 July 1981.]

  7. Development of Computational Models for Pyrochemical Electrorefiners of Nuclear Waste Transmutation Systems

    International Nuclear Information System (INIS)

    Kim, K. R.; Lee, H. S.; Hwang, I. S.

    2010-12-01

    The objective of this project is to develop multi-dimensional computational models in order to improve the operation of uranium electrorefiners currently used in pyroprocessing technology. These 2-D (US) and 3-D (ROK) mathematical models are based on the fundamental physical and chemical properties of the electrorefiner processes. The models, validated against compiled and evaluated experimental data, could provide better information for developing advanced electrorefiners for uranium recovery. The research results in this period are as follows: - Successfully assessed a common computational platform for the modeling work and identified spatial characterization requirements. - Successfully developed a 3-D electro-fluid dynamic electrorefiner model. - Successfully validated and benchmarked the two multi-dimensional models with compiled experimental data sets

  8. Computer Game Lugram - Version for Blind Children

    Directory of Open Access Journals (Sweden)

    V. Delić

    2011-06-01

    Full Text Available Computer games have undoubtedly become an integral part of educational activities of children. However, since computer games typically abound with audio and visual effects, most of them are completely useless for children with disabilities. Specifically, computer games dealing with the basics of geometry can contribute to mathematics education, but they require significant modifications in order to be suitable for visually impaired children. The paper presents the results of research and adaptation of the educational computer game Lugram to the needs of completely blind children, as well as the testing of the prototype, whose results are encouraging for further research and development in the same direction.

  9. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  10. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE.... In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE...

  11. Nuclear reactor pressure vessel-specific flaw distribution development

    International Nuclear Information System (INIS)

    Rosinski, S.T.

    1992-01-01

    Vessel integrity predictions performed through fracture mechanics analysis of a pressurized thermal shock event have been shown to be significantly sensitive to the overall flaw distribution input. It has also been shown that modern vessel in-service inspection (ISI) results can be used for development of vessel flaw distribution(s) that are more representative of US vessels. This paper describes the development and application of a methodology to analyze ISI data for the purpose of flaw distribution determination. The resultant methodology considers detection reliability, flaw sizing accuracy, and flaw detection threshold in its application. Application of the methodology was then demonstrated using four recently acquired US PWR vessel inspection data sets. Throughout the program, new insight was obtained into several key inspection performance and vessel integrity prediction practice issues that will impact future vessel integrity evaluation. For example, the potential application of a vessel-specific flaw distribution now provides at least one method by which a vessel-specific reference flaw size applicable to pressure-temperature limit curves determination can be estimated. This paper will discuss the development and application of the methodology and the impact to future vessel integrity analyses

  12. Development of computational small animal models and their applications in preclinical imaging and therapy research.

    Science.gov (United States)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  13. Development of computational small animal models and their applications in preclinical imaging and therapy research

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Tianwu [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva 4 CH-1211 (Switzerland); Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva 4 CH-1211 (Switzerland); Geneva Neuroscience Center, Geneva University, Geneva CH-1205 (Switzerland); Department of Nuclear Medicine and Molecular Imaging, University of Groningen, University Medical Center Groningen, Groningen 9700 RB (Netherlands)

    2016-01-15

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  14. Development of computational small animal models and their applications in preclinical imaging and therapy research

    International Nuclear Information System (INIS)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models, with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the main categories of computational models (stylized, voxel-based, and boundary representation) and discuss the current technical challenges as well as future research needs.

  15. Development of the Shimadzu computed tomographic scanner SCT-200N

    International Nuclear Information System (INIS)

    Ishihara, Hiroshi; Yamaoka, Nobuyuki; Saito, Masahiro

    1982-01-01

    The Shimadzu Computed Tomographic Scanner SCT-200N has been developed as an ideal CT scanner for diagnosing the head and spine. Owing to its large aperture, moderate scan time, and Zoom Scan Mode, any part of the body can be scanned. High-quality images can be obtained thanks to the precisely stabilized X-ray unit and the densely packed array of 64 detectors. In operation, the capability of computed radiography (CR) prior to patient positioning and real-time reconstruction ensure efficient patient throughput. Details of the SCT-200N are described in this paper. (author)

  16. Dissemination of Information in Developing Countries: The Personal Computer and beyond

    Science.gov (United States)

    Wong, Wai-Man

    2005-01-01

    With the blooming of information in digital format, the dissemination of information is becoming a major challenge for developing countries. This is due not only to the limited provision of personal computers: the technological infrastructure and the ability to access information are also becoming major concerns in developing countries. This…

  17. Development of multimedia computer-based training for VXI integrated fuel monitors

    International Nuclear Information System (INIS)

    Keeffe, R.; Ellacott, T.; Truong, Q.S.

    1999-01-01

    The Canadian Safeguards Support Program has developed the VXI Integrated Fuel Monitor (VFIM), which is based on the international VXI instrument bus standard. This equipment is a generic radiation monitor that can be used in an integrated mode, in which several detection systems are connected to a common system where information is collected, displayed, and analyzed via a virtual control panel with the aid of computers, a trackball, and a computer monitor. The equipment can also be used in an autonomous mode as a portable radiation monitor with very low power consumption. The equipment has been described at previous international symposia. Integration of several monitoring systems (bundle counter, core discharge monitor, and yes/no monitor) has been carried out at Wolsong 2. Performance results from one of the monitoring systems installed at CANDU nuclear stations are discussed in a companion paper at this symposium. This paper describes the development of an effective multimedia computer-based training package for the primary users of the equipment, namely IAEA inspectors and technicians. (author)

  18. A computer-controlled conformal radiotherapy system. IV: Electronic chart

    International Nuclear Information System (INIS)

    Fraass, Benedick A.; McShan, Daniel L.; Matrone, Gwynne M.; Weaver, Tamar A.; Lewis, James D.; Kessler, Marc L.

    1995-01-01

    Purpose: The design and implementation of a system for electronically tracking relevant plan, prescription, and treatment data for computer-controlled conformal radiation therapy is described. Methods and Materials: The electronic charting system is implemented on a computer cluster coupled by high-speed networks to computer-controlled therapy machines. A methodical approach to the specification and design of an integrated solution has been used in developing the system. The electronic chart system is designed to allow identification of and access to patient-specific data, including treatment-planning data, treatment prescription information, and charting of doses. An in-house-developed database system is used to provide an integrated approach to the database requirements of the design. A hierarchy of databases is used for both centralization and distribution of the treatment data for specific treatment machines. Results: The basic electronic database system has been implemented and has been in use since July 1993. The system has been used to download and manage treatment data on all patients treated on our first fully computer-controlled treatment machine. To date, electronic dose charting functions have not been fully implemented clinically, requiring the continued use of paper charting for dose tracking. Conclusions: The routine clinical application of complex computer-controlled conformal treatment procedures requires the management of large quantities of information for describing and tracking treatments. An integrated and comprehensive approach to this problem has led to a full electronic chart for conformal radiation therapy treatments.
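
    The hierarchy described above lends itself to a simple illustration. The sketch below is purely illustrative and is not the schema of the system described in this record; it only shows the kind of patient-specific hierarchy (chart, prescriptions, per-fraction dose records) such an electronic chart must track. All names and values are hypothetical.

```python
# Illustrative sketch only -- not the schema of the system described above.
# It models the kind of patient-specific hierarchy an electronic chart must
# track: prescriptions under a chart, per-fraction dose records under each
# prescription. All names and values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DoseRecord:
    fraction: int
    delivered_cgy: float

@dataclass
class Prescription:
    site: str
    dose_per_fraction_cgy: float
    fractions: int
    delivered: list = field(default_factory=list)

    def total_delivered(self) -> float:
        return sum(r.delivered_cgy for r in self.delivered)

@dataclass
class ElectronicChart:
    patient_id: str
    prescriptions: list = field(default_factory=list)

chart = ElectronicChart("PT-0001", [Prescription("prostate", 180.0, 39)])
chart.prescriptions[0].delivered.append(DoseRecord(1, 180.0))
print(chart.prescriptions[0].total_delivered())  # -> 180.0 (cGy after fraction 1)
```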

  19. Development of technology performance specifications for volatile organic compounds

    International Nuclear Information System (INIS)

    Purdy, C.; Schutte, W.E.

    1993-01-01

    The Office of Technology Development (OTD) within the Office of Environmental Restoration and Waste Management of the Department of Energy has a mission to deliver needed and usable technologies to its customers. The primary customers are individuals and organizations performing environmental characterization and remediation, waste cleanup, and pollution prevention at DOE sites. DOE faces a monumental task in cleaning up the dozen or so major sites and hundreds of smaller sites that were or are used to produce the US nuclear weapons arsenal and to develop nuclear technologies for national defense and for peaceful purposes. Contaminants and waste materials include the radionuclides associated with nuclear weapons, such as plutonium and tritium, and more common pollutants and wastes of industrial activity such as chromium, chlorinated solvents, and polychlorinated biphenyls (PCBs). Quite frequently, hazardous wastes regulated by the Environmental Protection Agency are commingled with radioactive wastes regulated by the Nuclear Regulatory Commission to yield a "mixed waste," which increases the cleanup challenges from several perspectives. To help OTD and its investigators meet DOE's cleanup goal, technology performance specifications are being implemented for research and development and DT&E projects. Technology performance specifications, or "performance goals," describe, quantitatively where possible, the technology development needs being addressed. These specifications are used to establish milestones, evaluate the status of ongoing projects, and determine the success of completed projects.

  20. Development of a Computer Application to Simulate Porous Structures

    Directory of Open Access Journals (Sweden)

    S.C. Reis

    2002-09-01

    Full Text Available Geometric modeling is an important tool to evaluate structural parameters as well as to follow the application of stereological relationships. The acquisition, visualization, and analysis of volumetric images of the structure of materials using computational geometric modeling facilitates the determination of structural parameters of difficult experimental access, such as topological and morphological parameters. In this work, we developed a geometrical model, implemented in computer software, that simulates random pore structures. The number of nodes, the number of branches (connections between nodes), and the number of isolated parts are obtained. The connectivity (C) is also obtained from this application. Using a list of elements, nodes, and branches generated by the software in AutoCAD® command-line format, the obtained structure can be viewed and analyzed.
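
    The record reports node, branch, and isolated-part counts together with a connectivity value C. A minimal sketch of how such a connectivity can be computed follows, assuming the standard graph-theoretic relation C = b - n + p (b branches, n nodes, p isolated parts); the abstract does not quote the exact formula, so this relation is an assumption on the editor's part, not a quotation from the paper.

```python
# Minimal sketch: connectivity of a simulated pore network, assuming the
# standard graph-theoretic relation C = b - n + p, where b is the number of
# branches, n the number of nodes, and p the number of isolated parts
# (connected components). The example network is invented.
from collections import defaultdict

def connectivity(nodes, branches):
    """nodes: iterable of node ids; branches: iterable of (node_a, node_b) pairs."""
    nodes, branches = set(nodes), list(branches)
    adjacency = defaultdict(set)
    for a, b in branches:
        adjacency[a].add(b)
        adjacency[b].add(a)

    # Count isolated parts (connected components) with a depth-first search.
    seen, parts = set(), 0
    for start in nodes:
        if start in seen:
            continue
        parts += 1
        stack = [start]
        while stack:
            v = stack.pop()
            if v not in seen:
                seen.add(v)
                stack.extend(adjacency[v] - seen)

    return len(branches) - len(nodes) + parts

# A 4-node loop plus one isolated node: C = 4 - 5 + 2 = 1 (one independent cycle).
print(connectivity([1, 2, 3, 4, 5], [(1, 2), (2, 3), (3, 4), (4, 1)]))
```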

  1. Continuum mechanical and computational aspects of material behavior

    Energy Technology Data Exchange (ETDEWEB)

    Fried, Eliot; Gurtin, Morton E.

    2000-02-10

    The focus of the work is the application of continuum mechanics to materials science, specifically to the macroscopic characterization of material behavior at small length scales. The long-term goals are a continuum-mechanical framework for the study of materials that provides a basis for general theories and leads to boundary-value problems of physical relevance, and computational methods appropriate to these problems supplemented by physically meaningful regularizations to aid in their solution. Specific studies include the following: the development of a theory of polycrystalline plasticity that incorporates free energy associated with lattice mismatch between grains; the development of a theory of geometrically necessary dislocations within the context of finite-strain plasticity; the development of a gradient theory for single-crystal plasticity with geometrically necessary dislocations; simulations of dynamical fracture using a theory that allows for the kinking and branching of cracks; computation of segregation and compaction in flowing granular materials.

  2. Radiologic total lung capacity measurement. Development and evaluation of a computer-based system

    Energy Technology Data Exchange (ETDEWEB)

    Seeley, G.W.; Mazzeo, J.; Borgstrom, M.; Hunter, T.B.; Newell, J.D.; Bjelland, J.C.

    1986-11-01

    The development of a computer-based radiologic total lung capacity (TLC) measurement system designed to be used by non-physician personnel is detailed. Four operators tested the reliability and validity of the system by measuring inspiratory PA and lateral pediatric chest radiographs with a Graf spark pen interfaced to a DEC VAX 11/780 computer. First results suggest that the ultimate goal of developing an accurate and easy to use TLC measurement system for non-physician personnel is attainable.

  3. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  4. Programming Unconventional Computers: Dynamics, Development, Self-Reference

    Directory of Open Access Journals (Sweden)

    Susan Stepney

    2012-10-01

    Full Text Available Classical computing has well-established formalisms for specifying, refining, composing, proving, and otherwise reasoning about computations. These formalisms have matured over the past 70 years or so. Unconventional Computing includes the use of novel kinds of substrates–from black holes and quantum effects, through to chemicals, biomolecules, even slime moulds–to perform computations that do not conform to the classical model. Although many of these unconventional substrates can be coerced into performing classical computation, this is not how they “naturally” compute. Our ability to exploit unconventional computing is partly hampered by a lack of corresponding programming formalisms: we need models for building, composing, and reasoning about programs that execute in these substrates. What might, say, a slime mould programming language look like? Here I outline some of the issues and properties of these unconventional substrates that need to be addressed to find “natural” approaches to programming them. Important concepts include embodied real values, processes and dynamical systems, generative systems and their meta-dynamics, and embodied self-reference.

  5. Computer-Aided Transformation of PDE Models: Languages, Representations, and a Calculus of Operations

    Science.gov (United States)

    2016-01-05

    A domain-specific embedded language called ibvp was developed to model initial…

  6. Patient-specific surgical planning and hemodynamic computational fluid dynamics optimization through free-form haptic anatomy editing tool (SURGEM).

    Science.gov (United States)

    Pekkan, Kerem; Whited, Brian; Kanter, Kirk; Sharma, Shiva; de Zelicourt, Diane; Sundareswaran, Kartik; Frakes, David; Rossignac, Jarek; Yoganathan, Ajit P

    2008-11-01

    The first version of an anatomy editing/surgical planning tool (SURGEM) targeting anatomical complexity and patient-specific computational fluid dynamics (CFD) analysis is presented. Novel three-dimensional (3D) shape editing concepts and human-shape interaction technologies have been integrated to facilitate interactive surgical morphology alterations, grid generation, and CFD analysis. In order to implement "manual hemodynamic optimization" at the surgery planning phase for patients with congenital heart defects, these tools are applied to design and evaluate possible modifications of patient-specific anatomies. In this context, anatomies involve complex geometric topologies and tortuous 3D blood flow pathways with multiple inlets and outlets. These tools make it possible to freely deform the lumen surface and to bend and position baffles through real-time, direct manipulation of the 3D models with both hands, thus eliminating the tedious and time-consuming phase of entering the desired geometry using traditional computer-aided design (CAD) systems. The 3D models of the modified anatomies are seamlessly exported and meshed for patient-specific CFD analysis. Free-formed anatomical modifications are quantified using an in-house skeletonization-based cross-sectional geometry analysis tool. The hemodynamic performance of the systematically modified anatomies is compared with the original anatomy using CFD. CFD results showed the relative importance of the various surgically created features, such as pouch size, vena cava to pulmonary artery (PA) flare, and PA stenosis. An interactive surgical-patch size estimator is also introduced. The combined design/analysis cycle time is used for comparing and optimizing surgical plans, and improvements are tabulated. The reduced cost of the patient-specific shape design and analysis process made it possible to envision large clinical studies to assess the validity of predictive patient-specific CFD simulations. In this paper, model…

  7. AsmL Specification of a Ptolemy II Scheduler

    DEFF Research Database (Denmark)

    Lázaro Cuadrado, Daniel; Koch, Peter; Ravn, Anders Peter

    2003-01-01

    Ptolemy II is a tool that combines different computational models for simulation and design of embedded systems. AsmL is a software specification language based on the Abstract State Machine formalism. This paper reports on development of an AsmL model of the Synchronous Dataflow domain scheduler...

  8. Qualification of integrated tool environments (QUITE) for the development of computer-based safety systems in NPP

    International Nuclear Information System (INIS)

    Miedl, Horst

    2004-01-01

    In NPPs, I&C systems are meanwhile increasingly backfitted with computer-based systems (I&C platforms). The corresponding safety functions are implemented in software, and this software is developed, configured, and administered with the help of integrated tool environments (ITEs). An ITE offers a set of services that are used to construct an I&C system and typically consists of software packages for project control and documentation, specification and design, automatic code generation, and so on. Commercial ITEs are not necessarily conceived and qualified (type-tested) for nuclear-specific applications but are used - and will increasingly be used - for the implementation of nuclear safety-related I&C systems. Therefore, it is necessary to qualify commercial ITEs with respect to their influence on the quality of the target system for each I&C platform (dependent on the safety category of the target system). Examples of commercial I&C platforms with such ITEs are SPINLINE 3, TELEPERM XP, Common Q, TRICON, etc. (Author)

  9. E-pharmacovigilance: development and implementation of a computable knowledge base to identify adverse drug reactions.

    Science.gov (United States)

    Neubert, Antje; Dormann, Harald; Prokosch, Hans-Ulrich; Bürkle, Thomas; Rascher, Wolfgang; Sojer, Reinhold; Brune, Kay; Criegee-Rieck, Manfred

    2013-09-01

    Computer-assisted signal generation is an important issue for the prevention of adverse drug reactions (ADRs). However, due to poor standardization of patients' medical data and a lack of computable medical drug knowledge, the specificity of computerized decision support systems for early ADR detection is too low, and thus those systems are not yet implemented in daily clinical practice. We report on a method to formalize knowledge about ADRs based on the Summary of Product Characteristics (SmPCs) and to link it with structured patient data in order to generate safety signals automatically and with high sensitivity and specificity. A computable ADR knowledge base (ADR-KB) that inherently contains standardized concepts for ADRs (WHO-ART), drugs (ATC), and laboratory test results (LOINC) was built. The system was evaluated in study populations of paediatric and internal medicine inpatients. A total of 262 different ADR concepts related to laboratory findings were linked to 212 LOINC terms. The ADR knowledge base was retrospectively applied to a study population of 970 admissions (474 internal and 496 paediatric patients) who underwent intensive ADR surveillance. The specificity increased from 7% without the ADR-KB up to 73% in internal patients and from 19.6% up to 91% in paediatric inpatients. This study shows that contextual linkage of patients' medication data with laboratory test results is a useful and reasonable instrument for computer-assisted ADR detection and a valuable step towards a systematic drug safety process. The system enables automated detection of ADRs during clinical practice with a quality close to intensive chart review.
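
    A minimal sketch of the linkage idea follows: standardized drug codes (ATC) and laboratory codes (LOINC) are joined through a rule table to adverse-reaction concepts (WHO-ART). This is not the authors' schema; the rule structure and threshold are invented for illustration (ATC N02BE01 and LOINC 1742-6 are real codes for paracetamol and ALT, respectively, but their pairing here is only an example).

```python
# Minimal sketch of the linkage idea, not the authors' schema: join ATC-coded
# medications and LOINC-coded lab results through a rule table to WHO-ART
# adverse-reaction concepts. The rule and threshold are invented placeholders;
# ATC N02BE01 (paracetamol) and LOINC 1742-6 (ALT) are shown for illustration.
ADR_KB = {
    # (ATC drug code, LOINC lab test): (WHO-ART concept, predicate on the value)
    ("N02BE01", "1742-6"): ("HEPATIC ENZYMES INCREASED", lambda value: value > 50.0),
}

def detect_signals(medications, lab_results):
    """medications: iterable of ATC codes; lab_results: dict of LOINC -> value."""
    signals = []
    for atc in medications:
        for loinc, value in lab_results.items():
            rule = ADR_KB.get((atc, loinc))
            if rule and rule[1](value):
                signals.append((atc, loinc, rule[0]))
    return signals

# A patient on paracetamol with a markedly elevated ALT triggers a signal.
print(detect_signals(["N02BE01"], {"1742-6": 180.0}))
```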

  10. Development of computational small animal models and their applications in preclinical imaging and therapy research

    NARCIS (Netherlands)

    Xie, Tianwu; Zaidi, Habib

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal

  11. Developing Human-Computer Interface Models and Representation Techniques(Dialogue Management as an Integral Part of Software Engineering)

    OpenAIRE

    Hartson, H. Rex; Hix, Deborah; Kraly, Thomas M.

    1987-01-01

    The Dialogue Management Project at Virginia Tech is studying the poorly understood problem of human-computer dialogue development. This problem often leads to low usability in human-computer dialogues. The Dialogue Management Project approaches solutions to low usability in interfaces by addressing human-computer dialogue development as an integral and equal part of the total system development process. This project consists of two rather distinct, but dependent, parts. One is development of ...

  12. Development and validation of a computational model of the knee joint for the evaluation of surgical treatments for osteoarthritis.

    Science.gov (United States)

    Mootanah, R; Imhauser, C W; Reisse, F; Carpanen, D; Walker, R W; Koff, M F; Lenhoff, M W; Rozbruch, S R; Fragomen, A T; Dewan, Z; Kirane, Y M; Cheah, K; Dowell, J K; Hillstrom, H J

    2014-01-01

    A three-dimensional (3D) knee joint computational model was developed and validated to predict knee joint contact forces and pressures for different degrees of malalignment. A 3D computational knee model was created from high-resolution radiological images to emulate passive sagittal rotation (full extension to 65° flexion) and weight acceptance. A cadaveric knee mounted on a six-degree-of-freedom robot was subjected to matching boundary and loading conditions. A ligament-tuning process minimised kinematic differences between the robotically loaded cadaver specimen and the finite element (FE) model. The model was validated against measured intra-articular forces and pressures. Percent full-scale errors between FE-predicted and in vitro measured values in the medial and lateral compartments were 6.67% and 5.94%, respectively, for normalised peak pressure values, and 7.56% and 4.48%, respectively, for normalised force values. The knee model can accurately predict normalised intra-articular pressures and forces for different loading conditions and could be further developed for subject-specific surgical planning.
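
    The validation metric above is easy to make concrete. A minimal sketch follows, assuming the usual definition of percent full-scale error (absolute error normalised by the full-scale value); the numbers are hypothetical and are not taken from the study.

```python
# Minimal sketch, assuming the usual definition of percent full-scale error:
# the absolute prediction error normalised by the full-scale value. The
# numbers below are hypothetical, not taken from the study.
def percent_full_scale_error(predicted, measured, full_scale):
    return abs(predicted - measured) / full_scale * 100.0

# e.g., predicted 4.7 vs measured 5.0 on a 6.0 full-scale range -> 5.0%
print(percent_full_scale_error(4.7, 5.0, 6.0))
```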

  13. Formal specification is an experimental science

    Energy Technology Data Exchange (ETDEWEB)

    Bjorner, D. [Technical Univ., Lyngby (Denmark)

    1992-09-01

    Traditionally, abstract models of large, complex systems have been given in free-form mathematics, combining - often in ad-hoc, not formally supported ways - notions from the disciplines of partial differential equations, functional analysis, mathematical statistics, etc. Such models have been very useful for assimilation of information, analysis (investigation), and prediction (simulation). These models have, however, usually not been helpful in deriving computer representations of the modelled systems for the purposes of computerized monitoring and control. Computing science, concerned with how to construct objects that can exist within the computer, offers ways of complementing, and in some cases replacing or combining, traditional mathematical models. Formal, model- as well as property-oriented, specifications in the styles of denotational (respectively, algebraic) semantics represent major approaches to such modelling. In this expository, discursive paper we illustrate what we mean by model-oriented specifications of large, complex technological computing systems. The three modelling examples cover the introvert programming-methodological subject of SDEs: software development environments; the distributed computing system subject of wfs's: (transaction) workflow systems; and the extrovert subject of robots: robotics! The thesis is, just as for mathematical modelling, that we can derive much understanding, etc., from experimentally creating such formally specified models - on paper - and that we gain little in additionally building ad-hoc prototypes. Our models are expressed in a model-oriented style using the VDM specification language Meta-IV. In this paper the models only reflect the "data modelling" aspects. We observe that such data models are more easily captured in the model-oriented style than in the algebraic-semantics property-oriented style, which originally was built on the abstraction of operations. 101 refs., 4 figs.

  14. XML-Based Visual Specification of Multidisciplinary Applications

    Science.gov (United States)

    Al-Theneyan, Ahmed; Jakatdar, Amol; Mehrotra, Piyush; Zubair, Mohammad

    2001-01-01

    The advancements in the Internet and Web technologies have fueled a growing interest in developing a web-based distributed computing environment. We have designed and developed Arcade, a web-based environment for designing, executing, monitoring, and controlling distributed heterogeneous applications, which is easy to use and access, portable, and provides support through all phases of the application development and execution. A major focus of the environment is the specification of heterogeneous, multidisciplinary applications. In this paper we focus on the visual and script-based specification interface of Arcade. The web/browser-based visual interface is designed to be intuitive to use and can also be used for visual monitoring during execution. The script specification is based on XML to: (1) make it portable across different frameworks, and (2) make the development of our tools easier by using the existing freely available XML parsers and editors. There is a one-to-one correspondence between the visual and script-based interfaces allowing users to go back and forth between the two. To support this we have developed translators that translate a script-based specification to a visual-based specification, and vice-versa. These translators are integrated with our tools and are transparent to users.
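
    As a rough illustration of what an XML-based application specification of this kind might look like, the sketch below parses a small hypothetical specification with Python's standard library. The element and attribute names are invented; the record does not reproduce Arcade's actual schema.

```python
# Minimal sketch of parsing an Arcade-style XML application specification with
# the standard library; the element and attribute names here are hypothetical,
# since the record does not reproduce the actual schema.
import xml.etree.ElementTree as ET

SPEC = """
<application name="coupled-aero-structures">
  <module name="flow-solver"   host="cluster-a" command="run_flow"/>
  <module name="struct-solver" host="cluster-b" command="run_struct"/>
  <link from="flow-solver" to="struct-solver" data="surface-loads"/>
</application>
"""

root = ET.fromstring(SPEC)
print("application:", root.get("name"))
for module in root.iter("module"):
    print("  module", module.get("name"), "on", module.get("host"))
for link in root.iter("link"):
    print("  link:", link.get("from"), "->", link.get("to"))
```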

  15. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    Full Text Available A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource to developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and their spatial evolution to be followed over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data, we are able to choose and characterize the tissue movement map that better matches the experimental data. Our computational analysis produces for the first time a two-dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulation taking tissue movement into account and to investigate PD patterning hypotheses.
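
    The growth description above (anisotropic growth combined with isotropic cell mixing) can be caricatured in a few lines. The toy sketch below advects a labelled clone with an anisotropic expansion factor and adds isotropic random mixing; the growth factors, mixing amplitude, and step count are invented for illustration and do not reproduce the published tissue-movement map.

```python
# Toy sketch of the model class described above: advect labelled clone cells
# with an anisotropic growth field and add isotropic random mixing. All
# parameters are invented for illustration only.
import numpy as np

rng = np.random.default_rng(1)
clone = rng.normal(loc=[0.2, 0.5], scale=0.02, size=(200, 2))  # labelled cells

growth = np.array([1.03, 1.005])  # anisotropic: faster along one axis (PD here)
mixing = 0.004                    # isotropic cell-mixing amplitude per step

for _ in range(50):               # 50 synthetic time steps
    clone = clone * growth + rng.normal(scale=mixing, size=clone.shape)

print("clone centroid:", clone.mean(axis=0), "spread:", clone.std(axis=0))
```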

  16. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    Science.gov (United States)

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications such as m-learning systems. This study presents an innovative method to use web technology and software engineering's best practices to provide m-learning functionalities hosted in a MCC-learning system as service. Components hosted by MCC are used to empower developers to create…

  17. Refining VHDL Specifications Through Conformance Testing: Case Study of an Adaptive Computing Architecture

    National Research Council Canada - National Science Library

    Duale, Ali

    1999-01-01

    .... Such an integration will allow for the removal of costly mistakes from a specification at an early stage of the development process before they propagate into different implementations, possibly...

  18. Learning Support Assessment Study of a Computer Simulation for the Development of Microbial Identification Strategies

    Directory of Open Access Journals (Sweden)

    Tristan E. Johnson

    2009-12-01

    Full Text Available This paper describes a study that examined how microbiology students construct knowledge of bacterial identification while using a computer simulation. The purpose of this study was to understand how the simulation affects the cognitive processing of students during thinking, problem solving, and learning about bacterial identification and to determine how the simulation facilitates the learning of a domain-specific problem-solving strategy. As part of an upper-division microbiology course, five students participated in several simulation assignments. The data were collected using think-aloud protocol and video action logs as the students used the simulation. The analysis revealed two major themes that determined the performance of the students: Simulation Usage (how the students used the software features) and Problem-Solving Strategy Development (the strategy level students started with and the skill level they achieved when they completed their use of the simulation). Several conclusions emerged from the analysis of the data: (i) The simulation affects various aspects of cognitive processing by creating an environment that makes it possible to practice the application of a problem-solving strategy. The simulation was used as an environment that allowed students to practice the cognitive skills required to solve an unknown. (ii) Identibacter (the computer simulation) may be considered a cognitive tool to facilitate the learning of a bacterial identification problem-solving strategy. (iii) The simulation characteristics did support student learning of a problem-solving strategy. (iv) Students demonstrated problem-solving strategy development specific to bacterial identification. (v) Participants demonstrated improved performance from their repeated use of the simulation.

  19. Computational radiology for orthopaedic interventions

    CERN Document Server

    Li, Shuo

    2016-01-01

    This book provides a cohesive overview of the current technological advances in computational radiology, and their applications in orthopaedic interventions. Contributed by the leading researchers in the field, this volume covers not only basic computational radiology techniques such as statistical shape modeling, CT/MRI segmentation, augmented reality and micro-CT image processing, but also the applications of these techniques to various orthopaedic interventional tasks. Details about following important state-of-the-art development are featured: 3D preoperative planning and patient-specific instrumentation for surgical treatment of long-bone deformities, computer assisted diagnosis and planning of periacetabular osteotomy and femoroacetabular impingement, 2D-3D reconstruction-based planning of total hip arthroplasty, image fusion for  computer-assisted bone tumor surgery, intra-operative three-dimensional imaging in fracture treatment, augmented reality based orthopaedic interventions and education, medica...

  20. A Multidisciplinary Model for Development of Intelligent Computer-Assisted Instruction.

    Science.gov (United States)

    Park, Ok-choon; Seidel, Robert J.

    1989-01-01

    Proposes a schematic multidisciplinary model to help developers of intelligent computer-assisted instruction (ICAI) identify the types of required expertise and integrate them into a system. Highlights include domain types and expertise; knowledge acquisition; task analysis; knowledge representation; student modeling; diagnosis of learning needs;…

  1. Caltech computer scientists develop FAST protocol to speed up Internet

    CERN Multimedia

    2003-01-01

    "Caltech computer scientists have developed a new data transfer protocol for the Internet fast enough to download a full-length DVD movie in less than five seconds. The protocol is called FAST, standing for Fast Active queue management Scalable Transmission Control Protocol" (1 page).

  2. Software Development Processes Applied to Computational Icing Simulation

    Science.gov (United States)

    Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  3. CIPSS [computer-integrated process and safeguards system]: The integration of computer-integrated manufacturing and robotics with safeguards, security, and process operations

    International Nuclear Information System (INIS)

    Leonard, R.S.; Evans, J.C.

    1987-01-01

    This poster session describes the computer-integrated process and safeguards system (CIPSS). The CIPSS combines systems developed for factory automation and automated mechanical functions (robots) with varying degrees of intelligence (expert systems) to create an integrated system that would satisfy current and emerging security and safeguards requirements. Specifically, CIPSS is an extension of the automated physical security functions concepts. The CIPSS also incorporates the concepts of computer-integrated manufacturing (CIM) with integrated safeguards concepts, and draws upon the Defense Advanced Research Projects Agency's (DARPA's) strategic computing program.

  4. Handbook of graph grammars and computing by graph transformation

    CERN Document Server

    Engels, G; Kreowski, H J; Rozenberg, G

    1999-01-01

    Graph grammars originated in the late 60s, motivated by considerations about pattern recognition and compiler construction. Since then, the list of areas which have interacted with the development of graph grammars has grown quite impressively. Besides the aforementioned areas, it includes software specification and development, VLSI layout schemes, database design, modeling of concurrent systems, massively parallel computer architectures, logic programming, computer animation, developmental biology, music composition, visual languages, and many others.The area of graph grammars and graph tran

  5. Computed tomography landmark-based semi-automated mesh morphing and mapping techniques: generation of patient specific models of the human pelvis without segmentation.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa

    2015-04-13

    Current methods for the development of pelvic finite element (FE) models generally are based upon specimen-specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time-consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity-based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R² = 0.873). This study has shown that mesh morphing and mapping represents an efficient, validated approach for pelvic FE model generation without the need for segmentation.
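
    One ingredient of landmark-based morphing can be sketched compactly: fit a transform from source-CT landmarks to target-CT landmarks and apply it to every node of the source mesh. The sketch below uses a least-squares affine fit as a stand-in; the published pipeline is richer (and refines the result by mapping nodes to the bone boundary), and all coordinates here are synthetic.

```python
# Minimal sketch of landmark-driven morphing: fit an affine transform from
# source landmarks to target landmarks by least squares and apply it to every
# node of the source mesh. This is a generic illustration, not the published
# method, and all coordinates are synthetic.
import numpy as np

def fit_affine(src_landmarks, dst_landmarks):
    """Both arrays are (k, 3); returns A (3x3) and t (3,) with dst ~ src @ A + t."""
    src_h = np.hstack([src_landmarks, np.ones((len(src_landmarks), 1))])
    coeffs, *_ = np.linalg.lstsq(src_h, dst_landmarks, rcond=None)
    return coeffs[:3], coeffs[3]

rng = np.random.default_rng(0)
src = rng.normal(size=(8, 3))                        # 8 hypothetical landmarks
dst = src @ np.diag([1.1, 0.9, 1.0]) + [5.0, 0, 0]   # synthetic "target" anatomy

A, t = fit_affine(src, dst)
mesh_nodes = rng.normal(size=(100, 3))               # stand-in for the FE mesh
morphed_nodes = mesh_nodes @ A + t                   # nodes morphed onto the target
print(np.allclose(src @ A + t, dst, atol=1e-8))      # landmarks reproduced -> True
```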

  6. Developing an evidence-based curriculum designed to help psychiatric nurses learn to use computers and the Internet.

    Science.gov (United States)

    Koivunen, Marita; Välimäki, Maritta; Jakobsson, Tiina; Pitkänen, Anneli

    2008-01-01

    This article describes the systematic process in which an evidence-based approach was used to develop a curriculum designed to support the computer and Internet skills of nurses in psychiatric hospitals in Finland. The pressure on organizations to have skilled and motivated nurses who use modern information and communication technology in health care organizations has increased due to rapid technology development at the international and national levels. However, the development of such computer education curricula has less frequently been based on evidence-based knowledge. First, we identified psychiatric nurses' learning experiences and barriers to computer use by examining written essays. Second, nurses' computer skills were surveyed. Last, evidence from the literature was scrutinized to find effective methods that can be used to teach and learn computer use in health care. This information was integrated and used for the development process of an education curriculum designed to support nurses' computer and Internet skills.

  7. Development of Organ-Specific Donor Risk Indices

    OpenAIRE

    Akkina, Sanjeev K.; Asrani, Sumeet K.; Peng, Yi; Stock, Peter; Kim, Ray; Israni, Ajay K.

    2012-01-01

    Due to the shortage of deceased donor organs, transplant centers accept organs from marginal deceased donors, including older donors. Organ-specific donor risk indices have been developed to predict graft survival using various combinations of donor and recipient characteristics. We will review the kidney donor risk index (KDRI) and liver donor risk index (LDRI) and compare and contrast their strengths, limitations, and potential uses. The Kidney Donor Risk Index has a potential role in devel...

  8. Crossing the chasm: how to develop weather and climate models for next generation computers?

    Directory of Open Access Journals (Sweden)

    B. N. Lawrence

    2018-05-01

    Full Text Available Weather and climate models are complex pieces of software which include many individual components, each of which is evolving under pressure to exploit advances in computing to enhance some combination of a range of possible improvements (higher spatio-temporal resolution, increased fidelity in terms of resolved processes, more quantification of uncertainty, etc.). However, after many years of a relatively stable computing environment with little choice in processing architecture or programming paradigm (basically X86 processors using MPI for parallelism), the existing menu of processor choices includes significant diversity, and more is on the horizon. This computational diversity, coupled with ever increasing software complexity, leads to the very real possibility that weather and climate modelling will arrive at a chasm which will separate scientific aspiration from our ability to develop and/or rapidly adapt codes to the available hardware. In this paper we review the hardware and software trends which are leading us towards this chasm, before describing current progress in addressing some of the tools which we may be able to use to bridge the chasm. This brief introduction to current tools and plans is followed by a discussion outlining the scientific requirements for quality model codes which have satisfactory performance and portability, while simultaneously supporting productive scientific evolution. We assert that the existing method of incremental model improvements employing small steps which adjust to the changing hardware environment is likely to be inadequate for crossing the chasm between aspiration and hardware at a satisfactory pace, in part because institutions cannot have all the relevant expertise in house. Instead, we outline a methodology based on large community efforts in engineering and standardisation, which will depend on identifying a taxonomy of key activities – perhaps based on existing efforts to develop

  9. GSTARS computer models and their applications, part I: theoretical development

    Science.gov (United States)

    Yang, C.T.; Simoes, F.J.M.

    2008-01-01

    GSTARS is a series of computer models developed by the U.S. Bureau of Reclamation for alluvial river and reservoir sedimentation studies while the authors were employed by that agency. The first version of GSTARS was released in 1986 using Fortran IV for mainframe computers. GSTARS 2.0 was released in 1998 for personal computer application, with most of the code in the original GSTARS revised, improved, and expanded using Fortran IV/77. GSTARS 2.1 is an improved and revised GSTARS 2.0 with a graphical user interface. The unique features of all GSTARS models are the conjunctive use of the stream tube concept and of the minimum stream power theory. The application of minimum stream power theory allows the determination of optimum channel geometry with variable channel width and cross-sectional shape. The use of the stream tube concept enables the simulation of river hydraulics using one-dimensional numerical solutions to obtain a semi-two-dimensional presentation of the hydraulic conditions along and across an alluvial channel. According to the stream tube concept, no water or sediment particles can cross the walls of stream tubes, which is valid for many natural rivers. At and near sharp bends, however, sediment particles may cross the boundaries of stream tubes. GSTARS3, based on Fortran 90/95, addresses this phenomenon and further expands the capabilities of GSTARS 2.1 for cohesive and non-cohesive sediment transport in rivers and reservoirs. This paper presents the concepts, methods, and techniques used to develop the GSTARS series of computer models, especially GSTARS3.
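
    The stream tube concept lends itself to a short numerical illustration: divide a cross-section laterally so that each tube carries an equal share of the total discharge, with no flow crossing the tube walls. The sketch below does this by interpolating on a cumulative lateral distribution of unit discharge; the distribution is synthetic, and this illustrates the concept rather than GSTARS code.

```python
# Minimal sketch of the stream tube concept: split a cross-section into tubes
# carrying equal discharge by interpolating on the cumulative lateral
# distribution of unit discharge. The distribution below is synthetic.
import numpy as np

y = np.linspace(0.0, 40.0, 401)                      # lateral stations (m)
q = np.maximum(0.0, 1.0 - ((y - 20.0) / 20.0) ** 2)  # unit discharge (m^2/s), synthetic

# Cumulative discharge across the section (trapezoidal integration).
Q = np.concatenate([[0.0], np.cumsum(0.5 * (q[1:] + q[:-1]) * np.diff(y))])

n_tubes = 5
targets = np.linspace(0.0, Q[-1], n_tubes + 1)
boundaries = np.interp(targets, Q, y)                # tube walls: no flow crosses them

print("tube boundaries (m):", np.round(boundaries, 2))
```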

  10. Development of a Computer Code for the Estimation of Fuel Rod Failure

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, I.H.; Ahn, H.J. [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    1997-12-31

    Much research has already been performed over the last few decades to obtain information on the degree of fuel rod failure from the primary coolant activities of operating PWRs. The computer codes currently in use for domestic nuclear power plants, such as the CADE code and the ABB-CE codes developed by Westinghouse and ABB-CE, respectively, still give significant overall errors in estimating failed fuel rods. In addition, with the CADE code, it is difficult to predict the degree of fuel rod failure during the transient period of nuclear reactor operation, whereas the ABB-CE codes are relatively more difficult for end-users to use. In particular, the rapid progress made recently in computer hardware and software systems demands that such computer programs be more versatile and user-friendly. While the MS Windows system, centered on the graphical user interface and multitasking, is now in widespread use, the computer codes currently employed at the nuclear power plants, such as the CADE and ABB-CE codes, can only be run on the DOS system. Moreover, it is desirable to have a computer code for fuel rod failure estimation that can directly use the radioactivity data obtained from the on-line monitoring system of the primary coolant activity. The main purpose of this study is, therefore, to develop a Windows computer code that can predict the location, the number of failed fuel rods, and the degree of failures using the radioactivity data obtained from the primary coolant activity for PWRs. Another objective is to combine this computer code with the on-line monitoring system of the primary coolant radioactivity at the Kori 3 and 4 operating nuclear power plants and enable their combined use for on-line evaluation of the number and degree of fuel rod failures. (author). 49 refs., 85 figs., 30 tabs.

  11. Development of a computer control system for the RCNP ring cyclotron

    International Nuclear Information System (INIS)

    Ogata, H.; Yamazaki, T.; Ando, A.; Hosono, K.; Itahashi, T.; Katayama, I.; Kibayashi, M.; Kinjo, S.; Kondo, M.; Miura, I.; Nagayama, K.; Noro, T.; Saito, T.; Shimizu, A.; Uraki, M.; Maruyama, M.; Aoki, K.; Yamada, S.; Kodaira, K.

    1990-01-01

    A hierarchically distributed computer control system for the RCNP ring cyclotron is being developed. The control system consists of a central computer and four subcomputers which are linked together by an Ethernet, universal device controllers which control component devices, man-machine interfaces including an operator console and interlock systems. The universal device controller is a standard single-board computer with an 8344 microcontroller and parallel interfaces, and is usually integrated into a component device and connected to a subcomputer by means of an optical-fiber cable to achieve high-speed data transfer. Control sequences for subsystems are easily produced and improved by using an interpreter language named OPELA (OPEration Language for Accelerators). The control system will be installed in March 1990. (orig.)

  12. 'Micro-8' micro-computer system

    International Nuclear Information System (INIS)

    Yagi, Hideyuki; Nakahara, Yoshinori; Yamada, Takayuki; Takeuchi, Norio; Koyama, Kinji

    1978-08-01

    The micro-computer Micro-8 system has been developed to organize a data exchange network between various instruments and a computer group including a large computer system. Used for packet exchangers and terminal controllers, the system consists of ten kinds of standard boards including a CPU board with INTEL-8080 one-chip-processor. CPU architecture, BUS architecture, interrupt control, and standard-boards function are explained in circuit block diagrams. Operations of the basic I/O device, digital I/O board and communication adapter are described with definitions of the interrupt ramp status, I/O command, I/O mask, data register, etc. In the appendixes are circuit drawings, INTEL-8080 micro-processor specifications, BUS connections, I/O address mappings, jumper connections of address selection, and interface connections. (author)

  13. Coordination by using Product Specifications in Product Development

    DEFF Research Database (Denmark)

    Terkelsen, Søren Bendix

    1997-01-01

    This paper is based on a case study. It treats coordination by means of generating product specifications in product development. The paper addresses three very important aspects that create a need for coordination and call attention to coordination mechanisms: task uncertainty, task complexity, and dependencies between activities. If one wants to select coordination mechanisms that improve performance in product development, it is very important to have knowledge of these three aspects. In the following, the aspects are identified in the literature...

  14. Novel Schemes for Measurement-Based Quantum Computation

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2007-01-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics--based on finitely correlated or projected entangled pair states--to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems

  15. Novel schemes for measurement-based quantum computation.

    Science.gov (United States)

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics-based on finitely correlated or projected entangled pair states-to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  16. Heat Transfer Computations of Internal Duct Flows With Combined Hydraulic and Thermal Developing Length

    Science.gov (United States)

    Wang, C. R.; Towne, C. E.; Hippensteele, S. A.; Poinsatte, P. E.

    1997-01-01

    This study investigated Navier-Stokes computations of the surface heat transfer coefficients of a transition duct flow. A transition duct, from an axisymmetric cross section to a non-axisymmetric cross section, is usually used to connect the turbine exit to the nozzle. As the gas turbine inlet temperature increases, the transition duct is subjected to the high temperature at the gas turbine exit. The transition duct flow undergoes combined development of the hydraulic and thermal entry lengths. The design of the transition duct required accurate surface heat transfer coefficients, and the Navier-Stokes computational method could be used to predict them. The Proteus three-dimensional Navier-Stokes numerical computational code was used in this study. The code was first studied for the computation of turbulent developing flow properties within a circular duct and a square duct. The code was then used to compute the turbulent flow properties of a transition duct flow. The computational results for the surface pressure, the skin friction factor, and the surface heat transfer coefficient are described and compared with values obtained from theoretical analyses or experiments. The comparison showed that the Navier-Stokes computation could approximately predict the surface heat transfer coefficients of a transition duct flow.
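
    Computed surface heat transfer coefficients of this kind are commonly sanity-checked against fully developed correlations. The sketch below evaluates the Dittus-Boelter correlation Nu = 0.023 Re^0.8 Pr^0.4, with h = Nu k / D, as one such baseline; this is a generic reference calculation with illustrative property values, not the Navier-Stokes method of the study.

```python
# Minimal sketch of a fully developed baseline such computations are often
# checked against: the Dittus-Boelter correlation Nu = 0.023 Re^0.8 Pr^0.4
# (heating), with h = Nu * k / D. Property values below are illustrative.
def dittus_boelter_h(re, pr, k, d_hydraulic):
    nu = 0.023 * re ** 0.8 * pr ** 0.4
    return nu * k / d_hydraulic

# Illustrative numbers: Re = 5e4, Pr = 0.7, k = 0.03 W/(m K), D_h = 0.1 m.
print(dittus_boelter_h(5.0e4, 0.7, 0.03, 0.1), "W/(m^2 K)")
```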

  17. Developing a computationally efficient dynamic multilevel hybrid optimization scheme using multifidelity model interactions.

    Energy Technology Data Exchange (ETDEWEB)

    Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Castro, Joseph Pete Jr. (; .); Giunta, Anthony Andrew

    2006-01-01

    Many engineering application problems use optimization algorithms in conjunction with numerical simulators to search for solutions. The formulation of the relevant objective functions and constraints dictates the possible optimization algorithms. Often, a gradient-based approach is not possible, since objective functions and constraints can be nonlinear, nonconvex, non-differentiable, or even discontinuous, and the simulations involved can be computationally expensive. Moreover, computational efficiency and accuracy are desirable and also influence the choice of solution method. With the advent and increasing availability of massively parallel computers, computational speed has increased tremendously. Unfortunately, the numerical and model complexities of many problems still demand significant computational resources. Moreover, in optimization, these expenses can be a limiting factor, since obtaining solutions often requires the completion of numerous computationally intensive simulations. Therefore, we propose a multifidelity optimization (MFO) algorithm designed to improve the computational efficiency of an optimization method for a wide range of applications. In developing the MFO algorithm, we take advantage of the interactions between multifidelity models to develop a dynamic and computation-time-saving optimization algorithm. First, a direct search method is applied to the high-fidelity model over a reduced design space. In conjunction with this search, a specialized oracle is employed to map the design space of this high-fidelity model to that of a computationally cheaper low-fidelity model using space-mapping techniques. Then, in the low-fidelity space, an optimum is obtained using gradient- or non-gradient-based optimization, and it is mapped back to the high-fidelity space. In this paper, we describe the theory and implementation details of our MFO algorithm. We also demonstrate our MFO method on some example problems and on two applications: earth penetrators and
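
    The loop described above (coarse high-fidelity search, space-mapping oracle, low-fidelity optimisation, map back) can be sketched schematically. The toy below uses invented quadratic stand-ins for the two fidelity levels and a simple input-shift space mapping, with scipy.optimize.minimize as a convenient substitute for the direct search; it illustrates the structure of the algorithm, not the authors' implementation.

```python
# Schematic sketch of the multifidelity idea, with toy stand-in models: sample
# the expensive model coarsely, fit a space-mapping correction to a cheap
# model, optimise the corrected cheap model, and map the result back.
import numpy as np
from scipy.optimize import minimize

def high_fidelity(x):          # expensive model (toy placeholder)
    return (x[0] - 1.2) ** 2 + (x[1] + 0.7) ** 2 + 0.05 * np.sin(9 * x[0])

def low_fidelity(x):           # cheap, systematically shifted model
    return x[0] ** 2 + x[1] ** 2

# 1) A few expensive evaluations on a coarse grid of the reduced design space.
grid = [np.array([a, b]) for a in (-1, 0, 1, 2) for b in (-2, -1, 0, 1)]
samples = [(x, high_fidelity(x)) for x in grid]

# 2) "Space mapping" oracle: here, an input shift p that best aligns the
#    cheap model with the expensive samples.
def misalignment(p):
    return sum((low_fidelity(x - p) - f) ** 2 for x, f in samples)

p = minimize(misalignment, np.zeros(2)).x

# 3) Optimise the aligned cheap model, then evaluate back in high fidelity.
x_low = minimize(lambda x: low_fidelity(x - p), np.zeros(2)).x
print("mapped optimum:", x_low, "high-fidelity value:", high_fidelity(x_low))
```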

  18. Cloud Computing Law

    CERN Document Server

    Millard, Christopher

    2013-01-01

    This book is about the legal implications of cloud computing. In essence, ‘the cloud’ is a way of delivering computing resources as a utility service via the internet. It is evolving very rapidly with substantial investments being made in infrastructure, platforms and applications, all delivered ‘as a service’. The demand for cloud resources is enormous, driven by such developments as the deployment on a vast scale of mobile apps and the rapid emergence of ‘Big Data’. Part I of this book explains what cloud computing is and how it works. Part II analyses contractual relationships between cloud service providers and their customers, as well as the complex roles of intermediaries. Drawing on primary research conducted by the Cloud Legal Project at Queen Mary University of London, cloud contracts are analysed in detail, including the appropriateness and enforceability of ‘take it or leave it’ terms of service, as well as the scope for negotiating cloud deals. Specific arrangements for public sect...

  19. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  20. Automating Commercial Video Game Development using Computational Intelligence

    OpenAIRE

    Tse G. Tan; Jason Teo; Patricia Anthony

    2011-01-01

    Problem statement: The retail sales of computer and video games have grown enormously during the last few years, not just in the United States (US) but all over the world. This is why many game developers and academic researchers have focused on game-related technologies, such as graphics, audio, physics and Artificial Intelligence (AI), with the goal of creating newer and more fun games. In recent years, there has been an increasing interest in game AI for pro...

  1. Quantum robots and quantum computers

    Energy Technology Data Exchange (ETDEWEB)

    Benioff, P.

    1998-07-01

    Validation of a presumably universal theory, such as quantum mechanics, requires a quantum mechanical description of systems that carry out theoretical calculations and systems that carry out experiments. The description of quantum computers is under active development. No description of systems to carry out experiments has been given. A small step in this direction is taken here by giving a description of quantum robots as mobile systems with on-board quantum computers that interact with different environments. Some properties of these systems are discussed. A specific model based on the literature descriptions of quantum Turing machines is presented.

  2. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1992-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, poor load balancing will degrade efficiency on either vector or data-parallel architectures when the data are organized according to spatial location. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. This document discusses why developers may prefer algorithms, such as a neural net representation, that do not exhibit load-balancing problems.
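
    A small illustration of the imbalance (not TRAC code; regimes, costs and counts are invented): with a contiguous spatial decomposition, regime-dependent work per node leaves some processors far above the ideal load, while a cost-aware greedy assignment nearly equalizes it.

```python
import numpy as np

rng = np.random.default_rng(0)
regime = rng.integers(0, 3, size=4096)        # flow regime of each spatial node
cost = np.array([1.0, 4.0, 16.0])[regime]     # regime-dependent work per node

nproc = 8
spatial_max = max(c.sum() for c in np.array_split(cost, nproc))  # spatial blocks

# Longest-processing-time greedy: assign the costliest nodes first,
# always to the currently least-loaded processor.
loads = np.zeros(nproc)
for c in cost[np.argsort(cost)[::-1]]:
    loads[np.argmin(loads)] += c

print(f"ideal {cost.sum()/nproc:.0f}, "
      f"spatial max {spatial_max:.0f}, balanced max {loads.max():.0f}")
```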

  3. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  4. Computer Forensic Function Testing: Media Preparation, Write Protection And Verification

    Directory of Open Access Journals (Sweden)

    Yinghua (David Guo

    2010-06-01

    Full Text Available The growth in the computer forensic field has created a demand for new software (or increased functionality in existing software) and a means to verify that this software is truly forensic, i.e. capable of meeting the requirements of the trier of fact. In this work, we review our previous work: a function-oriented testing framework for the validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e. what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can be conducted to develop corresponding reference sets to test any tools that possess these functions.

  5. Individual Stochastic Screening for the Development of Computer Graphics

    Directory of Open Access Journals (Sweden)

    Maja Turčić

    2012-12-01

    Full Text Available With the emergence of new tools and media, art and design have developed into digital computer-generated works. This article reconstructs the sequence of creating certain art graphics whose original authors never published their procedures. The goal is to discover the mathematics of an image and its programming libretto, with the purpose of organizing a structural base of computer graphics. We elaborate the procedures used to produce graphics known throughout the history of art, but that are nowadays also found in design and security graphics. The results are closely related graphics obtained by changing the parameters that initiate them. The aim is to control the graphics, i.e. to use controlled stochasticity to achieve desired solutions. Since the artists from the past never published the procedures of their screening methods, their ideas have remained “only” works of art. In this article we present the development of an algorithm that, more or less successfully, simulates those screening solutions. It has been proven that mathematically defined graphical elements serve as screening elements. New technological and mathematical solutions are introduced into reproduction with individual screening elements to be used in printing.
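
    The "controlled stochasticity" idea can be sketched as a seeded blend between a regular clustered-dot threshold screen and pure noise; the cell size, jitter weight and seed below are illustrative parameters, not those of the article. The same seed always reproduces the same screen, which is the sense in which the randomness is controlled.

```python
import numpy as np

def stochastic_screen(gray, cell=8, jitter=0.5, seed=42):
    """gray: 2-D array in [0, 1]; returns a binary (0/1) screened image."""
    rng = np.random.default_rng(seed)
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # regular clustered-dot threshold matrix, normalized to [0, 1]
    base = ((xx % cell) - cell / 2)**2 + ((yy % cell) - cell / 2)**2
    base = base / base.max()
    noise = rng.random((h, w))
    # jitter = 0 gives a purely regular screen, jitter = 1 pure random dither
    threshold = (1 - jitter) * base + jitter * noise
    return (gray > threshold).astype(np.uint8)

ramp = np.tile(np.linspace(0, 1, 256), (64, 1))   # grayscale test gradient
print(stochastic_screen(ramp).mean())             # approximate ink coverage
```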

  6. Development of a grinding-specific performance test set-up

    DEFF Research Database (Denmark)

    Olesen, C. G.; Larsen, B. H.; Andresen, E. L.

    2015-01-01

    The aim of this study was to develop a performance test set-up for America's Cup grinders. The test set-up had to mimic the on-boat grinding activity and be capable of collecting data for analysis and evaluation of grinding performance. This study included a literature-based analysis of grinding...... demands and a test protocol developed to accommodate the necessary physiological loads. This study resulted in a test protocol consisting of 10 intervals of 20 revolutions each interspersed with active resting periods of 50 s. The 20 revolutions are a combination of both forward and backward grinding...... and an exponentially rising resistance. A custom-made grinding ergometer was developed with computer-controlled resistance and capable of collecting data during the test. The data collected can be used to find measures of grinding performance such as peak power, time to complete and the decline in repeated grinding...

  7. Development of a grinding-specific performance test set-up.

    Science.gov (United States)

    Olesen, C G; Larsen, B H; Andresen, E L; de Zee, M

    2015-01-01

    The aim of this study was to develop a performance test set-up for America's Cup grinders. The test set-up had to mimic the on-boat grinding activity and be capable of collecting data for analysis and evaluation of grinding performance. This study included a literature-based analysis of grinding demands and a test protocol developed to accommodate the necessary physiological loads. This study resulted in a test protocol consisting of 10 intervals of 20 revolutions each interspersed with active resting periods of 50 s. The 20 revolutions are a combination of both forward and backward grinding and an exponentially rising resistance. A custom-made grinding ergometer was developed with computer-controlled resistance and capable of collecting data during the test. The data collected can be used to find measures of grinding performance such as peak power, time to complete and the decline in repeated grinding performance.
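
    A minimal sketch of the load profile implied by this protocol, assuming the exponential ramp runs from a starting resistance R0 to a peak RMAX over the 20 revolutions of an interval; the record does not give the ergometer's actual constants, so the numbers below are placeholders.

```python
import math

R0, RMAX, REVS, INTERVALS, REST_S = 20.0, 200.0, 20, 10, 50  # assumed values

def resistance(rev: int) -> float:
    """Exponentially rising resistance: R0 at rev 0, RMAX at rev REVS-1."""
    k = math.log(RMAX / R0) / (REVS - 1)
    return R0 * math.exp(k * rev)

# Sample the ramp at the start, middle and end of one 20-revolution interval.
for rev in (0, 10, 19):
    print(f"rev {rev:2d}: {resistance(rev):6.1f} (arbitrary resistance units)")
```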

  8. Bringing Mohamed to the Mountain: Situated Professional Development in a Ubiquitous Computing Classroom

    Science.gov (United States)

    Swan, Karen; Kratcoski, Annette; Mazzer, Pat; Schenker, Jason

    2005-01-01

    This article describes an ongoing situated professional development program in which teachers bring their intact classes for an extended stay in a ubiquitous computing environment equipped with a variety of state-of-the-art computing devices. The experience is unique in that it not only situates teacher learning about technology integration in…

  9. X-ray computed tomography imaging of a tumor with high sensitivity using gold nanoparticles conjugated to a cancer-specific antibody via polyethylene glycol chains on their surface

    Science.gov (United States)

    Nakagawa, Tomohiko; Gonda, Kohsuke; Kamei, Takashi; Cong, Liman; Hamada, Yoh; Kitamura, Narufumi; Tada, Hiroshi; Ishida, Takanori; Aimiya, Takuji; Furusawa, Naoko; Nakano, Yasushi; Ohuchi, Noriaki

    2016-01-01

    Contrast agents are often used to enhance the contrast of X-ray computed tomography (CT) imaging of tumors to improve diagnostic accuracy. However, because the iodine-based contrast agents currently used in hospitals are of low molecular weight, the agent is rapidly excreted from the kidney or moves to extravascular tissues through the capillary vessels, depending on its concentration gradient. This leads to nonspecific enhancement of contrast images for tissues. Here, we created gold (Au) nanoparticles as a new contrast agent to specifically image tumors with CT using an enhanced permeability and retention (EPR) effect. Au has a higher X-ray absorption coefficient than does iodine. Au nanoparticles were supported with polyethylene glycol (PEG) chains on their surface to increase the blood retention and were conjugated with a cancer-specific antibody via terminal PEG chains. The developed Au nanoparticles were injected into tumor-bearing mice, and the distribution of Au was examined with CT imaging, transmission electron microscopy, and elemental analysis using inductively coupled plasma optical emission spectrometry. The results show that specific localization of the developed Au nanoparticles in the tumor is affected by a slight difference in particle size and enhanced by the conjugation of a specific antibody against the tumor.

  10. 'Cloud computing' and clinical trials: report from an ECRIN workshop.

    Science.gov (United States)

    Ohmann, Christian; Canham, Steve; Danielyan, Edgar; Robertshaw, Steve; Legré, Yannick; Clivio, Luca; Demotes, Jacques

    2015-07-29

    Growing use of cloud computing in clinical trials prompted the European Clinical Research Infrastructures Network, a European non-profit organisation established to support multinational clinical research, to organise a one-day workshop on the topic to clarify potential benefits and risks. The issues that arose in that workshop are summarised and include the following: the nature of cloud computing and the cloud computing industry; the risks in using cloud computing services now; the lack of explicit guidance on this subject, both generally and with reference to clinical trials; and some possible ways of reducing risks. There was particular interest in developing and using a European 'community cloud' specifically for academic clinical trial data. It was recognised that the day-long workshop was only the start of an ongoing process. Future discussion needs to include clarification of trial-specific regulatory requirements for cloud computing and involve representatives from the relevant regulatory bodies.

  11. The Development of Computer-Aided Design for Electrical Equipment Selection and Arrangement of 10 kV Switchgear

    Directory of Open Access Journals (Sweden)

    Chernaya Anastassiya

    2015-01-01

    Full Text Available The paper gives an overview of a computer-aided design program application. The research includes two main parts: the development of a computer-aided design for appropriate switchgear selection, and its arrangement in an indoor switchgear layout. The Matlab program was used to develop the computer-aided design system. The use of this program considerably simplifies the selection and arrangement of 10 kV switchgear.

  12. Teachers' Support in Using Computers for Developing Students' Listening and Speaking Skills in Pre-Sessional English Courses

    Science.gov (United States)

    Zou, Bin

    2013-01-01

    Many computer-assisted language learning (CALL) studies have found that teacher direction can help learners develop language skills at their own pace on computers. However, many teachers still do not know how to provide support for students to use computers to reinforce the development of their language skills. Hence, more examples of CALL…

  13. Computer based workstation for development of software for high energy physics experiments

    International Nuclear Information System (INIS)

    Ivanchenko, I.M.; Sedykh, Yu.V.

    1987-01-01

    Methodological principles and results of a successful attempt to create, on the basis of an IBM PC/AT personal computer, effective means for the development of programs for high energy physics experiments are analysed. The results obtained make it possible to combine the best properties and the positive, materialized experience accumulated on existing time-sharing collective systems with the high quality of data representation, reliability and convenience of personal computer applications.

  14. Design Tools for Accelerating Development and Usage of Multi-Core Computing Platforms

    Science.gov (United States)

    2014-04-01

    ... multicore PDSP platforms. The GPU-based capabilities of TDIF are currently oriented towards NVIDIA GPUs, based on the Compute Unified Device Architecture (CUDA) programming language [NVIDIA 2007], which can be viewed as an extension of C. The multicore PDSP capabilities currently in TDIF are oriented ...

  15. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  16. 1st International Conference on Communication and Computer Engineering

    CERN Document Server

    Othman, Mohd; Othman, Mohd; Rahim, Yahaya; Pee, Naim

    2015-01-01

    This book covers diverse aspects of advanced computer and communication engineering, focusing specifically on industrial and manufacturing theory and applications of electronics, communications, computing, and information technology. Experts in research, industry, and academia present the latest developments in technology, describe applications involving cutting-edge communication and computer systems, and explore likely future directions. In addition, access is offered to numerous new algorithms that assist in solving computer and communication engineering problems. The book is based on presentations delivered at ICOCOE 2014, the 1st International Conference on Communication and Computer Engineering. It will appeal to a wide range of professionals in the field, including telecommunication engineers, computer engineers and scientists, researchers, academics, and students.

  17. 2nd International Conference on Communication and Computer Engineering

    CERN Document Server

    Othman, Mohd; Othman, Mohd; Rahim, Yahaya; Pee, Naim

    2016-01-01

    This book covers diverse aspects of advanced computer and communication engineering, focusing specifically on industrial and manufacturing theory and applications of electronics, communications, computing and information technology. Experts in research, industry, and academia present the latest developments in technology, describe applications involving cutting-edge communication and computer systems, and explore likely future trends. In addition, a wealth of new algorithms that assist in solving computer and communication engineering problems are presented. The book is based on presentations given at ICOCOE 2015, the 2nd International Conference on Communication and Computer Engineering. It will appeal to a wide range of professionals in the field, including telecommunication engineers, computer engineers and scientists, researchers, academics and students.

  18. Seismic Safety Margins Research Program (Phase I). Project VII. Systems analysis specification of computational approach

    International Nuclear Information System (INIS)

    Wall, I.B.; Kaul, M.K.; Post, R.I.; Tagart, S.W. Jr.; Vinson, T.J.

    1979-02-01

    An initial specification is presented of a computational approach for a probabilistic risk assessment model for use in the Seismic Safety Margins Research Program. This model encompasses the whole seismic calculational chain, from seismic input through soil-structure interaction and transfer functions to the probability of component failure, the integration of these failures into a system model, and thereby an estimate of the probability of a release of radioactive material to the environment. It is intended that the primary use of this model will be in sensitivity studies to assess the potential conservatism of different modeling elements in the chain and to provide guidance on priorities for research in the seismic design of nuclear power plants
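
    The final link of such a chain, from component fragility to failure probability, is in essence a convolution of a seismic hazard curve with a fragility curve. A discretized sketch with invented curves (power-law hazard, lognormal fragility), not the program's own model:

```python
import numpy as np
from math import erf, log, sqrt

a = np.linspace(0.05, 2.0, 400)        # peak ground acceleration (g)
hazard = 1e-3 * (0.1 / a)**2.5         # annual exceedance frequency H(a)

def fragility(x, a_m=0.6, beta=0.4):
    """Lognormal fragility: P(component failure | ground acceleration x)."""
    return 0.5 * (1 + erf(log(x / a_m) / (beta * sqrt(2))))

frag = np.array([fragility(x) for x in a])

# P_f = -integral F(a) dH(a), approximated as a sum over the hazard bins.
p_f = np.sum(frag[:-1] * -np.diff(hazard))
print(f"annual failure probability ~ {p_f:.2e}")
```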

  19. Developing a project-based computational physics course grounded in expert practice

    Science.gov (United States)

    Burke, Christopher J.; Atherton, Timothy J.

    2017-04-01

    We describe a project-based computational physics course developed using a backwards course design approach. From an initial competency-based model of problem solving in computational physics, we interviewed faculty who use these tools in their own research to determine indicators of expert practice. From these, a rubric was formulated that enabled us to design a course intended to allow students to learn these skills. We also report an initial implementation of the course and, by having the interviewees regrade student work, show that students acquired many of the expert practices identified.

  20. Design, development and integration of a large scale multiple source X-ray computed tomography system

    International Nuclear Information System (INIS)

    Malcolm, Andrew A.; Liu, Tong; Ng, Ivan Kee Beng; Teng, Wei Yuen; Yap, Tsi Tung; Wan, Siew Ping; Kong, Chun Jeng

    2013-01-01

    X-ray Computed Tomography (CT) allows visualisation of the physical structures in the interior of an object without physically opening or cutting it. This technology supports a wide range of applications in the non-destructive testing, failure analysis or performance evaluation of industrial products and components. Of the numerous factors that influence the performance characteristics of an X-ray CT system the energy level in the X-ray spectrum to be used is one of the most significant. The ability of the X-ray beam to penetrate a given thickness of a specific material is directly related to the maximum available energy level in the beam. Higher energy levels allow penetration of thicker components made of more dense materials. In response to local industry demand and in support of on-going research activity in the area of 3D X-ray imaging for industrial inspection the Singapore Institute of Manufacturing Technology (SIMTech) engaged in the design, development and integration of large scale multiple source X-ray computed tomography system based on X-ray sources operating at higher energies than previously available in the Institute. The system consists of a large area direct digital X-ray detector (410 x 410 mm), a multiple-axis manipulator system, a 225 kV open tube microfocus X-ray source and a 450 kV closed tube millifocus X-ray source. The 225 kV X-ray source can be operated in either transmission or reflection mode. The body of the 6-axis manipulator system is fabricated from heavy-duty steel onto which high precision linear and rotary motors have been mounted in order to achieve high accuracy, stability and repeatability. A source-detector distance of up to 2.5 m can be achieved. The system is controlled by a proprietary X-ray CT operating system developed by SIMTech. The system currently can accommodate samples up to 0.5 x 0.5 x 0.5 m in size with weight up to 50 kg. These specifications will be increased to 1.0 x 1.0 x 1.0 m and 100 kg in future

  1. Large-scale computation in solid state physics - Recent developments and prospects

    International Nuclear Information System (INIS)

    DeVreese, J.T.

    1985-01-01

    During the past few years an increasing interest in large-scale computation is developing. Several initiatives were taken to evaluate and exploit the potential of ''supercomputers'' like the CRAY-1 (or XMP) or the CYBER-205. In the U.S.A., there first appeared the Lax report in 1982 and subsequently (1984) the National Science Foundation in the U.S.A. announced a program to promote large-scale computation at the universities. Also, in Europe several CRAY- and CYBER-205 systems have been installed. Although the presently available mainframes are the result of a continuous growth in speed and memory, they might have induced a discontinuous transition in the evolution of the scientific method; between theory and experiment a third methodology, ''computational science'', has become or is becoming operational

  2. Exploring the Use of Computer Simulations in Unraveling Research and Development Governance Problems

    Science.gov (United States)

    Balaban, Mariusz A.; Hester, Patrick T.

    2012-01-01

    Understanding Research and Development (R&D) enterprise relationships and processes at a governance level is not a simple task, but valuable decision-making insight and evaluation capabilities can be gained from their exploration through computer simulations. This paper discusses current Modeling and Simulation (M&S) methods, addressing their applicability to R&D enterprise governance. Specifically, the authors analyze the advantages and disadvantages of the four methodologies used most often by M&S practitioners: System Dynamics (SD), Discrete Event Simulation (DES), Agent Based Modeling (ABM), and formal Analytic Methods (AM) for modeling systems at the governance level. Moreover, the paper describes nesting models using a multi-method approach. Guidance is provided to those seeking to employ modeling techniques in an R&D enterprise for the purposes of understanding enterprise governance. Further, an example is modeled and explored for potential insight. The paper concludes with recommendations regarding opportunities for concentration of future work in modeling and simulating R&D governance relationships and processes.

  3. Building Real World Domain-Specific Social Network Websites as a Capstone Project

    Science.gov (United States)

    Yue, Kwok-Bun; De Silva, Dilhar; Kim, Dan; Aktepe, Mirac; Nagle, Stewart; Boerger, Chris; Jain, Anubha; Verma, Sunny

    2009-01-01

    This paper describes our experience of using Content Management Software (CMS), specifically Joomla, to build a real world domain-specific social network site (SNS) as a capstone project for graduate information systems and computer science students. As Web 2.0 technologies become increasingly important in driving business application development,…

  4. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.
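
    For orientation, the classical (non-computable) notion that the paper effectivizes is the Banach frame; the statement below is the standard textbook definition, with notation assumed rather than taken from the paper itself.

```latex
Let $X$ be a Banach space and $X_d$ an associated Banach sequence space.
A sequence $(f_i) \subset X^{*}$ together with a bounded operator
$S : X_d \to X$ is a \emph{Banach frame} for $X$ if
\begin{enumerate}
  \item $(f_i(x)) \in X_d$ for every $x \in X$;
  \item there exist constants $0 < A \le B < \infty$ such that
        $A\,\|x\|_X \le \|(f_i(x))\|_{X_d} \le B\,\|x\|_X$ for all $x \in X$;
  \item $S\bigl((f_i(x))\bigr) = x$ for every $x \in X$ (reconstruction).
\end{enumerate}
```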

  5. Prediction of intestinal absorption and blood-brain barrier penetration by computational methods.

    Science.gov (United States)

    Clark, D E

    2001-09-01

    This review surveys the computational methods that have been developed with the aim of identifying drug candidates likely to fail later on the road to market. The specifications for such computational methods are outlined, including factors such as speed, interpretability, robustness and accuracy. Then, computational filters aimed at predicting "drug-likeness" in a general sense are discussed before methods for the prediction of more specific properties--intestinal absorption and blood-brain barrier penetration--are reviewed. Directions for future research are discussed and, in concluding, the impact of these methods on the drug discovery process, both now and in the future, is briefly considered.
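
    A representative example of the general "drug-likeness" filters surveyed here is Lipinski's rule of five. A minimal sketch; in practice the property values would come from a cheminformatics toolkit rather than be typed in by hand:

```python
def lipinski_pass(mol_weight: float, logp: float,
                  h_donors: int, h_acceptors: int) -> bool:
    """True if the molecule violates at most one of Lipinski's four rules:
    MW <= 500, logP <= 5, H-bond donors <= 5, H-bond acceptors <= 10."""
    violations = sum([
        mol_weight > 500,
        logp > 5,
        h_donors > 5,
        h_acceptors > 10,
    ])
    return violations <= 1

# Illustrative property values for a hypothetical candidate molecule.
print(lipinski_pass(mol_weight=349.4, logp=2.9, h_donors=2, h_acceptors=5))
```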

  6. A personal computer code for seismic evaluations of nuclear power plants facilities

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulos, A.J.; Graves, H.

    1990-01-01

    The program CARES (Computer Analysis for Rapid Evaluation of Structures) is an integrated computational system being developed by Brookhaven National Laboratory (BNL) for the U.S. Nuclear Regulatory Commission. It is specifically designed to be a personal computer (PC) operated package which may be used to determine the validity and accuracy of analysis methodologies used for structural safety evaluations of nuclear power plants. CARES is structured in a modular format. Each module performs a specific type of analysis i.e., static or dynamic, linear or nonlinear, etc. This paper describes the various features which have been implemented into the Seismic Module of CARES

  7. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    Science.gov (United States)

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for the successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  8. Software development for specific geometry and safe design of isotropic material multicell beams

    International Nuclear Information System (INIS)

    Tariq, M.M.; Ahmed, M.A.

    2011-01-01

    Comparison of analytical results with finite element results for the analysis of isotropic-material multicell beams subjected to free torsion is the main idea of this paper. Progress in the fundamentals and applications of advanced materials and their processing technologies involves costly experiments and prototype testing for reliability. Software development for the design analysis of structures with advanced materials is low-cost but challenging research. Multicell beams have important industrial applications in the aerospace and automotive sectors. This paper explains the development of software to test different materials in the design of a multicell beam. The objective of this paper is to compute the torsional loading of multicell beams of isotropic materials for safe design in both symmetrical and asymmetrical geometries. The software has been developed in Microsoft Visual Basic. The distribution of Saint-Venant shear flows, shear stresses, factors of safety, volume, mass, weight, twist, polar moment of inertia and aspect ratio for free torsion in a multicell beam can be calculated with this software, as illustrated in the sketch below. The software works on four algorithms: a specific-geometry algorithm, a material-selection algorithm, a factor-of-safety algorithm and a global algorithm. The user can specify new materials analytically, or choose a pre-defined material from a list that includes plain carbon steels, low-alloy steels, stainless steels, cast irons, aluminum alloys, copper alloys, magnesium alloys, titanium alloys, precious metals and refractory metals. Although this software is restricted to multicell beams comprising three cells, future versions can address more complicated shapes and cases of multicell beams. The software also describes the nomenclature and mathematical formulas applied, to help the user understand the theoretical background. The user can specify the geometry of a multicell beam with three rectangular cells. The software computes shear flows, shear stresses, safety factors
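
    The core computation named here, Saint-Venant shear flows in a multicell section under free torsion, reduces to a small linear system: one twist-compatibility equation per cell plus torque equilibrium. A sketch for three cells in a row with invented section properties (these are not the software's own routines or data):

```python
import numpy as np

G = 26.0e9                               # shear modulus (Pa), illustrative
T = 5.0e3                                # applied torque (N m), illustrative
A = np.array([0.010, 0.012, 0.010])      # enclosed cell areas (m^2)
delta = np.array([8.0, 9.0, 8.0])        # integral of ds/t around each cell
shared = 2.0                             # ds/t of each shared wall (0-1, 1-2)

# Unknowns x = [q0, q1, q2, G*theta']. Rows 0-2: equal twist rate per cell,
# q_i*delta_i - sum(q_j*(s/t)_shared) - 2*A_i*(G*theta') = 0.
# Row 3: torque equilibrium, T = sum(2*A_i*q_i).
M = np.array([
    [delta[0], -shared,   0.0,      -2*A[0]],
    [-shared,  delta[1], -shared,   -2*A[1]],
    [0.0,      -shared,  delta[2],  -2*A[2]],
    [2*A[0],   2*A[1],   2*A[2],     0.0   ],
])
rhs = np.array([0.0, 0.0, 0.0, T])
q0, q1, q2, g_theta = np.linalg.solve(M, rhs)
print(f"shear flows (N/m): {q0:.0f}, {q1:.0f}, {q2:.0f}")
print(f"twist rate (rad/m): {g_theta / G:.3e}")
```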

  9. Rapid mental computation system as a tool for the development of elementary school students' algorithmic thinking

    OpenAIRE

    Ziatdinov, Rushan; Musa, Sajid

    2013-01-01

    In this paper, we describe the possibilities of using a rapid mental computation system in elementary education. The system consists of a number of readily memorized operations that allow one to perform arithmetic computations very quickly. These operations are actually simple algorithms which can develop or improve the algorithmic thinking of pupils. Using a rapid mental computation system helps form the basis for the study of computer science in secondary school.
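
    Two classic examples of such readily memorized operations, written out as the simple algorithms the paper refers to (these particular tricks are standard ones, chosen here for illustration rather than quoted from the paper):

```python
def square_ending_in_5(n: int) -> int:
    """For n ending in 5: (10a + 5)^2 = 100*a*(a+1) + 25, e.g. 35^2 = 1225."""
    a = n // 10
    return 100 * a * (a + 1) + 25

def times_11(n: int) -> int:
    """Two-digit n times 11: place the digit sum between the digits;
    a sum >= 10 carries automatically, e.g. 47*11 -> 4|11|7 -> 517."""
    tens, ones = divmod(n, 10)
    return tens * 100 + (tens + ones) * 10 + ones

assert square_ending_in_5(35) == 35**2 == 1225
assert times_11(47) == 47 * 11 == 517
```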

  10. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    Full Text Available In order to better develop and improve students' music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We therefore conducted an in-depth analysis of computer-enabled music learning and of the status of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and teachers have not yet found reasonable countermeasures. Against this background, the introduction of computer music software into music learning is a new trial that can not only cultivate students' initiative in music learning, but also enhance their ability to learn music. Therefore, it is concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.

  11. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-01-01

    Finally, we take a step further by developing a novel feature selection method suitable for defining a computational framework capable of analyzing the genomic content of enhancers and reporting cell-line specific predictive signatures.

  12. Evolution of facility layout requirements and CAD [computer-aided design] system development

    International Nuclear Information System (INIS)

    Jones, M.

    1990-06-01

    The overall configuration of the Superconducting Super Collider (SSC) including the infrastructure and land boundary requirements were developed using a computer-aided design (CAD) system. The evolution of the facility layout requirements and the use of the CAD system are discussed. The emphasis has been on minimizing the amount of input required and maximizing the speed by which the output may be obtained. The computer system used to store the data is also described

  13. Development of Computer Program for Analysis of Irregular Non Homogenous Radiation Shielding

    International Nuclear Information System (INIS)

    Bang Rozali; Nina Kusumah; Hendro Tjahjono; Darlis

    2003-01-01

    A computer program for radiation shielding analysis has been developed to calculate radiation attenuation in non-homogeneous radiation shielding with irregular geometry. By specifying the radiation source strength, the geometrical shape of the radiation source, and the location, dimensions and geometrical shape of the radiation shielding, the radiation level at a point at a given position from the radiation source can be calculated. Using the computer program, radiation distribution analysis results can be obtained for several analytical points simultaneously. (author)
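
    The attenuation step underlying such a program is the point-kernel relation phi = S * exp(-sum(mu_i * t_i)) / (4 * pi * r^2). A sketch under stated assumptions: the attenuation coefficients below are illustrative, and buildup factors and the ray tracing through irregular geometry that the actual program performs are omitted.

```python
import math

def flux_at_point(source_strength, distance, layers):
    """Uncollided flux (per cm^2 per s) from an isotropic point source.
    layers: list of (mu [1/cm], thickness [cm]) slabs crossed by the ray."""
    attenuation = math.exp(-sum(mu * t for mu, t in layers))
    return source_strength * attenuation / (4 * math.pi * distance**2)

# Illustrative case: 1e9 photons/s behind 5 cm of lead and 20 cm of concrete,
# scored 100 cm from the source (coefficients roughly in the 1 MeV range).
phi = flux_at_point(1e9, 100.0, [(0.77, 5.0), (0.15, 20.0)])
print(f"flux ~ {phi:.3e} /cm^2/s")
```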

  14. Computational modeling of the mathematical dummy of the Brazilian woman for calculations of internal dosimetry and ends of comparison of the fractions absorbed specific with the woman reference

    International Nuclear Information System (INIS)

    Ximenes, Edmir

    2006-01-01

    Tools for dosimetric calculations are of the utmost importance for the basic principles of radiological protection, not only in nuclear medicine but also in other scientific calculations. In this work a mathematical model of the Brazilian woman is developed in order to be used as a basis for calculations of Specific Absorbed Fractions (SAFs) in internal organs and in the skeleton, in accord with the objectives of diagnosis or therapy in nuclear medicine. The model developed here is similar in form to that of Snyder, but modified to be more relevant to the case of the Brazilian woman. To do this, the formalism of the Monte Carlo method was used by means of the ALGAM-97R computational code. As a contribution to the objectives of this thesis, we developed the computational system cSAF (consultation for Specific Absorbed Fractions; cFAE is the Portuguese acronym), which furnishes several 'look-up' facilities for the research user. The dialogue interface with the operator was planned following current practices in the utilization of event-oriented languages. This interface permits the user to navigate by means of the reference models, choose the source organ and the energy desired, and receive an answer through an efficient and intuitive dialogue. The system furnishes, in addition to the data referring to the Brazilian woman, data referring to the model of Snyder and to the model of the Brazilian man. The system makes available not only individual data for the SAFs of the three models, but also a comparison among them. (author)

  15. Development and validation of the computer technology literacy self-assessment scale for Taiwanese elementary school students.

    Science.gov (United States)

    Chang, Chiung-Sui

    2008-01-01

    The purpose of this study was to describe the development and validation of an instrument to identify various dimensions of the computer technology literacy self-assessment scale (CTLS) for elementary school students. The instrument included five CTLS dimensions (subscales): technology operation skills, computer usage concepts, attitudes toward computer technology, learning with technology, and Internet operation skills. Participants were 1,539 elementary school students in Taiwan. Data analysis indicated that the instrument developed in the study had satisfactory validity and reliability. Correlation analysis supported the legitimacy of using multiple dimensions in representing students' computer technology literacy. Significant differences were found between male and female students, and between grades, on some CTLS dimensions. Suggestions are made for use of the instrument to examine the complicated interplay between students' computer behaviors and their computer technology literacy.
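
    The reliability figure such validation studies report per subscale is typically Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A toy sketch with simulated answers, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) array of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulate six items driven by one latent trait plus noise -> high alpha.
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))
scores = latent + 0.5 * rng.normal(size=(200, 6))
print(f"alpha ~ {cronbach_alpha(scores):.2f}")
```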

  16. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

    International Nuclear Information System (INIS)

    Maynard, Matthew R; Geyer, John W; Bolch, Wesley; Aris, John P; Shifrin, Roger Y

    2011-01-01

    Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ-level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR(TM) and then imported to the 3D modeling software package Rhinoceros(TM) for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations in

  17. Development of hydrogen combustion analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Tae Jin; Lee, K. D.; Kim, S. N. [Soongsil University, Seoul (Korea, Republic of); Hong, J. S.; Kwon, H. Y. [Seoul National Polytechnic University, Seoul (Korea, Republic of); Kim, Y. B.; Kim, J. S. [Seoul National University, Seoul (Korea, Republic of)

    1997-07-01

    The objective of this project is to construct a credible DB for component reliability by developing methodologies and computer codes for assessing component independent-failure and common cause failure probabilities, incorporating the applicability and dependency of the data. In addition, the ultimate goal is to systematize all the analysis procedures so as to provide plans for preventing component failures by employing flexible tools for the change of specific plant or data sources. For the first subject, we construct a DB of similarity indices and dependence matrices and propose a systematic procedure for data analysis by investigating the similarity and redundancy of the generic data sources. Next, we develop a computer code for this procedure and construct a reliability database for major components. The second subject is focused on developing a CCF procedure for assessing the plant-specific defense ability, rather than on developing another CCF model. We propose a procedure and computer code for estimating CCF event probability by incorporating plant-specific defensive measures. 116 refs., 25 tabs., 24 figs. (author)

  18. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    International Nuclear Information System (INIS)

    Fonseca, T C Ferreira; Vanhavere, F; Bogaerts, R; Hunt, John

    2014-01-01

    A Whole Body Counter (WBC) is a facility for routinely assessing the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done using anthropomorphic physical phantoms representing the human body. Because of the challenge of constructing representative physical phantoms, virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations, which in turn helps to study the major source of uncertainty associated with the in vivo measurement routine: the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. The open source MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. In addition, home-made software was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium. (paper)
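
    The voxel-to-MCNPX conversion step mentioned above usually amounts to run-length encoding each row of the voxel grid into MCNP's repeat syntax for a lattice FILL card. The home-made tool's file layout is not public, so the following only sketches that core idea with a tiny stand-in grid:

```python
import numpy as np

def rle_fill_line(row) -> str:
    """Run-length encode one lattice row: [1,1,1,2,2] -> '1 2r 2 1r'
    (in MCNP, 'v nr' means value v repeated n more times)."""
    out, run_val, run_len = [], row[0], 1
    for v in row[1:]:
        if v == run_val:
            run_len += 1
        else:
            out.append(f"{run_val} {run_len - 1}r" if run_len > 1 else f"{run_val}")
            run_val, run_len = v, 1
    out.append(f"{run_val} {run_len - 1}r" if run_len > 1 else f"{run_val}")
    return " ".join(out)

phantom = np.zeros((2, 2, 8), dtype=int)   # tiny stand-in voxel grid
phantom[:, :, 3:5] = 1                     # a block tagged with tissue ID 1
for z in range(phantom.shape[0]):
    for y in range(phantom.shape[1]):
        print(rle_fill_line(list(phantom[z, y])))
```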

  19. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    Science.gov (United States)

    Ferreira Fonseca, T. C.; Bogaerts, R.; Hunt, John; Vanhavere, F.

    2014-11-01

    A Whole Body Counter (WBC) is a facility for routinely assessing the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done using anthropomorphic physical phantoms representing the human body. Because of the challenge of constructing representative physical phantoms, virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations, which in turn helps to study the major source of uncertainty associated with the in vivo measurement routine: the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. The open source MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. In addition, home-made software was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium.

  20. Patient-specific reconstruction plates are the missing link in computer-assisted mandibular reconstruction: A showcase for technical description.

    Science.gov (United States)

    Cornelius, Carl-Peter; Smolka, Wenko; Giessler, Goetz A; Wilde, Frank; Probst, Florian A

    2015-06-01

    Preoperative planning of mandibular reconstruction has moved from mechanical simulation by dental model casts or stereolithographic models into an almost completely virtual environment. CAD/CAM applications allow a high level of accuracy by providing a custom template-assisted contouring approach for bone flaps. However, the clinical accuracy of CAD reconstruction is limited by the use of prebent reconstruction plates, an analogue step in an otherwise digital workstream. In this paper the integration of computerized, numerically-controlled (CNC) milled, patient-specific mandibular plates (PSMP) within the virtual workflow of computer-assisted mandibular free fibula flap reconstruction is illustrated in a clinical case. Intraoperatively, the bone segments as well as the plate arms showed a very good fit. Postoperative CT imaging demonstrated close approximation of the PSMP and fibular segments, and good alignment of native mandible and fibular segments and intersegmentally. Over a follow-up period of 12 months, there was an uneventful course of healing with good bony consolidation. The virtual design and automated fabrication of patient-specific mandibular reconstruction plates provide the missing link in the virtual workflow of computer-assisted mandibular free fibula flap reconstruction.