WorldWideScience

Sample records for specifically computer development

  1. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn’ in computing provides the base for place-specific computing as a suggested new genre of interaction design. In the REcult project, place-specific computing is explored through design-oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments for place-specific computing as a genre of interaction design are described. A total of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented, indicating potentials, possibilities and problems as directions for future...

  2. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn

    2009-01-01

    An increased interest in the notion of place has evolved in interaction design based on the proliferation of wireless infrastructures, developments in digital media, and a ‘spatial turn’ in computing. In this article, place-specific computing is suggested as a genre of interaction design that addresses the shaping of interactions among people, place-specific resources and global socio-technical networks, mediated by digital technology, and influenced by the structuring conditions of place. The theoretical grounding for place-specific computing is located in the meeting between conceptions of place in human geography and recent research in interaction design focusing on embodied interaction. Central themes in this grounding revolve around place and its relation to embodiment and practice, as well as the social, cultural and material aspects conditioning the enactment of place. Selected...

  3. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, clinical image segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  4. Development of a computationally-designed polymeric adsorbent specific for mycotoxin patulin.

    Science.gov (United States)

    Piletska, Elena V; Pink, Demi; Karim, Kal; Piletsky, Sergey A

    2017-12-04

    Patulin is a toxic compound found predominantly in apples affected by mould rot. Since apples and apple-containing products are a popular food for the elderly, children and babies, monitoring of the toxin is crucial. This paper describes the development of a computationally-designed polymeric adsorbent for the solid-phase extraction of patulin, which provides an effective clean-up of food samples and allows the detection and accurate quantification of patulin levels present in apple juice using conventional chromatography methods. The developed bespoke polymer demonstrates quantitative binding towards the patulin present in undiluted apple juice. The polymer is inexpensive and easy to mass-produce. The key contributing factor to the function of the adsorbent is a combination of acidic and basic functional monomers producing a zwitterionic complex in solution that forms stronger binding complexes with the patulin molecule. The protocols described in this paper provide a blueprint for the development of polymeric adsorbents for other toxins or different food matrices.

  5. Design and development of semantic web-based system for computer science domain-specific information retrieval

    Directory of Open Access Journals (Sweden)

    Ritika Bansal

    2016-09-01

    Full Text Available In a semantic web-based system, the concept of ontology is used to search results by the contextual meaning of the input query instead of keyword matching. From the research literature, there seems to be a need for a tool which can provide an easy interface for complex queries in natural language that can retrieve domain-specific information from the ontology. This research paper proposes an IRSCSD system (Information retrieval system for computer science domain) as a solution. This system offers advanced querying and browsing of structured data with search results automatically aggregated and rendered directly in a consistent user-interface, thus reducing the manual effort of users. The main objective of this research is therefore the design and development of a semantic web-based system for integrating ontology towards domain-specific retrieval support. The methodology followed is piecemeal research involving the following stages. The first stage involves the design of a framework for the semantic web-based system. The second stage builds the prototype for the framework using the Protégé tool. The third stage deals with natural language query conversion into the SPARQL query language using the Python-based QUEPY framework. The fourth stage involves firing the converted SPARQL queries at the ontology through Apache's Jena API to fetch the results. Lastly, evaluation of the prototype has been done in order to ensure its efficiency and usability. Thus, this research paper throws light on framework development for a semantic web-based system that assists in efficient retrieval of domain-specific information, natural language query interpretation into a semantic web language, creation of a domain-specific ontology and its mapping with related ontology. This research paper also provides approaches and metrics for ontology evaluation on the prototype ontology developed, to study the performance based on accessibility of required domain-related information.
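
    The record's pipeline (natural-language question → SPARQL → query fired at an ontology) can be sketched compactly. The sketch below is an assumption-laden illustration, not the IRSCSD code: it uses the Python-based QUEPY framework named in the abstract, but the app name and ontology file are invented, and rdflib stands in for Apache's Jena API (a Java library) so the example stays self-contained in Python.

```python
# Hedged sketch of stages 3-4 of the record: NL -> SPARQL with quepy, then
# firing the query at a local ontology. "csdomain" and "cs_ontology.owl" are
# hypothetical placeholders; rdflib substitutes for the Jena API used in the
# paper, so this illustrates the flow rather than the authors' implementation.
import quepy
import rdflib

cs_app = quepy.install("csdomain")  # hypothetical quepy app for the CS domain
target, sparql, metadata = cs_app.get_query("What is a binary search tree?")

graph = rdflib.Graph()
graph.parse("cs_ontology.owl", format="xml")  # hypothetical domain ontology

if sparql is not None:  # quepy returns None when no rule matches the question
    for row in graph.query(sparql):
        print(row)
```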

  6. Computational network design from functional specifications

    KAUST Repository

    Peng, Chi Han; Yang, Yong Liang; Bao, Fan; Fink, Daniel; Yan, Dongming; Wonka, Peter; Mitra, Niloy J.

    2016-01-01

    Connectivity and layout of underlying networks largely determine agent behavior and usage in many environments. For example, transportation networks determine the flow of traffic in a neighborhood, whereas building floorplans determine the flow of people in a workspace. Designing such networks from scratch is challenging as even local network changes can have large global effects. We investigate how to computationally create networks starting from only high-level functional specifications...

  7. Optoelectronic Computer Architecture Development for Image Reconstruction

    National Research Council Canada - National Science Library

    Forber, Richard

    1996-01-01

    Specifically, we collaborated with UCSD and ERIM on the development of an optically augmented electronic computer for high speed inverse transform calculations to enable real time image reconstruction...

  8. Specifics of computer discourse translation from English into Russian

    African Journals Online (AJOL)

    Specifics of computer discourse translation from English into Russian. ... of further development of science and technology in Russia and abroad and it inevitably ... The article may be useful for IT teachers when preparing teaching aids and ...

  9. Planning is not sufficient - Reliable computers need good requirements specifications

    International Nuclear Information System (INIS)

    Matras, J.R.

    1992-01-01

    Computer system reliability is the assurance that a computer system will perform its functions when required to do so. To ensure such reliability, it is important to plan the activities needed for computer system development. These development activities, in turn, require a Computer Quality Assurance Plan (CQAP) that provides the following: a Configuration Management Plan, a Verification and Validation (V and V) Plan, documentation requirements, a defined life cycle, review requirements, and organizational responsibilities. These items are necessary for system reliability; ultimately, however, they are not enough. Development of a reliable system is dependent on the requirements specification. This paper discusses how to use existing industry standards to develop a CQAP. In particular, the paper emphasizes the importance of the requirements specification and of methods for establishing reliability goals. The paper also describes how the revision of ANSI/IEEE-ANS-7-4.3.2, Application Criteria for Digital Computer Systems of Nuclear Power Generating Stations, has addressed these issues.

  10. Development of posture-specific computational phantoms using motion capture technology and application to radiation dose-reconstruction for the 1999 Tokai-Mura nuclear criticality accident

    International Nuclear Information System (INIS)

    Vazquez, Justin A; Caracappa, Peter F; Xu, X George

    2014-01-01

    The majority of existing computational phantoms are designed to represent workers in typical standing anatomical postures with fixed arm and leg positions. However, workers found in accident-related scenarios often assume varied postures. This paper describes the development and application of two phantoms with adjusted postures specified by data acquired from a motion capture system to simulate unique human postures found in a 1999 criticality accident that took place at a JCO facility in Tokai-Mura, Japan. In the course of this accident, two workers were fatally exposed to extremely high levels of radiation. Implementation of the emergent techniques discussed produced more accurate and more detailed dose estimates for the two workers than were reported in previous studies. Total-body doses of 6.43 and 26.38 Gy were estimated for the two workers, who assumed a crouching and a standing posture, respectively. Additionally, organ-specific dose estimates were determined, including a 7.93 Gy dose to the thyroid and a 6.11 Gy dose to the stomach for the crouching worker, and a 41.71 Gy dose to the liver and a 37.26 Gy dose to the stomach for the standing worker. Implications for the medical prognosis of the workers are discussed, and the results of this study were found to correlate better with the patient outcome than previous estimates, suggesting potential future applications of such methods for improved epidemiological studies involving next-generation computational phantom tools. (paper)

  11. Computational network design from functional specifications

    KAUST Repository

    Peng, Chi Han

    2016-07-11

    Connectivity and layout of underlying networks largely determine agent behavior and usage in many environments. For example, transportation networks determine the flow of traffic in a neighborhood, whereas building floorplans determine the flow of people in a workspace. Designing such networks from scratch is challenging as even local network changes can have large global effects. We investigate how to computationally create networks starting from only high-level functional specifications. Such specifications can be in the form of network density, travel time versus network length, traffic type, destination location, etc. We propose an integer programming-based approach that guarantees that the resultant networks are valid by fulfilling all the specified hard constraints and that they score favorably in terms of the objective function. We evaluate our algorithm in two different design settings, street layout and floorplans, to demonstrate that diverse networks can emerge purely from high-level functional specifications.
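
    The integer-programming core of this approach can be pictured on a toy instance. The sketch below is an illustration under stated assumptions, not the authors' formulation: it selects street segments on a four-node graph so that a source parcel stays connected to a destination, encoding connectivity as single-commodity flow and a length budget as the hard functional specification. The PuLP library, node names, lengths and budget are all choices of this example.

```python
# Toy network-design ILP in the spirit of the record (not the paper's actual
# model): choose which street segments to build so one unit of flow can travel
# from `source` to `sink`, minimizing built length under a length budget.
# Node names, lengths and the budget are invented; requires the pulp package.
import pulp

edges = {("A", "B"): 1.0, ("B", "D"): 1.0, ("A", "C"): 1.5, ("C", "D"): 0.8}
nodes = {n for e in edges for n in e}
arcs = [(u, v) for (u, v) in edges] + [(v, u) for (u, v) in edges]
source, sink, budget = "A", "D", 3.0

prob = pulp.LpProblem("network_design", pulp.LpMinimize)
build = {(u, v): pulp.LpVariable(f"build_{u}_{v}", cat="Binary") for (u, v) in edges}
flow = {(u, v): pulp.LpVariable(f"flow_{u}_{v}", lowBound=0) for (u, v) in arcs}

# Objective: minimize total built length.
prob += pulp.lpSum(length * build[e] for e, length in edges.items())

# Flow may only use built segments (one unit of capacity per segment).
for (u, v) in edges:
    prob += flow[(u, v)] + flow[(v, u)] <= build[(u, v)]

# Flow conservation forces source-sink connectivity (a hard constraint).
for n in nodes:
    balance = pulp.lpSum(flow[a] for a in arcs if a[0] == n) - pulp.lpSum(
        flow[a] for a in arcs if a[1] == n
    )
    prob += balance == (1 if n == source else -1 if n == sink else 0)

# Hard functional specification: total network length within budget.
prob += pulp.lpSum(length * build[e] for e, length in edges.items()) <= budget

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([e for e in edges if build[e].value() == 1])  # segments to build
```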

  12. Investigation of hemodynamics in the development of dissecting aneurysm within patient-specific dissecting aneurismal aortas using computational fluid dynamics (CFD) simulations.

    Science.gov (United States)

    Tse, Kwong Ming; Chiu, Peixuan; Lee, Heow Pueh; Ho, Pei

    2011-03-15

    Aortic dissecting aneurysm is one of the most catastrophic cardiovascular emergencies and carries high mortality. Clinical observations indicate that aneurysm development is likely to be related to the hemodynamic condition of the dissected aorta. In order to gain more insight into the formation and progression of dissecting aneurysm, hemodynamic parameters including flow pattern, velocity distribution, aortic wall pressure and shear stress, which are difficult to measure in vivo, are evaluated using numerical simulations. Pulsatile blood flow in patient-specific dissecting aneurismal aortas before and after the formation of lumenal aneurysm (pre-aneurysm and post-aneurysm) is investigated by computational fluid dynamics (CFD) simulations. Realistic time-dependent boundary conditions are prescribed at various arteries of the complete aorta models. This study suggests the helical development of the false lumen around the true lumen may be related to the helical nature of hemodynamic flow in the aorta. Narrowing of the aorta is responsible for the massive recirculation in the post-stenosis region in the lumenal aneurysm development. A high pressure difference of 0.21 kPa between the true and false lumens in the pre-aneurismal aorta suggests the possible lumenal aneurysm site in the descending aorta. It is also found that relatively high time-averaged wall shear stress (in the range of 4-8 kPa) may be associated with tear initiation and propagation. CFD modeling assists in medical planning by providing blood flow patterns, wall pressure and wall shear stress, and helps to explain various phenomena in the development of dissecting aneurysm.

  13. Development of an organ-specific insert phantom generated using a 3D printer for investigations of cardiac computed tomography protocols.

    Science.gov (United States)

    Abdullah, Kamarul A; McEntee, Mark F; Reed, Warren; Kench, Peter L

    2018-04-30

    An ideal organ-specific insert phantom should be able to simulate the anatomical features with appropriate appearances in the resultant computed tomography (CT) images. This study investigated a 3D printing technology to develop a novel and cost-effective cardiac insert phantom derived from volumetric CT image datasets of an anthropomorphic chest phantom. Cardiac insert volumes were segmented from CT image datasets derived from an anthropomorphic chest phantom, the Lungman N-01 (Kyoto Kagaku, Japan). These segmented datasets were converted to a virtual 3D isosurface of a heart-shaped shell, while two other removable inserts were included using a computer-aided design (CAD) software program. This newly designed cardiac insert phantom was then printed using a fused deposition modelling (FDM) process via a Creatbot DM Plus 3D printer. Several selected filling materials, such as contrast media, oil, water and jelly, were loaded into designated spaces in the 3D-printed phantom. The 3D-printed cardiac insert phantom was positioned within the anthropomorphic chest phantom and 30 repeated CT acquisitions were performed using a multi-detector scanner at 120-kVp tube potential. Attenuation (Hounsfield Unit, HU) values were measured and compared to the image datasets of a real patient and the Catphan® 500 phantom. The output of the 3D-printed cardiac insert phantom was a solid acrylic plastic material, which was strong, light in weight and cost-effective. HU values of the filling materials were comparable to the image datasets of the real patient and the Catphan® 500 phantom. A novel and cost-effective cardiac insert phantom for an anthropomorphic chest phantom was developed using volumetric CT image datasets with a 3D printer. Hence, this suggests that the printing methodology could be applied to generate other phantoms for CT imaging studies.

  14. Preliminary application of computer-assisted patient-specific acetabular navigational template for total hip arthroplasty in adult single development dysplasia of the hip.

    Science.gov (United States)

    Zhang, Yuan Z; Chen, Bin; Lu, Sheng; Yang, Yong; Zhao, Jian M; Liu, Rui; Li, Yan B; Pei, Guo X

    2011-12-01

    The considerable variation in anatomical abnormalities of hip joints associated with different types of developmental dysplasia of the hip (DDH) makes reconstruction in total hip arthroplasty (THA) difficult. It is desirable to create patient-specific designs for THA procedures. For cases of adult single DDH, an accuracy-improved method has been developed for acetabular cup prosthesis implantation in hip arthroplasty. From October 2007 to November 2008, 22 patients with single DDH (according to the Crowe standard, all dysplastic hips were classified as type I) were scanned with spiral CT pre-operatively. These patients, scheduled for THA, were randomly assigned to undergo either conventional THA (control group, n = 11) or navigation template implantation (NT group, n = 11). In the NT group, three-dimensional (3D) CT pelvis image data were transferred to a computer workstation and 3D models of the hip were reconstructed using the Mimics software. The 3D models were then processed with the Imageware software. In brief, a template that best fitted the location and shape of the acetabular cup was 'reversely' built from the 3D model, the rotation centre of the pathological hip was determined by mirroring that of the healthy side, and a guiding hole in the template was then designed. The navigational templates were manufactured using a rapid prototyping machine and used to guide acetabular component placement. Relative to the predetermined abduction angle of 45° and anteversion angle of 18°, after 1 year of follow-up the NT group showed significantly smaller deviations (1.6° ± 0.4°, 1.9° ± 1.1°) from the predetermined angles than the control group (5.8° ± 2.9°, 3.9° ± 2.5°) (P < 0.05). The template designs facilitated accurate placement of acetabular components in dysplasia of the acetabulum. The hip's centre of rotation in DDH could be established using computer-aided design, which provides a useful method for the accurate...

  15. A Computational Methodology to Overcome the Challenges Associated With the Search for Specific Enzyme Targets to Develop Drugs Against Leishmania major.

    Science.gov (United States)

    Catharina, Larissa; Lima, Carlyle Ribeiro; Franca, Alexander; Guimarães, Ana Carolina Ramos; Alves-Ferreira, Marcelo; Tuffery, Pierre; Derreumaux, Philippe; Carels, Nicolas

    2017-01-01

    We present an approach for detecting enzymes that are specific to Leishmania major compared with Homo sapiens and provide targets that may assist research in drug development. This approach is based on traditional techniques of sequence homology comparison by similarity search and Markov modeling; it integrates the characterization of enzymatic functionality, secondary and tertiary protein structures, protein domain architecture, and metabolic environment. From 67 enzymes represented by 42 enzymatic activities classified by AnEnPi (Analogous Enzymes Pipeline) as specific to L. major compared with H. sapiens, only 40 (23 Enzyme Commission [EC] numbers) could actually be considered strictly specific to L. major, and 27 enzymes (19 EC numbers) were disregarded for having ambiguous homologies or analogies with H. sapiens. Among the 40 strictly specific enzymes, we identified sterol 24-C-methyltransferase, pyruvate phosphate dikinase, trypanothione synthetase, and RNA-editing ligase as 4 enzymes essential for L. major that may serve as targets for drug development.

  16. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Byamukama, Abdul [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Jung, Haiyong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

    Radioactive materials are utilized in industries, agriculture and research, medical facilities and academic institutions for numerous purposes that are useful in the daily life of mankind. To effectively manage radioactive waste and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions, enforce compliance, and communicate effectively with stakeholders, among other tasks. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for classifying the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those for Australia and were found to correlate, given the differences in site parameters and consumption habits of residents in the two countries.

  17. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    International Nuclear Information System (INIS)

    Byamukama, Abdul; Jung, Haiyong

    2014-01-01

    Radioactive materials are utilized in industries, agriculture and research, medical facilities and academic institutions for numerous purposes that are useful in the daily life of mankind. To effectively manage radioactive waste and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions, enforce compliance, and communicate effectively with stakeholders, among other tasks. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for classifying the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those for Australia and were found to correlate, given the differences in site parameters and consumption habits of residents in the two countries.
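
    The derivation step can be pictured with a back-of-envelope calculation: once a code such as RESRAD converts a disposal scenario into a dose per unit activity concentration for each radionuclide, a class boundary follows by dividing the dose criterion by that factor. A minimal sketch, with invented numbers that are not from the Ugandan study or from RESRAD:

```python
# Illustrative only: turning a dose criterion into activity-concentration
# class boundaries. The scenario factors (mSv/y per Bq/g) are invented; in
# the study they would come from site-specific RESRAD runs.
DOSE_CRITERION = 0.3  # hypothetical annual dose criterion, mSv/y

dose_per_unit_conc = {  # mSv/y delivered per Bq/g in the disposal scenario
    "Cs-137": 2.5e-3,
    "Co-60": 8.0e-3,
    "Sr-90": 1.2e-3,
}

for nuclide, factor in dose_per_unit_conc.items():
    boundary = DOSE_CRITERION / factor  # Bq/g at which the criterion is reached
    print(f"{nuclide}: waste-class boundary ~ {boundary:.0f} Bq/g")
```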

  18. Computer-aided dispatching system design specification

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, M.G.

    1997-12-16

    This document defines the performance requirements for a graphic display dispatching system to support the Hanford Patrol Operations Center. This document reflects the as-built requirements for the system that was delivered by GTE Northwest, Inc. This system provided a commercial off-the-shelf computer-aided dispatching system and alarm monitoring system currently in operation at the Hanford Patrol Operations Center, Building 2721E. This system also provides alarm back-up capability for the Plutonium Finishing Plant (PFP).

  19. Computer-Aided dispatching system design specification

    International Nuclear Information System (INIS)

    Briggs, M.G.

    1996-01-01

    This document defines the performance requirements for a graphic display dispatching system to support Hanford Patrol emergency response. This system is defined as a commercial off-the-shelf computer dispatching system providing both text and graphical display information while interfacing with the diverse reporting systems within the Hanford Facility. This system also provides expansion capabilities to integrate Hanford Fire and the Occurrence Notification Center, and provides back-up capabilities for the Plutonium Finishing Plant (PFP).

  20. Computer-aided dispatching system design specification

    International Nuclear Information System (INIS)

    Briggs, M.G.

    1997-01-01

    This document defines the performance requirements for a graphic display dispatching system to support the Hanford Patrol Operations Center. This document reflects the as-built requirements for the system that was delivered by GTE Northwest, Inc. This system provided a commercial off-the-shelf computer-aided dispatching system and alarm monitoring system currently in operation at the Hanford Patrol Operations Center, Building 2721E. This system also provides alarm back-up capability for the Plutonium Finishing Plant (PFP).

  1. Development of (99m)Tc-labeled asymmetric urea derivatives that target prostate-specific membrane antigen for single-photon emission computed tomography imaging.

    Science.gov (United States)

    Kimura, Hiroyuki; Sampei, Sotaro; Matsuoka, Daiko; Harada, Naoya; Watanabe, Hiroyuki; Arimitsu, Kenji; Ono, Masahiro; Saji, Hideo

    2016-05-15

    Prostate-specific membrane antigen (PSMA) is expressed strongly in prostate cancers and is, therefore, an attractive diagnostic and radioimmunotherapeutic target. In contrast to previous reports of PSMA-targeting (99m)Tc-tricarbonyl complexes that are cationic or lack a charge, no anionic (99m)Tc-tricarbonyl complexes have been reported. Notably, the hydrophilicity conferred by both cationic and anionic charges leads to rapid hepatobiliary clearance, whereas an anionic charge might better enhance renal clearance relative to a cationic charge. Therefore, an improvement in rapid clearance would be expected with either charge, particularly an anionic one. In this study, we designed and synthesized a novel anionic (99m)Tc-tricarbonyl complex ([(99m)Tc]TMCE) and evaluated its use as a single-photon emission computed tomography (SPECT) imaging probe for PSMA detection. Direct synthesis of [(99m)Tc]TMCE from dimethyl iminodiacetate, which contains both the asymmetric urea and succinimidyl moiety important for PSMA binding, was performed using our microwave-assisted one-pot procedure. The chelate formation was successfully achieved even though the precursor included a complicated bioactive moiety. The radiochemical yield of [(99m)Tc]TMCE was 12-17%, with a radiochemical purity greater than 98% after HPLC purification. [(99m)Tc]TMCE showed high affinity in vitro, with high accumulation in LNCaP tumors and low hepatic retention in biodistribution and SPECT/CT studies. These findings warrant further evaluation of [(99m)Tc]TMCE as an imaging agent and support the benefit of this strategy for the design of other PSMA imaging probes.

  2. Computer-Aided dispatching system design specification

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, M.G.

    1996-09-27

    This document defines the performance requirements for a graphic display dispatching system to support Hanford Patrol emergency response. This document outlines the negotiated requirements as agreed to by GTE Northwest during technical contract discussions. The system is defined as a commercial off-the-shelf computer dispatching system providing both text and graphic display information while interfacing with the diverse alarm reporting systems within the Hanford Site. The system provided expansion capability to integrate Hanford Fire and the Occurrence Notification Center, and back-up capability for the Plutonium Finishing Plant (PFP).

  3. COMPUTATIONAL MODELS FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar

    2011-01-01

    Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...

  4. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. In the paper's perspective, the new categories of services introduced will slowly replace many types of computational resources currently used, and grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses of large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  5. New developments in the CREAM Computing Element

    International Nuclear Information System (INIS)

    Andreetto, Paolo; Bertocco, Sara; Dorigo, Alvise; Capannini, Fabio; Cecchi, Marco; Zangrando, Luigi

    2012-01-01

    The EU-funded project EMI aims at providing a unified, standardized, easy-to-install software for distributed computing infrastructures. CREAM is one of the middleware products in the EMI middleware distribution: it implements a Grid job management service which allows the submission, management and monitoring of computational jobs on local resource management systems. In this paper we discuss some new features being implemented in the CREAM Computing Element. The implementation of the EMI Execution Service (EMI-ES) specification (an agreement in the EMI consortium on interfaces and protocols to be used in order to enable computational job submission and management across technologies) is one of the new functions being implemented. New developments also focus on the High Availability (HA) area, to improve performance, scalability, availability and fault tolerance.

  6. Computer aided training system development

    International Nuclear Information System (INIS)

    Midkiff, G.N.

    1987-01-01

    The first three phases of Training System Development (TSD) -- job and task analysis, curriculum design, and training material development -- are time consuming and labor intensive. The use of personal computers with a combination of commercial and custom-designed software resulted in a significant reduction in the man-hours required to complete these phases for a Health Physics Technician Training Program at a nuclear power station. This paper reports that each step in the training program project involved the use of personal computers: job survey data were compiled with a statistical package; task analysis was performed with custom software designed to interface with a commercial database management program; Job Performance Measures (tests) were generated by a custom program from data in the task analysis database; and training materials were drafted, edited, and produced using commercial word processing software.

  7. Embedded Volttron specification - benchmarking small footprint compute device for Volttron

    Energy Technology Data Exchange (ETDEWEB)

    Sanyal, Jibonananda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Woodworth, Ken [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nutaro, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kuruganti, Teja [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-08-17

    An embedded system is a small footprint computing unit that typically serves a specific purpose closely associated with measurements and control of hardware devices. These units are designed for reasonable durability and operations in a wide range of operating conditions. Some embedded systems support real-time operations and can demonstrate high levels of reliability. Many have failsafe mechanisms built to handle graceful shutdown of the device in exception conditions. The available memory, processing power, and network connectivity of these devices are limited due to the nature of their specific-purpose design and intended application. Industry practice is to carefully design the software for the available hardware capability to suit desired deployment needs. Volttron is an open source agent development and deployment platform designed to enable researchers to interact with devices and appliances without having to write drivers themselves. Hosting Volttron on small footprint embeddable devices enables its demonstration for embedded use. This report details the steps required and the experience in setting up and running Volttron applications on three small footprint devices: the Intel Next Unit of Computing (NUC), the Raspberry Pi 2, and the BeagleBone Black. In addition, the report also details preliminary investigation of the execution performance of Volttron on these devices.

  8. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    Directory of Open Access Journals (Sweden)

    Pirouz Nourian

    2018-03-01

    Full Text Available This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There is currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential and applicability in urban planning and urban data analytics. This review is based not only on technical factors such as the capabilities of the programming languages but also on the ease of developing and sharing complex data processing workflows. The arena of web-based computing platforms is currently under rapid development and is too volatile to be predictable; therefore, in this article we focus on the specification of the requirements and potentials from an urban planning point of view rather than speculating about the fate of computing platforms or programming languages. The article presents a list of promising computing technologies, a technical specification of the essential data models and operators for geo-spatial data processing, and mathematical models for an ideal urban computing platform.
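
    As a concrete instance of the geo-spatial operators such a specification must cover, the sketch below shows a point-in-polygon test, one of the elementary predicates of urban data analytics. The shapely library and the coordinates are assumptions of this illustration; the article itself does not prescribe a toolkit.

```python
# Minimal example of an elementary geo-spatial operator (point-in-polygon)
# of the kind an urban computing platform must specify. shapely is used for
# illustration only; the record does not name a particular library.
from shapely.geometry import Point, Polygon

block = Polygon([(0, 0), (100, 0), (100, 60), (0, 60)])  # hypothetical city block
sensor = Point(42.0, 30.0)                               # hypothetical sensor site

print(block.contains(sensor))          # True: the sensor lies inside the block
print(block.area, block.centroid.wkt)  # derived quantities analytics rely on
```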

  9. One Head Start Classroom's Experience: Computers and Young Children's Development.

    Science.gov (United States)

    Fischer, Melissa Anne; Gillespie, Catherine Wilson

    2003-01-01

    Contends that early childhood educators need to understand how exposure to computers and constructive computer programs affects the development of children. Specifically examines: (1) research on children's technology experiences; (2) determining best practices; and (3) addressing educators' concerns about computers replacing other developmentally…

  10. Recent Developments in Computed Tomography

    International Nuclear Information System (INIS)

    Braunstein, D.; Dafni, E.; Levene, S.; Malamud, G.; Shapiro, O.; Shechter, G.; Zahavi, O.

    1999-01-01

    Computerized tomography has become, during the past few years, one of the most widely used apparatuses in X-ray diagnosis. Its clinical applications have penetrated various fields, such as operational guidance, cardiac imaging, computer-aided surgery, etc. The first second-generation CT scanners consisted of a rotate-rotate system with a detector array and an X-ray tube. These scanners were capable of acquiring individual single slices, the duration of each being several seconds. The slow scanning rate, and the then-limited computing power, restricted the application range of these scanners to relatively stable organs and short body coverage at given resolutions. Further drawbacks of these machines were weak X-ray sources and low-efficiency gas detectors. In the late 80s the first helical scanners were introduced by Siemens. Based on a continuous patient couch movement during gantry rotation, much faster scans could be obtained, significantly increasing the volume coverage in a given time. In 1992 the first dual-slice scanners, equipped with high-efficiency solid-state detectors, were introduced by Elscint. The acquisition of data simultaneously from two detector arrays doubled the efficiency of the scan. Faster computers and stronger X-ray sources further improved the performance, allowing for a new range of clinical applications. Yet the need for even faster machines and bigger volume coverage led to further R&D efforts by the leading CT manufacturers. In order to accomplish the most demanding clinical needs, innovative two-dimensional, four-row solid-state detector arrays were developed, together with faster rotating machines and bigger X-ray tubes, all demanding extremely accurate and robust mechanical constructions. In parallel, multi-processor custom computers were made in order to allow the on-line reconstruction of the growing amounts of raw data. Four-slice helical scanners, rotating at 0.5 sec per cycle, are being tested nowadays in several clinics all over the world. This talk...

  11. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    Science.gov (United States)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  12. Computational biomechanics for medicine fundamental science and patient-specific applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2014-01-01

    One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This latest installment comprises nine of the latest developments in both fundamental science and patient-specific applications, from researchers in Australia, New Zealand, USA, UK, France, Ireland, and China. Some of the interesting topics discussed are: cellular mechanics; tumor growth and modeling; medical image analysis; and both patient-specific fluid dynamics and solid mechanics simulations.

  13. Computational identification of strain-, species- and genus-specific proteins

    Directory of Open Access Journals (Sweden)

    Thiagarajan Rathi

    2005-11-01

    Full Text Available Background: The identification of unique proteins at different taxonomic levels has both scientific and practical value. Strain-, species- and genus-specific proteins can provide insight into the criteria that define an organism and its relationship with close relatives. Such proteins can also serve as taxon-specific diagnostic targets. Description: A pipeline using a combination of computational and manual analyses of BLAST results was developed to identify strain-, species-, and genus-specific proteins and to catalog the closest sequenced relative for each protein in a proteome. Proteins encoded by a given strain are preliminarily considered to be unique if BLAST, using a comprehensive protein database, fails to retrieve (with an e-value better than 0.001) any protein not encoded by the query strain, species or genus (for strain-, species- and genus-specific proteins, respectively), or if BLAST, using the best hit as the query (reverse BLAST), does not retrieve the initial query protein. Results are manually inspected for homology if the initial query is retrieved in the reverse BLAST but is not the best hit. Sequences unlikely to retrieve homologs using the default BLOSUM62 matrix (usually short sequences) are re-tested using the PAM30 matrix, thereby increasing the number of retrieved homologs and increasing the stringency of the search for unique proteins. Results: The above protocol was used to examine several food- and water-borne pathogens. We find that the reverse BLAST step filters out about 22% of proteins with homologs that would otherwise be considered unique at the genus and species levels. Analysis of the annotations of unique proteins reveals that many are remnants of prophage proteins, or may be involved in virulence. The data generated from this study can be accessed and further evaluated from the CUPID (Core and Unique Protein Identification) system web site (updated semi-annually) at http://pir.georgetown.edu/cupid. Conclusion: CUPID...
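
    The uniqueness rule in the Description reduces to a small decision procedure. Below is a minimal sketch of that logic, assuming the forward and reverse BLAST searches were run beforehand with tabular output (-outfmt 6); the parsing details are simplified, and the manual-inspection branch is noted in a comment rather than implemented.

```python
# Sketch of the reverse-BLAST uniqueness filter described above (simplified).
# Assumes BLAST tabular files where column 1 is the query, column 2 the
# subject and column 11 the e-value, already filtered so that forward hits
# exclude proteins encoded by the query taxon itself.
import csv

EVALUE_CUTOFF = 1e-3  # "e-value better than 0.001", as in the record

def best_hits(path):
    """Return {query: (best_subject, best_evalue)} from a -outfmt 6 file."""
    best = {}
    with open(path) as fh:
        for row in csv.reader(fh, delimiter="\t"):
            query, subject, evalue = row[0], row[1], float(row[10])
            if evalue <= EVALUE_CUTOFF and (
                query not in best or evalue < best[query][1]
            ):
                best[query] = (subject, evalue)
    return best

def is_preliminarily_unique(protein, forward, reverse):
    if protein not in forward:  # no out-of-taxon hit at all -> unique
        return True
    hit, _ = forward[protein]
    back = reverse.get(hit)
    # Simplification: only the best reverse hit is tracked here, so the
    # pipeline's "retrieved but not best hit" manual-inspection case is
    # folded into the unique branch below.
    return back is None or back[0] != protein
```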

  14. Cloud Computing and Agile Organization Development

    Directory of Open Access Journals (Sweden)

    Bogdan GHILIC-MICU

    2014-01-01

    Full Text Available In the 3rd millennium economy, defined by globalization and continuous reduction of natural resources, the economic organization becomes the main actor in the phenomenon of transformation and adaptation to new conditions. Even more, the economic environment, which is closely related to the social environment, undergoes complex metamorphoses, especially in the management area. In this dynamic and complex social and environmental context, the economic organization must possess the ability to adapt, becoming a flexible and agile answer to new market opportunities. Considering the spectacular evolution of information and communications technology, one of the solutions to ensure organization agility is cloud computing. Just like the development of any science requires adaptation to theories and instruments specific to other fields, a cloud computing paradigm for the agile organization must appeal to models from management, cybernetics, mathematics, structuralism and information theory (or information systems theory).

  15. The specification of Stampi, a message passing library for distributed parallel computing

    International Nuclear Information System (INIS)

    Imamura, Toshiyuki; Takemiya, Hiroshi; Koide, Hiroshi

    2000-03-01

    At CCSE, the Center for Promotion of Computational Science and Engineering, a new message passing library for heterogeneous and distributed parallel computing has been developed, called Stampi. Stampi enables communication between any combination of parallel computers as well as workstations. Currently, a Stampi system is constructed from the Stampi library and Stampi/Java. It provides functions to connect a Stampi application not only with applications on COMPACS, the COMplex Parallel Computer System, but also with applets which work in WWW browsers. This report summarizes the specifications of Stampi and details the development of its system. (author)
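
    Stampi follows MPI-style message passing semantics across machines. As a rough illustration of the programming model such a library exposes, the sketch below uses mpi4py as a stand-in, since the record does not show Stampi's own bindings; the ranks, tag and payload are invented.

```python
# Minimal MPI-style message passing, with mpi4py standing in for Stampi's
# API (which the record does not document). Run with, e.g.:
#   mpiexec -n 2 python demo.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    comm.send({"payload": 42}, dest=1, tag=11)  # rank 0 sends a message
elif rank == 1:
    data = comm.recv(source=0, tag=11)          # rank 1 receives it
    print("rank 1 received:", data)
```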

  16. Computer Support of Semantic Text Analysis of a Technical Specification on Designing Software

    OpenAIRE

    Zaboleeva-Zotova, Alla; Orlova, Yulia

    2009-01-01

    The given work is devoted to the development of a computer-aided system for semantic text analysis of a technical specification. The purpose of this work is to increase the efficiency of software engineering based on automation of semantic text analysis of a technical specification. In this work, a technique for the text analysis of a technical specification is proposed and investigated, together with an expanded fuzzy attribute grammar of a technical specification, intended for formaliza...

  17. Computer Graphics for Multimedia and Hypermedia Development.

    Science.gov (United States)

    Mohler, James L.

    1998-01-01

    Discusses several theoretical and technical aspects of computer-graphics development that are useful for creating hypermedia and multimedia materials. Topics addressed include primary bitmap attributes in computer graphics, the jigsaw principle, and raster layering. (MSE)

  18. Modeling of requirement specification for safety critical real time computer system using formal mathematical specifications

    International Nuclear Information System (INIS)

    Sankar, Bindu; Sasidhar Rao, B.; Ilango Sambasivam, S.; Swaminathan, P.

    2002-01-01

    Full text: Real time computer systems are increasingly used for safety critical supervision and control of nuclear reactors. Typical application areas are supervision of the reactor core against coolant flow blockage, supervision of clad hot spots, supervision of undesirable power excursions, power control and control logic for fuel handling systems. The most frequent cause of faults in safety critical real time computer systems is traced to fuzziness in the requirement specification. To ensure the specified safety, it is necessary to model the requirement specification of safety critical real time computer systems using formal mathematical methods. Modeling eliminates the fuzziness in the requirement specification and also helps to prepare the verification and validation schemes. Test data can be easily designed from the model of the requirement specification. Z and B are popular languages used for modeling the requirement specification. A typical safety critical real time computer system for supervising the reactor core of the prototype fast breeder reactor (PFBR) against flow blockage is taken as a case study. Modeling techniques and the actual model are explained in detail. The advantages of modeling for ensuring safety are summarized.
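
    To make the idea concrete, a requirement such as "demand a trip whenever core flow falls below its minimum" can be written as a small Z-style schema. The fragment below is a hypothetical illustration in the spirit of the record, not the PFBR model; the names and threshold variable are invented, and the fuzz/zed LaTeX conventions are assumed.

```latex
% Hypothetical Z-style fragment (illustrative only; not the PFBR model).
% Assumes a Z package such as fuzz or zed-csp for the schema environment.
\begin{zed}
  DEMAND ::= trip | normal
\end{zed}

\begin{schema}{FlowSupervision}
  flow, flowMin : \nat \\
  demand : DEMAND
\where
  demand = trip \iff flow < flowMin
\end{schema}
```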

  19. Dimensionally Specific Capture of Attention: Implications for Saliency Computation

    Directory of Open Access Journals (Sweden)

    Katherine E. Burnett

    2018-02-01

    Full Text Available Observers automatically orient to a sudden change in the environment. This is demonstrated experimentally using exogenous cues, which prioritize the analysis of subsequent targets appearing nearby. This effect has been attributed to the computation of saliency, obtained by combining feature-specific signals, which then feed back to drive attention to the salient location. An alternative possibility is that cueing directly affects target-evoked sensory responses in a feed-forward manner. We examined the effects of luminance and equiluminant color cues in a dual-task paradigm, which required both a motion and a color discrimination. Equiluminant color cues improved color discrimination more than luminance cues, but luminance cues improved motion discrimination more than equiluminant color cues. This suggests that the effects of exogenous cues are dimensionally specific and may not depend entirely on the computation of a dimension-general saliency signal.
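
    The dimension-general saliency account that the experiment probes can be stated in a few lines: feature-specific maps are normalized and summed into one master map whose peak is the predicted locus of attention. The sketch below is a didactic toy with random maps and an invented normalization, not the authors' model.

```python
# Toy dimension-general saliency computation: normalize feature-specific
# maps (e.g. color and luminance contrast) and sum them into a master map.
# The maps here are random stand-ins, purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
color_map = rng.random((8, 8))      # hypothetical color-contrast signal
luminance_map = rng.random((8, 8))  # hypothetical luminance-contrast signal

def normalize(m):
    return (m - m.min()) / (m.max() - m.min() + 1e-12)

saliency = normalize(color_map) + normalize(luminance_map)
attended = np.unravel_index(np.argmax(saliency), saliency.shape)
print("most salient location:", attended)
```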

  20. International Developments in Computer Science.

    Science.gov (United States)

    1982-06-01

    ...background on China's scientific research and on their computer science before 1978. A useful companion to the directory is another publication of the... bimonthly publication in Portuguese; occasional translation of foreign articles into Portuguese. Data News: A bimonthly industry newsletter. Sistemas: ...computer-related topics; Spanish. Delta: Publication of local users group; Spanish. Sistemas: Publication of System Engineers of Colombia; Spanish. CUBA

  1. A Domain-Specific Programming Language for Secure Multiparty Computation

    DEFF Research Database (Denmark)

    Nielsen, Janus Dam; Schwartzbach, Michael Ignatieff

    2007-01-01

    We present a domain-specific programming language for Secure Multiparty Computation (SMC). Information is a resource of vital importance and considerable economic value to individuals, public administration, and private companies. This means that the confidentiality of information is crucial... Computations are performed on secret values, and results are only revealed according to specific protocols. We identify the key linguistic concepts of SMC and bridge the gap between high-level security requirements and low-level cryptographic operations constituting an SMC platform, thus improving the efficiency and security of SMC...
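
    One of the low-level cryptographic operations such a platform is built on is additive secret sharing. The toy below illustrates that primitive only (the modulus and the three-party setting are assumptions of this sketch); it is not the DSL described in the record.

```python
# Didactic toy: additive secret sharing of integers among three parties, with
# a sum computed without any party seeing the inputs. Not the record's DSL.
import secrets

P = 2**61 - 1  # public prime modulus (an assumption of this sketch)

def share(x, n=3):
    """Split x into n additive shares mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

a_shares, b_shares = share(12), share(30)
# Each party adds its local shares; only the final sum is ever revealed.
sum_shares = [(sa + sb) % P for sa, sb in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 42
```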

  2. Cloud computing development in Armenia

    Directory of Open Access Journals (Sweden)

    Vazgen Ghazaryan

    2014-10-01

    Full Text Available Purpose – The purpose of the research is to clarify the benefits and risks, with regard to data protection and cost, that a business can gain from the use of these new technologies for the implementation and management of an organization's information systems. Design/methodology/approach – Qualitative case study of the results obtained via interviews. Three research questions were raised: Q1: How can a company benefit from using Cloud Computing compared to other solutions? Q2: What are the possible issues that occur with Cloud Computing? Q3: How would Cloud Computing change an organization's IT infrastructure? Findings – The calculations provided in the interview section prove the financial advantages, even though the precise degree of flexibility and performance has not been assessed. Cloud Computing offers great scalability. Another benefit that Cloud Computing offers, in addition to better performance and flexibility, is reliable and simple backup data storage, physically distributed and so almost invulnerable to damage. Although the advantages of Cloud Computing more than compensate for the difficulties associated with it, the latter must be carefully considered. Since the cloud architecture is relatively new, so far the best guarantee against all the risks it entails, from a single company's perspective, is a well-formulated service-level agreement, where the terms of service and the shared responsibility and security roles between the client and the provider are defined. Research limitations/implications – The study was carried out on the basis of two companies, which gives a deeper view, but for more widely applicable results a wider analysis is necessary. Originality/value – The novelty of the research lies in the fact that existing approaches to this problem mainly focus on the technical side of computing. Research type: case study

  3. User Interface Technology for Formal Specification Development

    Science.gov (United States)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.

  4. Computing in Qualitative Analysis: A Healthy Development?

    Science.gov (United States)

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software package. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  5. Developing a Distributed Computing Architecture at Arizona State University.

    Science.gov (United States)

    Armann, Neil; And Others

    1994-01-01

    Development of Arizona State University's computing architecture, designed to ensure that all new distributed computing pieces will work together, is described. Aspects discussed include the business rationale, the general architectural approach, characteristics and objectives of the architecture, specific services, and impact on the university…

  6. Computational learning on specificity-determining residue-nucleotide interactions

    KAUST Repository

    Wong, Ka-Chun; Li, Yue; Peng, Chengbin; Moses, Alan M.; Zhang, Zhaolei

    2015-01-01

    The protein–DNA interactions between transcription factors and transcription factor binding sites are essential activities in gene regulation. To decipher the binding codes, it is a long-standing challenge to understand the binding mechanism across different transcription factor DNA binding families. Past computational learning studies usually focus on learning and predicting the DNA binding residues on protein side. Taking into account both sides (protein and DNA), we propose and describe a computational study for learning the specificity-determining residue-nucleotide interactions of different known DNA-binding domain families. The proposed learning models are compared to state-of-the-art models comprehensively, demonstrating its competitive learning performance. In addition, we describe and propose two applications which demonstrate how the learnt models can provide meaningful insights into protein–DNA interactions across different DNA binding families.

  7. Computational learning on specificity-determining residue-nucleotide interactions

    KAUST Repository

    Wong, Ka-Chun

    2015-11-02

    The protein–DNA interactions between transcription factors and transcription factor binding sites are essential activities in gene regulation. To decipher the binding codes, it is a long-standing challenge to understand the binding mechanism across different transcription factor DNA binding families. Past computational learning studies usually focus on learning and predicting the DNA binding residues on protein side. Taking into account both sides (protein and DNA), we propose and describe a computational study for learning the specificity-determining residue-nucleotide interactions of different known DNA-binding domain families. The proposed learning models are compared to state-of-the-art models comprehensively, demonstrating its competitive learning performance. In addition, we describe and propose two applications which demonstrate how the learnt models can provide meaningful insights into protein–DNA interactions across different DNA binding families.

  8. Computer-aided System of Semantic Text Analysis of a Technical Specification

    OpenAIRE

    Zaboleeva-Zotova, Alla; Orlova, Yulia

    2008-01-01

    This work is devoted to the development of a computer-aided system for semantic text analysis of a technical specification. Its purpose is to increase the efficiency of software engineering through automation of the semantic text analysis of a technical specification. The work proposes and investigates a model for analyzing the text of a technical project and an attribute grammar of a technical specification, intended for the formalization of limited Ru...

  9. SPECIFICITY IN DEVELOPMENT OF CONSTRUCTION INDUSTRY

    Directory of Open Access Journals (Sweden)

    O. S. Golubova

    2012-01-01

    Specificity in the development of the construction industry of the Republic of Belarus determines the character of competition on the construction market and shapes the pricing, marketing and product policy of building companies. Construction is a highly developed complex in which the interaction of business entities is of a rather complicated, multilateral character.

  10. Accounting valuation development of specific assets

    Directory of Open Access Journals (Sweden)

    I.V. Zhigley

    2017-12-01

    The current issues of accounting estimate development are considered. The necessity of developing accounting estimates in the context of non-institutional theory principles is grounded on the basis of a number of reasons. The reasons for the deterioration of accounting's reputation as a separate socio-economic institute in the context of developing a methodology for specific assets accounting are identified. The system of normative regulation of the accounting estimate of enterprise non-current assets in the case of diminishing their usefulness is analyzed. The procedure for determining and accounting for the depreciation of assets in accordance with IFRS 36 «Depreciation of Assets» is developed. The features of the joint use of the concepts of «value in use» and «fair value» in the accounting system are disclosed. The procedure for determining the value of compensation depending on the degree of specificity of assets is developed. The necessity to clarify the features that indicate a possible diminishing of the usefulness of specific assets (termination or pre-term termination of the contract for the use of a specific asset) is grounded.

  11. Computer-aided software development

    International Nuclear Information System (INIS)

    Teichroew, D.; Hershey, E.A. III; Yamamoto, Y.

    1978-01-01

    In recent years, as the hardware cost/capability ratio has continued to decrease and as much of the routine data processing has been computerized, the emphasis in software development has shifted from just getting systems operational to the maintenance of existing systems, reduction of duplication by integration, selective addition of new applications, systems that are more usable, maintainable, portable and reliable and to improving the productivity of software developers. This paper examines a number of trends that are changing the methods by which software is being produced and used. (Auth.)

  12. Reactor safety computer code development at INEL

    International Nuclear Information System (INIS)

    Johnsen, G.W.

    1985-01-01

    This report provides a brief overview of the computer code development programs being conducted at EG and G Idaho, Inc. on behalf of US Nuclear Regulatory Commission and the Department of Energy, Idaho Operations Office. Included are descriptions of the codes being developed, their development status as of the date of this report, and resident code development expertise

  13. Southampton uni's computer whizzes develop "mini" grid

    CERN Multimedia

    Sherriff, Lucy

    2006-01-01

    "In a bid to help its students explore the potential of grid computing, the University of Southampton's Computer Science department has developed what it calls a "lightweight grid". The system has been designed to allow students to experiment with grid technology without the complexity of inherent security concerns of the real thing. (1 page)

  14. Dopamine Receptor-Specific Contributions to the Computation of Value.

    Science.gov (United States)

    Burke, Christopher J; Soutschek, Alexander; Weber, Susanna; Raja Beharelle, Anjali; Fehr, Ernst; Haker, Helene; Tobler, Philippe N

    2018-05-01

    Dopamine is thought to play a crucial role in value-based decision making. However, the specific contributions of different dopamine receptor subtypes to the computation of subjective value remain unknown. Here we demonstrate how the balance between D1 and D2 dopamine receptor subtypes shapes subjective value computation during risky decision making. We administered the D2 receptor antagonist amisulpride or placebo before participants made choices between risky options. Compared with placebo, D2 receptor blockade resulted in more frequent choice of higher risk and higher expected value options. Using a novel model fitting procedure, we concurrently estimated the three parameters that define individual risk attitude according to an influential theoretical account of risky decision making (prospect theory). This analysis revealed that the observed reduction in risk aversion under amisulpride was driven by increased sensitivity to reward magnitude and decreased distortion of outcome probability, resulting in more linear value coding. Our data suggest that different components that govern individual risk attitude are under dopaminergic control, such that D2 receptor blockade facilitates risk taking and expected value processing.
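
    The three-parameter prospect-theory valuation the analysis rests on can be made concrete with a short worked example. The sketch below uses one common parameterization (a power value function and Tversky-Kahneman probability weighting); the paper's exact functional forms and fitted parameter values may differ.

        # A minimal sketch of prospect-theory valuation for a two-outcome gamble,
        # using one common parameterization. The paper's exact functional forms
        # and fitted parameters may differ from these textbook defaults.
        def value(x, alpha=0.88):
            """Subjective value of a gain x; alpha < 1 gives diminishing sensitivity."""
            return x ** alpha

        def weight(p, gamma=0.61):
            """Tversky-Kahneman probability weighting; gamma = 1 is linear (no distortion)."""
            return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

        def prospect_value(outcomes, probs, alpha=0.88, gamma=0.61):
            return sum(weight(p, gamma) * value(x, alpha) for x, p in zip(outcomes, probs))

        # D2 blockade as described would push alpha and gamma toward 1 (more linear coding):
        print(prospect_value([100, 0], [0.5, 0.5], alpha=0.88, gamma=0.61))
        print(prospect_value([100, 0], [0.5, 0.5], alpha=1.0, gamma=1.0))  # equals the expected value, 50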

  15. A climatological model for risk computations incorporating site- specific dry deposition influences

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.

    1991-07-01

    A gradient-flux dry deposition module was developed for use in a climatological atmospheric transport model, the Multimedia Environmental Pollutant Assessment System (MEPAS). The atmospheric pathway model computes long-term average contaminant air concentration and surface deposition patterns surrounding a potential release site incorporating location-specific dry deposition influences. Gradient-flux formulations are used to incorporate site and regional data in the dry deposition module for this atmospheric sector-average climatological model. Application of these formulations provide an effective means of accounting for local surface roughness in deposition computations. Linkage to a risk computation module resulted in a need for separate regional and specific surface deposition computations. 13 refs., 4 figs., 2 tabs
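
    Gradient-flux dry deposition is commonly expressed through a resistance analogy, which is one way site-specific surface roughness can enter the computation. The sketch below is a generic illustration of that formulation, not the actual MEPAS module: deposition velocity is the inverse of the sum of aerodynamic, boundary-layer, and surface resistances, with the roughness length entering the aerodynamic term.

        # Illustrative resistance-analogy form of a gradient-flux dry deposition
        # velocity; not the actual MEPAS implementation. Surface roughness z0
        # enters through the aerodynamic resistance, which is how local roughness
        # can be accounted for in the deposition computation.
        import math

        KAPPA = 0.4  # von Karman constant

        def aerodynamic_resistance(z_ref, z0, u_star):
            """Neutral-stability aerodynamic resistance between z_ref and the surface (s/m)."""
            return math.log(z_ref / z0) / (KAPPA * u_star)

        def deposition_velocity(r_a, r_b, r_c):
            """Deposition velocity (m/s) from aerodynamic, boundary-layer and surface resistances."""
            return 1.0 / (r_a + r_b + r_c)

        r_a = aerodynamic_resistance(z_ref=10.0, z0=0.1, u_star=0.3)  # rougher surface -> smaller r_a
        v_d = deposition_velocity(r_a, r_b=100.0, r_c=50.0)
        print(f"v_d = {v_d:.4f} m/s")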

  16. Implementing and developing cloud computing applications

    CERN Document Server

    Sarna, David E Y

    2010-01-01

    From small start-ups to major corporations, companies of all sizes have embraced cloud computing for the scalability, reliability, and cost benefits it can provide. It has even been said that cloud computing may have a greater effect on our lives than the PC and dot-com revolutions combined.Filled with comparative charts and decision trees, Implementing and Developing Cloud Computing Applications explains exactly what it takes to build robust and highly scalable cloud computing applications in any organization. Covering the major commercial offerings available, it provides authoritative guidan

  17. [Development of domain specific search engines].

    Science.gov (United States)

    Takai, T; Tokunaga, M; Maeda, K; Kaminuma, T

    2000-01-01

    As cyberspace explodes at a pace that nobody ever imagined, it becomes very important to search it efficiently and effectively. One solution to this problem is search engines. A lot of commercial search engines have already been put on the market. However, these search engines respond with results so cumbersome that domain experts cannot tolerate them. Using dedicated hardware and commercial software called OpenText, we have developed several domain-specific search engines. These engines cover our institute's Web contents, drugs, chemical safety, endocrine disruptors, and emergency response to chemical hazards. The engines have been on our Web site for testing.

  18. Computed Tomography Technology: Development and Applications for Defence

    International Nuclear Information System (INIS)

    Baheti, G. L.; Saxena, Nisheet; Tripathi, D. K.; Songara, K. C.; Meghwal, L. R.; Meena, V. L.

    2008-01-01

    Computed Tomography (CT) has revolutionized the field of Non-Destructive Testing and Evaluation (NDT and E). Tomography for industrial applications warrants the design and development of customized solutions catering to specific visualization requirements. The present paper highlights the tomography technology solutions implemented at the Defence Laboratory, Jodhpur (DLJ). Details of the technological developments carried out and their utilization for various defence applications are covered.

  19. Development of emission computed tomography in Japan

    International Nuclear Information System (INIS)

    Tanaka, E.

    1984-01-01

    Two positron emission computed tomography (PCT) devices developed in Japan are described: one for the head and the other for the whole body. The devices produce fairly quantitative images with slight modifications of the existing algorithms, because they were developed based on filtered back-projection. The PCT device seems to be better than single photon emission computed tomography (SPECT), since it provides adequate compensation for photon attenuation in patients. (M.A.C.) [pt

  20. Essential Means for Urban Computing : Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    NARCIS (Netherlands)

    Nourian, P.; Martinez-Ortiz, Carlos; Arroyo Ohori, G.A.K.

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages,

  1. Development of industrial variant specification systems

    DEFF Research Database (Denmark)

    Hansen, Benjamin Loer

    be developed from a holistic and strategically anchored point of view. Another assumption is that this is a challenge for many industrial companies. Even though the literature presents many considerations on general issues covering new information technology, little work is found on the business perspectives...... are discussed. A list of structural variables and solution components has been created. These are related to four design aspects in the holistic system design covering the aspects of process design, selection of resources (such as hardware, software and humans), the design of information structures...... solution elements and structural variables to be used in the design of variant specification systems. The thesis presents a “top-down” procedure to be used to develop variant specification systems from a strategically anchored and holistic point of view. A methodology and related task variables...

  2. Modeling the Development of Goal-Specificity in Mirror Neurons.

    Science.gov (United States)

    Thill, Serge; Svensson, Henrik; Ziemke, Tom

    2011-12-01

    Neurophysiological studies have shown that parietal mirror neurons encode not only actions but also the goal of these actions. Although some mirror neurons will fire whenever a certain action is perceived (goal-independently), most will only fire if the motion is perceived as part of an action with a specific goal. This result is important for the action-understanding hypothesis as it provides a potential neurological basis for such a cognitive ability. It is also relevant for the design of artificial cognitive systems, in particular robotic systems that rely on computational models of the mirror system in their interaction with other agents. Yet, to date, no computational model has explicitly addressed the mechanisms that give rise to both goal-specific and goal-independent parietal mirror neurons. In the present paper, we present a computational model based on a self-organizing map, which receives artificial inputs representing information about both the observed or executed actions and the context in which they were executed. We show that the map develops a biologically plausible organization in which goal-specific mirror neurons emerge. We further show that the fundamental cause for both the appearance and the number of goal-specific neurons can be found in geometric relationships between the different inputs to the map. The results are important to the action-understanding hypothesis as they provide a mechanism for the emergence of goal-specific parietal mirror neurons and lead to a number of predictions: (1) Learning of new goals may mostly reassign existing goal-specific neurons rather than recruit new ones; (2) input differences between executed and observed actions can explain observed corresponding differences in the number of goal-specific neurons; and (3) the percentage of goal-specific neurons may differ between motion primitives.
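
    The mechanism described, a self-organizing map receiving concatenated action and context inputs, can be sketched minimally as follows. The map size, input coding, and training rule here are illustrative stand-ins for the authors' model (the neighborhood update of a full SOM is omitted for brevity). A unit that wins for only one action-context pairing behaves like a "goal-specific" neuron.

        # A minimal self-organizing map trained on concatenated action + context
        # vectors, in the spirit of the model described above; the authors' actual
        # inputs and parameters differ, and the SOM neighborhood update is omitted.
        import numpy as np

        rng = np.random.default_rng(0)
        n_units, dim = 25, 6          # 5x5 map; 3-d action + 3-d context input
        W = rng.random((n_units, dim))

        def train(patterns, epochs=200, lr=0.2):
            for _ in range(epochs):
                for x in patterns:
                    winner = np.argmin(np.linalg.norm(W - x, axis=1))
                    W[winner] += lr * (x - W[winner])  # move only the winning unit

        # Same grasping action in two contexts (e.g., grasp-to-eat vs grasp-to-place).
        action = np.array([1.0, 0.2, 0.0])
        ctx_eat, ctx_place = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
        patterns = [np.concatenate([action, ctx_eat]), np.concatenate([action, ctx_place])]
        train(patterns)
        for x in patterns:
            print("winning unit:", np.argmin(np.linalg.norm(W - x, axis=1)))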

  3. Development of Specifications for Radioactive Waste Packages

    International Nuclear Information System (INIS)

    2006-10-01

    The main objective of this publication is to provide guidelines for the development of waste package specifications that comply with waste acceptance requirements for storage and disposal of radioactive waste. It will assist waste generators and waste package producers in selecting the most significant parameters and in developing and implementing specifications for each individual type of waste and waste package. This publication also identifies and reviews the activities and technical provisions that are necessary to meet safety requirements; in particular, selection of the significant safety parameters and preparation of specifications for waste forms, waste containers and waste packages using proven approaches, methods and technologies. This report provides guidance using a systematic, stepwise approach, integrating the technical, organizational and administrative factors that need to be considered at each step of planning and implementing waste package design, fabrication, approval, quality assurance and control. The report reflects the considerable experience and knowledge that has been accumulated in the IAEA Member States and is consistent with the current international requirements, principles, standards and guidance for the safe management of radioactive waste

  4. Development of Specifications for Radioactive Waste Packages

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-10-15

    The main objective of this publication is to provide guidelines for the development of waste package specifications that comply with waste acceptance requirements for storage and disposal of radioactive waste. It will assist waste generators and waste package producers in selecting the most significant parameters and in developing and implementing specifications for each individual type of waste and waste package. This publication also identifies and reviews the activities and technical provisions that are necessary to meet safety requirements; in particular, selection of the significant safety parameters and preparation of specifications for waste forms, waste containers and waste packages using proven approaches, methods and technologies. This report provides guidance using a systematic, stepwise approach, integrating the technical, organizational and administrative factors that need to be considered at each step of planning and implementing waste package design, fabrication, approval, quality assurance and control. The report reflects the considerable experience and knowledge that has been accumulated in the IAEA Member States and is consistent with the current international requirements, principles, standards and guidance for the safe management of radioactive waste.

  5. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  6. Specifications development for "Karbatril" codenamed tablets

    Directory of Open Access Journals (Sweden)

    L. I. Kucherenko

    2017-08-01

    Introduction. According to the current legislation of Ukraine, the specifications of tablets include the following indicators: description, identification, average weight, disintegration and assay. The aim of the study. The development of specifications and a draft of quality control methods for "Karbatril" codenamed tablets. Materials and methods. During the study we analyzed 6 series of "Karbatril" tablets. For the description, identification, determination of the average weight, disintegration and quantification of active ingredients of "Karbatril" codenamed tablets, we used appropriate methods and instruments. Results and discussion. "Karbatril" tablets were analyzed for the following parameters: description: tablets white or nearly white; average weight: the average weight of the 6 series of obtained tablets ranged from 339.0 mg to 369.9 mg, against 337.0 mg to 373.0 mg according to the SPU; disintegration: according to the SPU, the disintegration time for tablets without a shell shall not exceed 15 min, and the analyzed tablets disintegrated within 5 to 10 minutes. Identification and quantification of the active ingredients of the tablets were conducted using modified HPLC methods. The chromatograms obtained during identification show compliance with the SPU. In the quantitative determination of the active ingredients in "Karbatril" codenamed tablets we found carbamazepine from 148.18 mg to 150.19 mg and thiotriazoline from 98.93 mg to 99.71 mg. These data are consistent with the SPU, which regulates the content of carbamazepine as 150 mg ± 7.5% and thiotriazoline as 100 mg ± 10%. Conclusions. This study developed a specification for "Karbatril" codenamed tablets, as well as HPLC methods for the qualitative and quantitative determination of the active ingredients. The specification includes the following parameters: description, identification, average weight, disintegration and assay. The study drafted quality control methods which are planned to be later offered to the

  7. EUV mask process specifics and development challenges

    Science.gov (United States)

    Nesladek, Pavel

    2014-07-01

    EUV lithography is currently the favorite and most promising candidate among the next generation lithography (NGL) technologies. A decade ago, NGL was expected to be used for the 45 nm technology node. With the introduction of immersion 193 nm lithography, double/triple patterning and further techniques, the capabilities of 193 nm lithography have been greatly improved, so it is expected to be used successfully, depending on the business decisions of the end user, down to 10 nm logic. Subsequent technology nodes will require EUV or an alternative technology such as DSA. Manufacturing, and especially process development, for EUV technology requires a significant number of unique processes, in several cases performed on dedicated tools. Currently, several of these tools, e.g. the EUV AIMS or an actinic reflectometer, are not yet available on site. Process development is therefore done using external services and tools, with an impact on the single unit process development timeline and uncertainty in the estimation of process performance. Compromises in process development, caused by assumptions about similarities between optical and EUV masks made in experiment planning and by the omission of tests, are further sources of challenges for unit process development. Increased defect risk and uncertainty in process qualification are just two examples that can impact mask quality and process development. The aim of this paper is to identify critical aspects of EUV mask manufacturing with respect to defects on the mask, with a focus on mask cleaning and defect repair, and to discuss the impact of EUV-specific requirements on the experiments needed.

  8. Formal Specification and Analysis of Cloud Computing Management

    Science.gov (United States)

    2012-01-24

    Cloud Computing in a Nutshell. We begin this introduction to Cloud Computing with a famous quote by Larry Ellison: "The interesting thing about ... the wording of some of our ads." — Larry Ellison, Oracle CEO [106]. In view of this statement, we summarize the essential aspects of Cloud Computing ...

  9. Improving developer productivity with C++ embedded domain specific languages

    Science.gov (United States)

    Kozacik, Stephen; Chao, Evenie; Paolini, Aaron; Bonnett, James; Kelmelis, Eric

    2017-05-01

    Domain-specific languages are a useful tool for productivity allowing domain experts to program using familiar concepts and vocabulary while benefiting from performance choices made by computing experts. Embedding the domain specific language into an existing language allows easy interoperability with non-domain-specific code and use of standard compilers and build systems. In C++, this is enabled through the template and preprocessor features. C++ embedded domain specific languages (EDSLs) allow the user to write simple, safe, performant, domain specific code that has access to all the low-level functionality that C and C++ offer as well as the diverse set of libraries available in the C/C++ ecosystem. In this paper, we will discuss several tools available for building EDSLs in C++ and show examples of projects successfully leveraging EDSLs. Modern C++ has added many useful new features to the language which we have leveraged to further extend the capability of EDSLs. At EM Photonics, we have used EDSLs to allow developers to transparently benefit from using high performance computing (HPC) hardware. We will show ways EDSLs combine with existing technologies and EM Photonics high performance tools and libraries to produce clean, short, high performance code in ways that were not previously possible.
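
    The embedding idea can be illustrated compactly with operator overloading: expressions build a tree instead of evaluating eagerly, so a backend can later optimize or offload them. The sketch below shows the concept in Python for brevity; the paper achieves the same effect in C++ via templates and the preprocessor, and all names here are illustrative.

        # Concept sketch of an embedded DSL via operator overloading, in Python
        # for brevity (the paper does this in C++ with templates). Expressions
        # build a tree instead of evaluating eagerly, so a backend could optimize
        # or offload them later. All names here are illustrative.
        class Expr:
            def __add__(self, other): return Op("+", self, other)
            def __mul__(self, other): return Op("*", self, other)

        class Var(Expr):
            def __init__(self, name): self.name = name
            def evaluate(self, env): return env[self.name]

        class Op(Expr):
            def __init__(self, op, left, right): self.op, self.left, self.right = op, left, right
            def evaluate(self, env):
                a, b = self.left.evaluate(env), self.right.evaluate(env)
                return a + b if self.op == "+" else a * b

        # Domain code reads like ordinary arithmetic but yields an expression tree:
        x, y = Var("x"), Var("y")
        tree = x * y + x
        print(tree.evaluate({"x": 3, "y": 4}))  # 15; a backend could instead emit HPC code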

  10. Prototype development of user specific climate services

    Science.gov (United States)

    Jacob, Daniela

    2017-04-01

    Systematic consultations in recent years with representatives from sectors particularly affected by climate change have helped the Climate Service Center Germany (GERICS) to identify the most pressing needs of stakeholders from public and private sectors. Besides the development of innovative climate service products and methods, areas for which intensive research activities have to be initiated are also identified. An example is the demand of decision makers for high-resolution climate change information needed at regional to local levels for their activities towards climate change adaptation. For questions concerning adaptation to climate change, no standard solutions can be provided. Different from mitigation measures, adaptation measures must be framed in accordance with the specific circumstances prevailing in the local situation. Here, individual solutions, which satisfy the individual requirements and needs, are necessary. They have to be developed in close co-operation with the customers and users. For example, the implications of climate change on strategic and operative decisions, e.g. in enterprises and urban planning, are becoming increasingly important. Therefore, high-quality consultancy for businesses and public administration is needed, in order to support decision makers in identifying associated risks and opportunities. For the development of prototype products, GERICS has framed a general methodological approach, including the idea generation, the iterative development, and the prototype testing in co-development with the user. High process transparency and high product quality are prerequisites for the success of a product. The co-development process ensures the best possible communication of user-tailored climate change information for different target groups.

  11. Computer code development plant for SMART design

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H.

    1999-03-01

    In accordance with the localization plan for nuclear reactor design pursued since the mid-1980s, various computer codes have been transferred to the Korean nuclear industry through technical transfer programs with major worldwide pressurized water reactor suppliers or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design work, and design-related technologies have been satisfactorily accumulated as a result. However, activities to develop native codes to substitute for some important computer codes whose usage is limited by the original technology owners have been carried out rather poorly. Thus, it is of the highest priority to secure native techniques for the computer code package and analysis methodology in order to establish the capability required for the independent design of our own reactor model. Moreover, unlike large-capacity loop-type commercial reactors, the SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, a helical steam generator, and a passive residual heat removal system. Considering those peculiar design characteristics of SMART, part of the design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those computer codes are not directly applicable to the design of an integral reactor such as SMART, so they should be modified to deal with SMART's peculiar design characteristics. In addition to these modification efforts, various codes should be developed in several design areas. Furthermore, the reliability of modified or newly developed codes should be verified through benchmarking or tests for the target design. Thus, it is necessary to proceed with the design according to the

  12. Computer code development plant for SMART design

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H

    1999-03-01

    In accordance with the localization plan for nuclear reactor design pursued since the mid-1980s, various computer codes have been transferred to the Korean nuclear industry through technical transfer programs with major worldwide pressurized water reactor suppliers or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design work, and design-related technologies have been satisfactorily accumulated as a result. However, activities to develop native codes to substitute for some important computer codes whose usage is limited by the original technology owners have been carried out rather poorly. Thus, it is of the highest priority to secure native techniques for the computer code package and analysis methodology in order to establish the capability required for the independent design of our own reactor model. Moreover, unlike large-capacity loop-type commercial reactors, the SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, a helical steam generator, and a passive residual heat removal system. Considering those peculiar design characteristics of SMART, part of the design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those computer codes are not directly applicable to the design of an integral reactor such as SMART, so they should be modified to deal with SMART's peculiar design characteristics. In addition to these modification efforts, various codes should be developed in several design areas. Furthermore, the reliability of modified or newly developed codes should be verified through benchmarking or tests for the target design. Thus, it is necessary to proceed with the design according to the

  13. Recent development in computational actinide chemistry

    International Nuclear Information System (INIS)

    Li Jun

    2008-01-01

    Ever since the Manhattan project in World War II, actinide chemistry has been essential for nuclear science and technology. Yet scientists still seek the ability to interpret and predict chemical and physical properties of actinide compounds and materials using first-principle theory and computational modeling. Actinide compounds are challenging to computational chemistry because of their complicated electron correlation effects and relativistic effects, including spin-orbit coupling effects. There have been significant developments in theoretical studies on actinide compounds in the past several years. The theoretical capabilities coupled with new experimental characterization techniques now offer a powerful combination for unraveling the complexities of actinide chemistry. In this talk, we will provide an overview of our own research in this field, with particular emphasis on applications of relativistic density functional and ab initio quantum chemical methods to the geometries, electronic structures, spectroscopy and excited-state properties of small actinide molecules such as CUO and UO 2 and some large actinide compounds relevant to separation and environment science. The performance of various density functional approaches and wavefunction theory-based electron correlation methods will be compared. The results of computational modeling on the vibrational, electronic, and NMR spectra of actinide compounds will be briefly discussed as well [1-4]. We will show that progress in relativistic quantum chemistry, computer hardware and computational chemistry software has enabled computational actinide chemistry to emerge as a powerful and predictive tool for research in actinide chemistry. (authors)

  14. Portable Computer Technology (PCT) Research and Development Program Phase 2

    Science.gov (United States)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focused on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low power Pentium processor, a high resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives. The focus was on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  15. Ethics in computer software design and development

    Science.gov (United States)

    Alan J. Thomson; Daniel L. Schmoldt

    2001-01-01

    Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been afforded human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...

  16. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    International Nuclear Information System (INIS)

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project

  17. Developments in Remote Collaboration and Computation

    International Nuclear Information System (INIS)

    Burruss, J.R.; Abla, G.; Flanagan, S.; Keahey, K.; Leggett, T.; Ludesche, C.; McCune, D.; Papka, M.E.; Peng, Q.; Randerson, L.; Schissel, D.P.

    2005-01-01

    The National Fusion Collaboratory (NFC) is creating and deploying collaborative software tools to unite magnetic fusion research in the United States. In particular, the NFC is developing and deploying a national FES 'Grid' (FusionGrid) for secure sharing of computation, visualization, and data resources over the Internet. The goal of FusionGrid is to allow scientists at remote sites to participate as fully in experiments, machine design, and computational activities as if they were working on site, thereby creating a unified virtual organization of the geographically dispersed U.S. fusion community.

  18. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Science.gov (United States)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
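
    A flavor of such an embedded simulation specification is sketched below. The class and argument names are hypothetical illustrations of the approach, not the published package's exact API; the geometry loosely follows micromagnetic standard problem 4 (a 500 x 125 x 3 nm permalloy film with Ms = 8e5 A/m).

        # Hypothetical flavor of a Python-embedded micromagnetic specification in
        # the style described above. Class and argument names are illustrative
        # and are not the published package's exact API.
        class Mesh:
            def __init__(self, corners, cell):
                self.corners, self.cell = corners, cell

        class System:
            def __init__(self, name, mesh, energy_terms, m_initial, Ms):
                self.name, self.mesh = name, mesh
                self.energy_terms = energy_terms        # e.g. exchange + demag + Zeeman
                self.m_initial, self.Ms = m_initial, Ms

            def to_backend_script(self):
                """A real implementation would translate this spec into an OOMMF (MIF) input."""
                terms = ", ".join(sorted(self.energy_terms))
                return f"# {self.name}: cell {self.cell_str()}, terms: {terms}"

            def cell_str(self):
                return "x".join(f"{c * 1e9:g}nm" for c in self.mesh.cell)

        # Standard-problem-4-like setup: a 500 x 125 x 3 nm permalloy film.
        mesh = Mesh(corners=((0, 0, 0), (500e-9, 125e-9, 3e-9)), cell=(5e-9, 5e-9, 3e-9))
        system = System("stdprob4", mesh, {"exchange", "demag", "zeeman"},
                        m_initial=(1, 0.25, 0.1), Ms=8e5)
        print(system.to_backend_script())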

  19. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Directory of Open Access Journals (Sweden)

    Marijan Beg

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.

  20. An Application Development Platform for Neuromorphic Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dean, Mark [University of Tennessee (UT); Chan, Jason [University of Tennessee (UT); Daffron, Christopher [University of Tennessee (UT); Disney, Adam [University of Tennessee (UT); Reynolds, John [University of Tennessee (UT); Rose, Garrett [University of Tennessee (UT); Plank, James [University of Tennessee (UT); Birdwell, John Douglas [University of Tennessee (UT); Schuman, Catherine D [ORNL

    2016-01-01

    Dynamic Adaptive Neural Network Arrays (DANNAs) are neuromorphic computing systems developed as a hardware based approach to the implementation of neural networks. They feature highly adaptive and programmable structural elements, which model artificial neural networks with spiking behavior. We design them to solve problems using evolutionary optimization. In this paper, we highlight the current hardware and software implementations of DANNA, including their features, functionalities and performance. We then describe the development of an Application Development Platform (ADP) to support efficient application implementation and testing of DANNA based solutions. We conclude with future directions.

  1. Development of site specific response spectra

    International Nuclear Information System (INIS)

    Bernreuter, D.L.; Chen, J.C.; Savy, J.B.

    1987-03-01

    For a number of years the US Nuclear Regulatory Commission (NRC) has employed site-specific spectra (SSSP) in their evaluation of the adequacy of the Safe Shutdown Earthquake (SSE). These spectra were developed only from the spectra of the horizontal components of the ground motion and from a very limited data set. As the data set has considerably increased for Eastern North America (ENA), and as more relevant data have become available from earthquakes occurring in other parts of the world (e.g., Italy), together with the fact that recent data indicated the importance of the vertical component, it became clear that an update of the SSSPs for ENA was desirable. The methodology used in this study is similar to the previous ones in that it used actual earthquake ground motion data with magnitudes within a certain range and recorded at distances and at sites similar to those that would be chosen for the definition of an SSE. An extensive analysis of the origin and size of the uncertainty is an important part of this study. The results of this analysis of the uncertainties are used to develop criteria for selecting the earthquake records to be used in the derivation of the SSSPs. We concluded that the SSSPs were not very sensitive to the distribution of the source-to-site distance of the earthquake records used in the analysis. That is, the variability (uncertainty) introduced by the range of distances was relatively small compared to the variability introduced by other factors. We also concluded that the SSSPs are somewhat sensitive to the distribution of the magnitudes of these earthquakes, particularly at rock sites and, by inference, at shallow soil sites. We found that one important criterion in selecting records to generate SSSPs is the depth of soil at the site.

  2. A Computational Framework to Optimize Subject-Specific Hemodialysis Blood Flow Rate to Prevent Intimal Hyperplasia

    Science.gov (United States)

    Mahmoudzadeh, Javid; Wlodarczyk, Marta; Cassel, Kevin

    2017-11-01

    Development of excessive intimal hyperplasia (IH) in the cephalic vein of renal failure patients who receive chronic hemodialysis treatment results in vascular access failure and multiple treatment complications. Specifically, cephalic arch stenosis (CAS) is known to exacerbate hypertensive blood pressure, thrombosis, and subsequent cardiovascular incidents that would necessitate costly interventional procedures with low success rates. It has been hypothesized that an excessive blood flow rate after access maturation, which strongly violates venous homeostasis, is the main hemodynamic factor that orchestrates the onset and development of CAS. In this article, a computational framework based on a strong coupling of computational fluid dynamics (CFD) and shape optimization is proposed that aims to identify the effective blood flow rate on a patient-specific basis that avoids the onset of CAS while providing the adequate blood flow rate required to facilitate hemodialysis. This effective flow rate can be achieved through implementation of Miller's surgical banding method after the maturation of the arteriovenous fistula and is rooted in the relaxation of wall stresses back to a homeostatic target value. The results indicate that this optimized hemodialysis blood flow rate is, in fact, a subject-specific value that can be assessed after vascular access maturation and prior to the initiation of chronic hemodialysis treatment as a mitigative action against CAS-related access failure. This computational technology can be employed for individualized dialysis treatment.
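
    The coupling described can be pictured as a root-finding loop: a CFD solve evaluates venous wall stress for a candidate flow rate, and the optimizer adjusts the rate until the stress relaxes to the homeostatic target. In the sketch below, a hypothetical power-law surrogate stands in for the CFD evaluation, and all numbers are illustrative.

        # Schematic of the optimization loop described above: adjust the access
        # blood flow rate until the computed venous wall shear stress returns to
        # a homeostatic target. The power-law stress model stands in for a full
        # CFD solve; it and all numbers here are purely illustrative.
        def wall_shear_stress(flow_rate_ml_min):
            """Hypothetical surrogate for the CFD evaluation of cephalic-arch wall stress (Pa)."""
            return 0.004 * flow_rate_ml_min ** 1.2

        def optimize_flow_rate(target_pa, lo=300.0, hi=2000.0, tol=1e-3):
            """Bisection on flow rate so that stress(flow) ~= the homeostatic target."""
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if wall_shear_stress(mid) > target_pa:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)

        q = optimize_flow_rate(target_pa=5.0)   # illustrative homeostatic target
        print(f"banded flow rate ~ {q:.0f} ml/min, stress = {wall_shear_stress(q):.2f} Pa")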

  3. Designing English for Specific Purposes Course for Computer Science Students

    Science.gov (United States)

    Irshad, Isra; Anwar, Behzad

    2018-01-01

    The aim of this study was to design an English for Academic Purposes (EAP) course for university students enrolled in the Computer Science Department. For this purpose, the academic English language needs of the students were analyzed using a 5-point Likert scale questionnaire. Additionally, interviews were also conducted with four faculty members of…

  4. High Performance Computing - Power Application Programming Interface Specification.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H.,; Kelly, Suzanne M.; Pedretti, Kevin; Grant, Ryan; Olivier, Stephen Lecler; Levenhagen, Michael J.; DeBonis, David

    2014-08-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.
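
    The general style of such an interface, a hierarchy of system objects exposing named attributes with get/set semantics, can be sketched as follows. The specification itself defines a C API; this Python-flavored sketch and all of its names are purely illustrative.

        # A Python-flavored sketch of the idea behind a portable power API: a
        # hierarchy of system objects exposing named attributes with get/set
        # semantics. This mirrors the general style of such interfaces; the
        # actual specification defines a C API, and all names are illustrative.
        class PowerObject:
            def __init__(self, name, children=()):
                self.name, self.children = name, list(children)
                self._attrs = {"POWER": 0.0, "POWER_LIMIT": float("inf")}

            def get_attr(self, attr):
                if attr == "POWER" and self.children:   # aggregate measured power upward
                    return sum(c.get_attr("POWER") for c in self.children)
                return self._attrs[attr]

            def set_attr(self, attr, value):
                self._attrs[attr] = value               # e.g. a facility-level power cap
                for c in self.children:                 # naive policy: split the cap evenly
                    c.set_attr(attr, value / len(self.children))

        nodes = [PowerObject(f"node{i}") for i in range(4)]
        for i, n in enumerate(nodes):
            n._attrs["POWER"] = 250.0 + 10 * i          # pretend measurements (watts)
        cabinet = PowerObject("cabinet0", nodes)
        print(cabinet.get_attr("POWER"))                # 1060.0
        cabinet.set_attr("POWER_LIMIT", 900.0)
        print(nodes[0].get_attr("POWER_LIMIT"))         # 225.0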

  5. Development of technical specifications for research reactors

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    This standard identifies and establishes the content of technical specifications for research reactors. Areas addressed are: definitions, safety limits, limiting safety system settings, limiting conditions for operation, surveillance requirements, design features and administrative controls. Sufficient detail is incorporated so that applicable specifications can be derived or extracted

  6. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing brings parallel computing into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the OpenMP and MPI models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
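
    The MapReduce model being compared can be made concrete with a minimal word count, sketched below in pure Python; MPI or OpenMP versions of the same computation would instead use explicit message passing or shared-memory threads.

        # A minimal MapReduce-style word count in pure Python, to make the model
        # being compared concrete; MPI or OpenMP versions of the same computation
        # would use explicit message passing or shared-memory threads instead.
        from collections import Counter
        from functools import reduce
        from multiprocessing import Pool

        def map_phase(chunk):
            """Map: emit per-chunk partial counts."""
            return Counter(chunk.split())

        def reduce_phase(a, b):
            """Reduce: merge partial counts."""
            a.update(b)
            return a

        if __name__ == "__main__":
            chunks = ["cloud computing grew from parallel computing",
                      "grid computing and distributed computing came first"]
            with Pool(2) as pool:
                partials = pool.map(map_phase, chunks)   # map tasks run in parallel workers
            print(reduce(reduce_phase, partials, Counter()))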

  7. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...
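
    The core device behind many of these rigorous computations is interval arithmetic: every operation returns an enclosure guaranteed to contain the true result. A bare-bones sketch follows; production libraries additionally round the endpoints outward to absorb floating-point error, which is omitted here.

        # A bare-bones interval arithmetic sketch: every operation returns an
        # enclosure guaranteed to contain the true result. Production libraries
        # also round the endpoints outward to account for floating-point error;
        # that is omitted here for brevity.
        class Interval:
            def __init__(self, lo, hi):
                assert lo <= hi
                self.lo, self.hi = lo, hi

            def __add__(self, other):
                return Interval(self.lo + other.lo, self.hi + other.hi)

            def __mul__(self, other):
                products = [self.lo * other.lo, self.lo * other.hi,
                            self.hi * other.lo, self.hi * other.hi]
                return Interval(min(products), max(products))

            def __repr__(self):
                return f"[{self.lo}, {self.hi}]"

        # Enclose f(x) = x*x + x over x in [-1, 1].
        x = Interval(-1.0, 1.0)
        print(x * x + x)   # [-2.0, 2.0] encloses the true range [-0.25, 2.0]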

  8. Project Development Specification for Valve Pit Manifold

    International Nuclear Information System (INIS)

    MCGREW, D.L.

    2000-01-01

    Establishes the performance, design development, and test requirements for the valve pit manifolds. The system engineering approach was used to develop this document in accordance with the guidelines laid out in the Systems Engineering Management Plan for Project W-314

  9. Project Development Specification for Special Protective Coating

    International Nuclear Information System (INIS)

    MCGREW, D.L.

    2000-01-01

    Establishes the performance, design development, and test requirements for the Special Protective Coating. The system engineering approach was used to develop this document in accordance with the guidelines laid out in the Systems Engineering Management Plan for Project W-314

  10. Scalable Computational Chemistry: New Developments and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri [Iowa State Univ., Ames, IA (United States)

    2002-01-01

    The computational part of the thesis is an investigation of titanium(II) chloride as a potential catalyst for the bis-silylation reaction of ethylene with hexachlorodisilane at different levels of theory. Bis-silylation is an important reaction for producing bis(silyl) compounds and new C-Si bonds, which can serve as monomers for silicon-containing polymers and silicon carbides. Ab initio calculations on the steps involved in a proposed mechanism are presented. This choice of reactants allows the reaction to be studied at reliable levels of theory without compromising accuracy. The calculations indicate that this is a highly exothermic, barrierless reaction. The TiCl2 catalyst removes a 50 kcal/mol activation energy barrier required for the uncatalyzed reaction. The first step is the interaction of TiCl2 with ethylene to form an intermediate that is 60 kcal/mol below the energy of the reactants; this is the driving force for the entire reaction. Dynamic correlation plays a significant role, because RHF calculations indicate that the net barrier for the catalyzed reaction is 50 kcal/mol. It is concluded that divalent Ti has the potential to become an important industrial catalyst for silylation reactions. In the programming part of the thesis, the parallelization of different quantum chemistry methods is presented. Parallelization is becoming an important aspect of quantum chemistry code development, driven by two trends: the desire to study large chemical systems, and the desire to employ highly correlated methods, which are usually computationally and memory expensive. In the distributed-data algorithms presented, computation is parallelized and the largest arrays are evenly distributed among CPUs. First, the parallelization of the Hartree-Fock self-consistent field (SCF) method is considered; the SCF method is the most common starting point for more accurate calculations. The Fock build (a sub-step of SCF) from AO integrals is
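
    The distributed-data idea, with the largest arrays split evenly across processors and scalar results combined by reduction, can be sketched generically with mpi4py. This is an illustration of the pattern, not the thesis's actual SCF code, and the local computation is a stand-in.

        # Sketch of the distributed-data pattern with mpi4py: the largest matrices
        # are split evenly across ranks and each rank works on its own block.
        # This is a generic illustration, not the thesis's actual SCF code.
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        n = 8                                    # basis size; assume n % size == 0 for brevity
        rows = n // size
        local_density = np.random.rand(rows, n)  # this rank's block of the density matrix

        # Each rank computes over its block only (a stand-in for the Fock build),
        # then a reduction combines scalar results across ranks.
        local_energy = np.sum(local_density ** 2)
        total_energy = comm.allreduce(local_energy, op=MPI.SUM)
        if rank == 0:
            print(f"{size} ranks, total = {total_energy:.4f}")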

  11. Development of the Tensoral Computer Language

    Science.gov (United States)

    Ferziger, Joel; Dresselhaus, Eliot

    1996-01-01

    The research scientist or engineer wishing to perform large scale simulations or to extract useful information from existing databases is required to have expertise in the details of the particular database, the numerical methods and the computer architecture to be used. This poses a significant practical barrier to the use of simulation data. The goal of this research was to develop a high-level computer language called Tensoral, designed to remove this barrier. The Tensoral language provides a framework in which efficient generic data manipulations can be easily coded and implemented. First of all, Tensoral is general. The fundamental objects in Tensoral represent tensor fields and the operators that act on them. The numerical implementation of these tensors and operators is completely and flexibly programmable. New mathematical constructs and operators can be easily added to the Tensoral system. Tensoral is compatible with existing languages. Tensoral tensor operations co-exist in a natural way with a host language, which may be any sufficiently powerful computer language such as Fortran, C, or Vectoral. Tensoral is very-high-level. Tensor operations in Tensoral typically act on entire databases (i.e., arrays) at one time and may, therefore, correspond to many lines of code in a conventional language. Tensoral is efficient. Tensoral is a compiled language. Database manipulations are simplified, optimized, and scheduled by the compiler, eventually resulting in efficient machine code to implement them.
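
    The "very-high-level" claim, that one tensor statement may correspond to many lines of conventional code, can be illustrated by analogy with NumPy whole-array operations. Tensoral's own syntax is not shown in this abstract, so the sketch below is only an analogy.

        # NumPy analogy for the "very-high-level" claim above: one whole-array
        # tensor statement replaces the nested loops a conventional language
        # would need. This is only an analogy, not Tensoral syntax.
        import numpy as np

        u = np.random.rand(32, 32, 32, 3)        # a vector field on a 32^3 grid

        # One statement over the entire database: the pointwise kinetic energy field.
        ke = 0.5 * np.einsum("xyzi,xyzi->xyz", u, u)

        # The loop version of the same operation: many lines and much slower.
        ke_loops = np.zeros((32, 32, 32))
        for i in range(32):
            for j in range(32):
                for k in range(32):
                    ke_loops[i, j, k] = 0.5 * np.dot(u[i, j, k], u[i, j, k])

        assert np.allclose(ke, ke_loops)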

  12. Specificity, Transfer, and the Development of Expertise

    Science.gov (United States)

    Brookes, David T.; Ross, Brian H.; Mestre, Jose P.

    2011-01-01

    In this paper we present the results of two experiments designed to understand how physics students' learning of the concept of refraction is influenced by the cognitive phenomenon of "specificity." In both experiments participants learned why light bends as it travels from one optical medium to another with an analogy made to a car…

  13. The Effectiveness of Computer-Assisted Instruction for Teaching Mathematics to Students with Specific Learning Disability

    Science.gov (United States)

    Stultz, Sherry L.

    2013-01-01

    Using computers to teach students is not a new idea. Computers have been utilized for educational purposes for over 80 years. However, the effectiveness of these programs for teaching mathematics to students with specific learning disability is unclear. This study was undertaken to determine if computer-assisted instruction was as effective as…

  14. Specific features of vocal fold paralysis in functional computed tomography

    International Nuclear Information System (INIS)

    Laskowska, K.; Mackiewicz-Nartowicz, H.; Serafin, Z.; Nawrocka, E.

    2008-01-01

    Vocal fold paralysis is usually recognized in laryngological examination, and detailed vocal fold function may be established based on laryngovideostroboscopy. Additional imaging should exclude any morphological causes of the paresis, which should be treated pharmacologically or surgically. The aim of this paper was to analyze computed tomography (CT) images of the larynx in patients with unilateral vocal fold paralysis. CT examinations of the larynx were performed in 10 patients with clinically defined unilateral vocal fold paralysis. The examinations consisted of an unenhanced acquisition and an enhanced 3-phase acquisition: during free breathing, the Valsalva maneuver, and phonation. The analysis included the following morphologic features of the paresis: the deepened epiglottic vallecula, the deepened piriform recess, the thickened and medially positioned aryepiglottic fold, the widened laryngeal pouch, the anteriorly positioned arytenoid cartilage, the thickened vocal fold, and the filled infraglottic space in frontal CT reconstruction. CT images were compared to laryngovideostroboscopy. The most common symptoms of vocal fold paralysis in CT were the deepened epiglottic vallecula and piriform recess, the widened laryngeal pouch with the filled infraglottic space, and the thickened aryepiglottic fold. Regarding the efficiency of paralysis determination, the three functional CT imaging techniques used did not differ significantly, and laryngovideostroboscopy demonstrated its advantage over CT. CT of the larynx is a supplementary examination in the diagnosis of vocal fold paralysis, which may enable topographic analysis of the fold dysfunction. Knowledge of the morphological CT features of the paralysis may help to prevent a false-positive diagnosis of laryngeal cancer. (author)

  15. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of micro-computer based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work aimed to assess a suitable automated tools environment for the whole software life cycle. The main safety functions of a nuclear power reactor, such as DNBR and KW/FT, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to the new technological trends of ''Software Factories''. (author)

  16. Trends and developments in computational geometry

    NARCIS (Netherlands)

    Berg, de M.

    1997-01-01

    This paper discusses some trends and achievements in computational geometry during the past five years, with emphasis on problems related to computer graphics. Furthermore, a direction of research in computational geometry is discussed that could help in bringing the fields of computational geometry

  17. High-resolution subject-specific mitral valve imaging and modeling: experimental and computational methods.

    Science.gov (United States)

    Toma, Milan; Bloodworth, Charles H; Einstein, Daniel R; Pierce, Eric L; Cochran, Richard P; Yoganathan, Ajit P; Kunzelman, Karyn S

    2016-12-01

    The diversity of mitral valve (MV) geometries and multitude of surgical options for correction of MV diseases necessitates the use of computational modeling. Numerical simulations of the MV would allow surgeons and engineers to evaluate repairs, devices, procedures, and concepts before performing them and before moving on to more costly testing modalities. Constructing, tuning, and validating these models rely upon extensive in vitro characterization of valve structure, function, and response to change due to diseases. Micro-computed tomography (μCT) allows for unmatched spatial resolution for soft tissue imaging. However, it is still technically challenging to obtain an accurate geometry of the diastolic MV. We discuss here the development of a novel technique for treating MV specimens with glutaraldehyde fixative in order to minimize geometric distortions in preparation for μCT scanning. The technique provides a resulting MV geometry which is significantly more detailed in chordal structure, accurate in leaflet shape, and closer to its physiological diastolic geometry. In this paper, computational fluid-structure interaction (FSI) simulations are used to show the importance of more detailed subject-specific MV geometry with 3D chordal structure to simulate a proper closure validated against μCT images of the closed valve. Two computational models, before and after use of the aforementioned technique, are used to simulate closure of the MV.

  18. Computational models of music perception and cognition II: Domain-specific music processing

    Science.gov (United States)

    Purwins, Hendrik; Grachten, Maarten; Herrera, Perfecto; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    In Part I [Purwins H, Herrera P, Grachten M, Hazan A, Marxer R, Serra X. Computational models of music perception and cognition I: The perceptual and cognitive processing chain. Physics of Life Reviews 2008, in press, doi:10.1016/j.plrev.2008.03.004], we addressed the study of cognitive processes that underlie auditory perception of music, and their neural correlates. The aim of the present paper is to summarize empirical findings from music cognition research that are relevant to three prominent music theoretic domains: rhythm, melody, and tonality. Attention is paid to how cognitive processes like category formation, stimulus grouping, and expectation can account for the music theoretic key concepts in these domains, such as beat, meter, voice, consonance. We give an overview of computational models that have been proposed in the literature for a variety of music processing tasks related to rhythm, melody, and tonality. Although the present state-of-the-art in computational modeling of music cognition definitely provides valuable resources for testing specific hypotheses and theories, we observe the need for models that integrate the various aspects of music perception and cognition into a single framework. Such models should be able to account for aspects that until now have only rarely been addressed in computational models of music cognition, like the active nature of perception and the development of cognitive capacities from infancy to adulthood.

  19. Evolution and Development of Effective Feedstock Specifications

    Energy Technology Data Exchange (ETDEWEB)

    Garold Gresham; Rachel Emerson; Amber Hoover; Amber Miller; William Bauer; Kevin Kenney

    2013-09-01

    The U.S. Department of Energy promotes the production of a range of liquid fuels and fuel blend stocks from lignocellulosic biomass feedstocks by funding fundamental and applied research that advances the state of technology in biomass collection, conversion, and sustainability. As part of its involvement in this program, the Idaho National Laboratory (INL) investigates the feedstock logistics economics and sustainability of these fuels. The 2012 feedstock logistics milestone demonstrated that for high-yield areas that minimize the transportation distances of a low-density, unstable biomass, we could achieve a delivered cost of $35/ton. Based on current conventional equipment and processes, the 2012 logistics design is able to deliver the volume of biomass needed to fulfill the 2012 Renewable Fuel Standard’s targets for ethanol. However, the Renewable Fuel Standard’s volume targets are continuing to increase and are expected to peak in 2022 at 36 billion gallons. Meeting these volume targets and achieving a national-scale biofuels industry will require expansion of production capacity beyond the 2012 Conventional Feedstock Supply Design Case to access diverse available feedstocks, regardless of their inherent ability to meet preliminary biorefinery quality feedstock specifications. Implementation of quality specifications (specs), as outlined in the 2017 Design Case – “Feedstock Supply System Design and Economics for Conversion of Lignocellulosic Biomass to Hydrocarbon Fuels” (in progress), requires insertion of deliberate, active quality controls into the feedstock supply chain, whereas the 2012 Conventional Design only utilizes passive quality controls.

  20. Site-specific development plan: Carlin, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Fiore, J.H.

    1980-01-01

    The conditions for developing the geothermal resource near Carlin appear favorable. The resource has a favorable temperature range for direct applications (174°F or 79°C), the geothermal fluid has low total dissolved solids and no objectionable constituents that would result in costly scaling or corrosion problems, and the resource is conveniently located within two miles of town. Direct space heating is the most realistic application and is recommended. Several clusters of homes are located less than 2 miles away. The project could be developed on a larger scale than this study proposes. The engineering and economic models are proposed on a small scale here for simplicity in evaluating the feasibility of pursuing development. Conceivably the producing well will provide sufficient hot water to accommodate more homes than the models include. The town of Carlin seems receptive to development, and there do not appear to be any major barriers to exploration or development. The regulatory climate in both the state and county is conducive to geothermal development at this level. No major regulatory or environmental obstacles are noted which would severely curtail utilization for space heating. The prospect of replacing natural gas heat with geothermal heat for 60 or more homes is economically attractive. Geothermal rates for hot water are not expected to increase as rapidly as the price of natural gas to the consumer over the next 10 years. The increases for hot water from geothermal are primarily a function of power costs for the pumps plus inflation affecting maintenance costs. Individual homeowners can expect payback on retrofitting costs within two to three years.

  1. Development and Specification of Virtual Environments

    NARCIS (Netherlands)

    van Schooten, B.W.

    2003-01-01

    This thesis concerns the issues involved in the development of virtual environments (VEs). VEs are more than virtual reality. We identify four of their main characteristics: graphical interaction, multimodality, interface agents, and multi-user operation. These characteristics are illustrated with an overview

  2. The Effect of Inlet Waveforms on Computational Hemodynamics of Patient-Specific Intracranial Aneurysms

    OpenAIRE

    Xiang, J.; Siddiqui, A.H.; Meng, H.

    2014-01-01

    Due to the lack of patient-specific inlet flow waveform measurements, most computational fluid dynamics (CFD) simulations of intracranial aneurysms employ waveforms that are not patient-specific as inlet boundary conditions for the computational model. The current study examined how this assumption affects the predicted hemodynamics in patient-specific aneurysm geometries. We examined wall shear stress (WSS) and oscillatory shear index (OSI), the two most widely studied hemodynamic qu...

  3. Specificity, transfer, and the development of expertise

    Directory of Open Access Journals (Sweden)

    David T. Brookes

    2011-04-01

    In this paper we present the results of two experiments designed to understand how physics students’ learning of the concept of refraction is influenced by the cognitive phenomenon of “specificity.” In both experiments participants learned why light bends as it travels from one optical medium to another with an analogy made to a car driving from paved road into mud and vice versa. They then learned how to qualitatively draw the direction of refracted light rays with an example of a glass prism. One group learned with a rectangular prism example while a second group learned with a triangular prism example. In a transfer test, the participants revealed how, even when they seemed able to implement the refraction concept, their responses were biased by the example they had seen. Participants frequently violated the refraction principle they had just learned (reversing the bend direction) in order to make sure their response matched the surface features of their learning example. This tended to happen when their test question looked superficially similar to their learning example. We discuss the implications of these results for physics instruction.

  5. Development of computer code in PNC, 8

    International Nuclear Information System (INIS)

    Ohhira, Mitsuru

    1990-01-01

    Base isolation systems applied to private buildings are now at the practical stage. Accordingly, under the Construction and Maintenance Management Office, we are conducting a study on applying base isolation systems to nuclear fuel facilities. In the course of this study, we have developed the Dynamic Analysis Program-Base Isolation System (DAP-BS), which is able to run on a 32-bit personal computer. Using this program, we can analyze a 3-dimensional structure and evaluate the various properties of base isolation parts divided into a maximum of 16 blocks. From the results of several simulation analyses, we concluded that DAP-BS had good reliability and marketability, and we have put it on the market. (author)

  6. Development of Probabilistic Internal Dosimetry Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Siwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kwon, Tae-Eun [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of); Lee, Jai-Ki [Korean Association for Radiation Protection, Seoul (Korea, Republic of)

    2017-02-15

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was designed. Based on this system, we developed a probabilistic internal-dose-assessment code using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various

  7. Development of Probabilistic Internal Dosimetry Computer Code

    International Nuclear Information System (INIS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-01-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was designed. Based on this system, we developed a probabilistic internal-dose-assessment code using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases
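
    A minimal sketch of the propagation scheme described above, in Python: a measurement with lognormal error is divided by an uncertain intake-retention fraction and multiplied by an uncertain dose coefficient, and the resulting Monte Carlo dose distribution is summarized by the same percentiles reported in the abstract. All numerical inputs below are hypothetical placeholders, not values from the actual code or its uncertainty database.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000  # Monte Carlo trials

        # Hypothetical inputs (placeholders only):
        measured = 50.0 * rng.lognormal(0.0, np.log(1.25), N)      # Bq, bioassay result with lognormal error
        m_t = rng.normal(1.0e-3, 1.5e-4, N).clip(min=1e-6)         # intake-retention fraction at time t
        dose_coeff = rng.lognormal(np.log(2.0e-8), 0.3, N)         # Sv/Bq, committed-dose coefficient

        # Propagate: intake = measurement / m(t); dose = intake * dose coefficient
        intake = measured / m_t        # Bq
        dose = intake * dose_coeff     # Sv

        for p in (2.5, 5, 50, 95, 97.5):
            print(f"{p:5.1f}th percentile: {np.percentile(dose, p):.2e} Sv")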

  8. Cognitive training in Parkinson disease: cognition-specific vs nonspecific computer training.

    Science.gov (United States)

    Zimmermann, Ronan; Gschwandtner, Ute; Benz, Nina; Hatz, Florian; Schindler, Christian; Taub, Ethan; Fuhr, Peter

    2014-04-08

    In this study, we compared a cognition-specific computer-based cognitive training program with a motion-controlled computer sports game that is not cognition-specific, in terms of their ability to enhance cognitive performance in various cognitive domains in patients with Parkinson disease (PD). Patients with PD were trained with either a computer program designed to enhance cognition (CogniPlus, 19 patients) or a computer sports game with motion-capturing controllers (Nintendo Wii, 20 patients). The effect of training in 5 cognitive domains was measured by neuropsychological testing at baseline and after training. Group differences over all variables were assessed with multivariate analysis of variance, and group differences in single variables were assessed with 95% confidence intervals of the mean difference. The groups were similar regarding age, sex, and educational level. Patients with PD who were trained with the Wii for 4 weeks performed better in attention (95% confidence interval: -1.49 to -0.11) than patients trained with CogniPlus. In our study, patients with PD derived at least the same degree of cognitive benefit from non-cognition-specific training involving movement as from cognition-specific computerized training. For patients with PD, game consoles may be a less expensive and more entertaining alternative to computer programs specifically designed for cognitive training. This study provides Class III evidence that, in patients with PD, cognition-specific computer-based training is not superior to a motion-controlled computer game in improving cognitive performance.

  9. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    Science.gov (United States)

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, which encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  10. Computational Modeling Develops Ultra-Hard Steel

    Science.gov (United States)

    2007-01-01

    Glenn Research Center's Mechanical Components Branch developed a spiral bevel or face gear test rig for testing thermal behavior, surface fatigue, strain, vibration, and noise; a full-scale, 500-horsepower helicopter main-rotor transmission testing stand; a gear rig that allows fundamental studies of the dynamic behavior of gear systems and gear noise; and a high-speed helical gear test for analyzing thermal behavior for rotorcraft. The test rig provides accelerated fatigue life testing for standard spur gears at speeds of up to 10,000 rotations per minute. The test rig enables engineers to investigate the effects of materials, heat treat, shot peen, lubricants, and other factors on the gear's performance. QuesTek Innovations LLC, based in Evanston, Illinois, recently developed a carburized, martensitic gear steel with an ultra-hard case using its computational design methodology, but needed to verify surface fatigue, lifecycle performance, and overall reliability. The Battelle Memorial Institute introduced the company to researchers at Glenn's Mechanical Components Branch and facilitated a partnership allowing researchers at the NASA Center to conduct spur gear fatigue testing for the company. Testing revealed that QuesTek's gear steel outperforms the current state-of-the-art alloys used for aviation gears in contact fatigue by almost 300 percent. With the confidence and credibility provided by the NASA testing, QuesTek is commercializing two new steel alloys. Uses for this new class of steel are limitless in areas that demand exceptional strength for high throughput applications.

  11. Computationally Developed Sham Stimulation Protocol for Multichannel Desynchronizing Stimulation

    Directory of Open Access Journals (Sweden)

    Magteld Zeitler

    2018-05-01

    A characteristic pattern of abnormal brain activity is abnormally strong neuronal synchronization, as found in several brain disorders, such as tinnitus, Parkinson's disease, and epilepsy. As observed in several diseases, different therapeutic interventions may induce a placebo effect that may be strong and hinder reliable clinical evaluations. Hence, to distinguish between specific, neuromodulation-induced effects and unspecific, placebo effects, it is important to mimic the therapeutic procedure as precisely as possible, thereby providing controls that actually lack specific effects. Coordinated Reset (CR) stimulation has been developed to specifically counteract abnormally strong synchronization by desynchronization. CR is a spatio-temporally patterned multichannel stimulation which reduces the extent of coincident neuronal activity and aims at an anti-kindling, i.e., an unlearning of both synaptic connectivity and neuronal synchrony. Apart from acute desynchronizing effects, CR may cause sustained, long-lasting desynchronizing effects, as already demonstrated in pre-clinical and clinical proof-of-concept studies. In this computational study, we set out to computationally develop a sham stimulation protocol for multichannel desynchronizing stimulation. To this end, we compare acute effects and long-lasting effects of six different spatio-temporally patterned stimulation protocols, including three variants of CR, using a no-stimulation condition as additional control. This is to provide an inventory of different stimulation algorithms with similar fundamental stimulation parameters (e.g., mean stimulation rates) but qualitatively different acute and/or long-lasting effects. Stimulation protocols sharing basic parameters, but nevertheless inducing completely different or even no acute effects and/or after-effects, might serve as controls to validate the specific effects of particular desynchronizing protocols such as CR. In particular, based on
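
    The target of desynchronizing stimulation, abnormally strong neuronal synchrony, can be illustrated with a generic Kuramoto-oscillator sketch in Python. This is a standard textbook model chosen purely for illustration, not the specific network model used in the study; the order parameter R is near 1 in the pathologically synchronized regime and low in the desynchronized one.

        import numpy as np

        rng = np.random.default_rng(0)
        N, dt, steps = 200, 0.01, 5000
        omega = rng.normal(10.0, 1.0, N)  # natural frequencies (rad/s)

        def run(K):
            """Integrate the Kuramoto model at coupling K; return order parameter R."""
            theta = rng.uniform(0, 2 * np.pi, N)
            for _ in range(steps):
                z = np.exp(1j * theta).mean()  # mean field r*exp(i*psi)
                theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))
            return np.abs(np.exp(1j * theta).mean())

        print("weak coupling:   R =", round(run(0.5), 2))  # desynchronized regime
        print("strong coupling: R =", round(run(5.0), 2))  # strongly synchronized regime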

  12. Functions and Requirements and Specifications for Replacement of the Computer Automated Surveillance System (CASS)

    International Nuclear Information System (INIS)

    SCAIEF, C.C.

    1999-01-01

    This functions, requirements, and specifications document defines the baseline requirements and criteria for the design, purchase, fabrication, construction, installation, and operation of the system to replace the Computer Automated Surveillance System (CASS) alarm monitoring system.

  13. Hierarchy, determinism, and specificity in theories of development and evolution.

    Science.gov (United States)

    Deichmann, Ute

    2017-10-16

    The concepts of hierarchical organization, genetic determinism and biological specificity (for example of species, biologically relevant macromolecules, or genes) have played a crucial role in biology as a modern experimental science since its beginnings in the nineteenth century. The idea of genetic information (specificity) and genetic determination was at the basis of molecular biology that developed in the 1940s with macromolecules, viruses and prokaryotes as major objects of research often labelled "reductionist". However, the concepts have been marginalized or rejected in some of the research that in the late 1960s began to focus additionally on the molecularization of complex biological structures and functions using systems approaches. This paper challenges the view that 'molecular reductionism' has been successfully replaced by holism and a focus on the collective behaviour of cellular entities. It argues instead that there are more fertile replacements for molecular 'reductionism', in which genomics, embryology, biochemistry, and computer science intertwine and result in research that is as exact and causally predictive as earlier molecular biology.

  14. Development and application of computational aerothermodynamics flowfield computer codes

    Science.gov (United States)

    Venkatapathy, Ethiraj

    1993-01-01

    Computations are presented for one-dimensional, strong shock waves that are typical of those that form in front of a reentering spacecraft. The fluid mechanics and thermochemistry are modeled using two different approaches. The first employs traditional continuum techniques in solving the Navier-Stokes equations. The second approach employs a particle simulation technique (the direct simulation Monte Carlo method, DSMC). The thermochemical models employed in these two techniques are quite different. The present investigation presents an evaluation of thermochemical models for nitrogen under hypersonic flow conditions. Four separate cases are considered. The cases are governed, respectively, by the following: vibrational relaxation; weak dissociation; strong dissociation; and weak ionization. In near-continuum, hypersonic flow, the nonequilibrium thermochemical models employed in continuum and particle simulations produce nearly identical solutions. Further, the two approaches are evaluated successfully against available experimental data for weakly and strongly dissociating flows.

  15. Computer-Assisted Mathematics Instruction for Students with Specific Learning Disability: A Review of the Literature

    Science.gov (United States)

    Stultz, Sherry L.

    2017-01-01

    This review was conducted to evaluate the current body of scholarly research regarding the use of computer-assisted instruction (CAI) to teach mathematics to students with specific learning disability (SLD). Computers have been utilized for educational purposes for many years. However, the effectiveness of CAI for teaching mathematics to this specific…

  16. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  17. The Effects of Computer Graphic Organizers on the Persuasive Writing of Hispanic Middle School Students with Specific Learning Disabilities

    Science.gov (United States)

    Unzueta, Caridad H.; Barbetta, Patricia M.

    2012-01-01

    A multiple baseline design investigated the effects of computer graphic organizers on the persuasive composition writing skills of four Hispanic students with specific learning disabilities. Participants reviewed the elements of persuasive writing and then developed compositions using a word processing program. Baseline planning was done with a…

  18. Design and manufacturing of patient-specific orthodontic appliances by computer-aided engineering techniques.

    Science.gov (United States)

    Barone, Sandro; Neri, Paolo; Paoli, Alessandro; Razionale, Armando Viviano

    2018-01-01

    Orthodontic treatments are usually performed using fixed brackets or removable oral appliances, which are traditionally made from alginate impressions and wax registrations. Among removable devices, eruption guidance appliances are used for early orthodontic treatments in order to intercept and prevent malocclusion problems. Commercially available eruption guidance appliances, however, are symmetric devices produced in a few standard sizes. For this reason, they are not able to meet each patient's specific needs, since actual dental anatomies present varied geometries and asymmetric conditions. In this article, a computer-aided design-based methodology for the design and manufacturing of patient-specific eruption guidance appliances is presented. The proposed approach is based on the digitalization of several steps of the overall process: from the digital reconstruction of patients' anatomies to the manufacturing of customized appliances. A finite element model has been developed to evaluate the temporomandibular joint disk stress level caused by using symmetric eruption guidance appliances under different teeth misalignment conditions. The developed model can then be used to guide the design of a patient-specific appliance with the aim of reducing patient discomfort. For this purpose, two different customization levels are proposed, addressing both arch-level and single-tooth misalignment issues. A low-cost manufacturing process, based on an additive manufacturing technique, is finally presented and discussed.

  19. Genome-wide identification of specific oligonucleotides using artificial neural network and computational genomic analysis

    Directory of Open Access Journals (Sweden)

    Chen Jiun-Ching

    2007-05-01

    Background: Genome-wide identification of specific oligonucleotides (oligos) is a computationally-intensive task and is a requirement for designing microarray probes, primers, and siRNAs. An artificial neural network (ANN) is a machine learning technique that can effectively process complex and high-noise data. Here, ANNs are applied to process the unique subsequence distribution for prediction of specific oligos. Results: We present a novel and efficient algorithm, named the integration of ANN and BLAST (IAB) algorithm, to identify specific oligos. We establish the unique marker database for human and rat gene index databases using the hash table algorithm. We then create the input vectors, via the unique marker database, to train and test the ANN. The trained ANN predicted the specific oligos with high efficiency, and these oligos were subsequently verified by BLAST. To improve the prediction performance, the ANN over-fitting issue was avoided by early stopping with the best observed error, and a k-fold validation was also applied. The performance of the IAB algorithm was about 5.2, 7.1, and 6.7 times faster than the BLAST search without ANN for experimental results of 70-mer, 50-mer, and 25-mer specific oligos, respectively. In addition, the results of polymerase chain reactions showed that the primers predicted by the IAB algorithm could specifically amplify the corresponding genes. The IAB algorithm has been integrated into a previously published comprehensive web server to support microarray analysis and genome-wide iterative enrichment analysis, through which users can identify a group of desired genes and then discover the specific oligos of these genes. Conclusion: The IAB algorithm has been developed to construct SpecificDB, a web server that provides a specific and valid oligo database of the probe, siRNA, and primer design for the human genome. We also demonstrate the ability of the IAB algorithm to predict specific oligos through
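
    The hash-table step described in the Results can be sketched in a few lines of Python: every k-mer in the collection is hashed, and those occurring exactly once become unique markers, from which input vectors for an ANN could later be derived. The toy sequences and k value are illustrative only, not the human or rat gene-index data.

        from collections import defaultdict

        def unique_marker_table(sequences, k):
            """Return the k-mers that occur exactly once across all sequences."""
            counts = defaultdict(int)
            origin = {}
            for seq_id, seq in sequences.items():
                for i in range(len(seq) - k + 1):
                    kmer = seq[i:i + k]
                    counts[kmer] += 1
                    origin[kmer] = seq_id
            # Unique k-mers are candidate anchors for specific oligos
            return {kmer: origin[kmer] for kmer, n in counts.items() if n == 1}

        # Toy input (illustrative only)
        genes = {"geneA": "ATGGCGTACGTTAGC", "geneB": "ATGGCGTTTCGATCC"}
        markers = unique_marker_table(genes, k=6)
        print(len(markers), "unique 6-mers, e.g.", sorted(markers)[0])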

  20. Subject-specific computer simulation model for determining elbow loading in one-handed tennis backhand groundstrokes.

    Science.gov (United States)

    King, Mark A; Glynn, Jonathan A; Mitchell, Sean R

    2011-11-01

    A subject-specific angle-driven computer model of a tennis player, combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts, was developed to determine the effect of ball-racket impacts on loading at the elbow for one-handed backhand groundstrokes. Matching subject-specific computer simulations of a typical topspin/slice one-handed backhand groundstroke performed by an elite tennis player were carried out, with small root mean square differences between performance and matching simulations; the difference in elbow loading between the topspin and slice one-handed backhand groundstrokes is relatively small. In this study, the relatively small differences in elbow loading may be due to comparable angle-time histories at the wrist and elbow joints, with the major kinematic differences occurring at the shoulder. Using a subject-specific angle-driven computer model combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts allows peak internal loading, net impulse, and shock due to ball-racket impact to be calculated, which would not otherwise be possible without impractical invasive techniques. This study provides a basis for further investigation of the factors that may increase elbow loading during tennis strokes.

  1. A novel patient-specific model to compute coronary fractional flow reserve.

    Science.gov (United States)

    Kwon, Soon-Sung; Chung, Eui-Chul; Park, Jin-Seo; Kim, Gook-Tae; Kim, Jun-Woo; Kim, Keun-Hong; Shin, Eun-Seok; Shim, Eun Bo

    2014-09-01

    The fractional flow reserve (FFR) is a widely used clinical index to evaluate the functional severity of coronary stenosis. A computer simulation method based on patients' computed tomography (CT) data is a plausible non-invasive approach for computing the FFR. This method can provide a detailed solution for the stenosed coronary hemodynamics by coupling computational fluid dynamics (CFD) with the lumped parameter model (LPM) of the cardiovascular system. In this work, we have implemented a simple computational method to compute the FFR. As this method uses only coronary arteries for the CFD model and includes only the LPM of the coronary vascular system, it provides simpler boundary conditions for the coronary geometry and is computationally more efficient than existing approaches. To test the efficacy of this method, we simulated a three-dimensional straight vessel using CFD coupled with the LPM. The computed results were compared with those of the LPM. To validate this method in terms of clinically realistic geometry, a patient-specific model of stenosed coronary arteries was constructed from CT images, and the computed FFR was compared with clinically measured results. We evaluated the effect of a model aorta on the computed FFR and compared this with a model without the aorta. Computationally, the model without the aorta was more efficient than that with the aorta, reducing the CPU time required for computing a cardiac cycle to 43.4%.
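
    Once the coupled CFD-LPM solution is available, the FFR itself reduces to the ratio of cycle-averaged pressures distal and proximal to the stenosis under maximal hyperemia. A minimal post-processing sketch in Python; the pressure traces are synthetic stand-ins for CFD output, not patient data.

        import numpy as np

        t = np.linspace(0.0, 1.0, 1000)                          # one cardiac cycle (s)
        p_aortic = 100 + 20 * np.sin(2 * np.pi * t)              # mmHg, proximal pressure Pa
        p_distal = 0.82 * p_aortic - 2 * np.sin(4 * np.pi * t)   # mmHg, distal pressure Pd

        # FFR = cycle-averaged Pd / cycle-averaged Pa (uniform time steps, so means suffice)
        ffr = p_distal.mean() / p_aortic.mean()
        print(f"FFR = {ffr:.2f} (values below ~0.80 are commonly read as functionally significant)")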

  2. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    Science.gov (United States)

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure remains a challenge in the clinical setting for each new case. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and is able to predict global airflow quantities, as well as local tissue aeration and strains, for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing us to simulate EIT images based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model. Thus they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results with the patient measurements. First promising results

  3. Developments of the general computer network of NIPNE-HH

    International Nuclear Information System (INIS)

    Mirica, M.; Constantinescu, S.; Danet, A.

    1997-01-01

    Since 1991, the general computer network of NIPNE-HH has been developed and connected to the RNCN (Romanian National Computer Network) for research and development, and it offers the Romanian physics research community an efficient and cost-effective infrastructure to communicate and collaborate with fellow researchers abroad, and to collect and exchange the most up-to-date information in their research area. RNCN targets the following main objectives: - setting up a technical and organizational infrastructure meant to provide national and international electronic services for the Romanian scientific research community; - providing a rapid and competitive tool for the exchange of information in the framework of the Research and Development (R-D) community; - using the scientific and technical databases available in the country and offered by the national networks of other countries through international networks; - providing support for information and scientific and technical co-operation. RNCN has two international links: to EBONE via ACONET (64 kbps) and to EuropaNET via Hungarnet (64 kbps). The guiding principle in designing the project of the general computer network of NIPNE-HH, as part of RNCN, was to implement an open system based on OSI standards, taking into account the following criteria: - development of a flexible solution, according to OSI specifications; - reliable gateway solutions to the existing network already in use, allowing access to worldwide networks; - use of the TCP/IP transport protocol for each Local Area Network (LAN) and for the connection to RNCN; - ensuring the integration of different and heterogeneous software and hardware platforms (DOS, Windows, UNIX, VMS, Linux, etc.) through specific interfaces. The major objectives achieved in the direction of developing the general computer network of NIPNE-HH are: - linking all the existing and newly installed computer equipment and providing adequate connectivity. LANs from departments

  4. Computer-Aided Sensor Development Focused on Security Issues

    Directory of Open Access Journals (Sweden)

    Andrzej Bialas

    2016-05-01

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.

  5. Computer-Aided Sensor Development Focused on Security Issues.

    Science.gov (United States)

    Bialas, Andrzej

    2016-05-26

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.

  6. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Requirement Specifications for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission... issuing a revised regulatory guide (RG), revision 1 of RG 1.172, ``Software Requirement Specifications for...

  7. Development of personnel exposure management system with personal computer

    International Nuclear Information System (INIS)

    Yamato, Ichiro; Yamamoto, Toshiki

    1992-01-01

    In nuclear power plants, large-scale personnel exposure management systems have been developed and established by utilities. Though they share a common basis, the implementations are plant-specific. Contractors must control their workers' exposures by their own methods and systems. To comply with the utilities' parental systems, contractors' systems tend to differ by plant, thus making it difficult for contractors to design a standard system that is common to all relevant plants. Under these circumstances, however, we have developed a system which is applicable to various customer utilities with minimal variations, using personal computers with database management and data communication software, at relatively low cost. We hope that this system will develop into the standard model for all Japanese contractors' personnel exposure management systems. (author)

  8. Latest developments for a computer aided thermohydraulic network

    International Nuclear Information System (INIS)

    Alemberti, A.; Graziosi, G.; Mini, G.; Susco, M.

    1999-01-01

    Thermohydraulic networks are 1-D systems characterized by a small number of basic components (pumps, valves, heat exchangers, etc.) connected by pipes and limited spatially by a defined number of boundary conditions (tanks, atmosphere, etc.). The network system is simulated by the well-known computer program RELAP5/MOD3. Information concerning the network geometry, component behaviour, and initial and boundary conditions is usually supplied to the RELAP5 code in an ASCII input file by means of 'input cards'. CATNET (Computer Aided Thermalhydraulic NETwork) is a graphical user interface that, under specific user guidelines which completely define its range of applicability, permits a very high level of standardization and simplification of the RELAP5/MOD3 input deck development process as well as of the output processing. The characteristics of the components (pipes, valves, pumps, etc.) defining the network system can be entered through CATNET. The CATNET interface provides special functions to compute form losses in the most typical bending and branching configurations. When the input of all system components is ready, CATNET is able to generate the RELAP5/MOD3 input file. Finally, by means of CATNET, the RELAP5/MOD3 code can be run and its output results can be transformed into an intuitive display form. The paper presents an example of application of the CATNET interface as well as the latest developments, which greatly simplified the work of the users and allowed the possibility of input errors to be reduced. (authors)
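
    The core of such an interface, turning component descriptions entered through the GUI into numbered ASCII input cards, can be sketched generically in Python. The card layout below is a hypothetical illustration of the idea, not the actual RELAP5/MOD3 card syntax.

        def make_cards(card_base, component, rows):
            """Emit numbered ASCII input cards for one component
            (hypothetical layout, not real RELAP5/MOD3 syntax)."""
            lines = [f"* component: {component}"]
            for i, values in enumerate(rows, start=1):
                lines.append(f"{card_base + i:>8d}  " + "  ".join(str(v) for v in values))
            return "\n".join(lines)

        # Hypothetical pipe component: (flow area m^2, length m, roughness m) per segment
        print(make_cards(1010000, "pipe-101",
                         [(0.05, 2.0, 4.5e-5), (0.05, 1.5, 4.5e-5)]))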

  9. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    Science.gov (United States)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. The Information Management System hardware configuration, as it directly influences the executive design, is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  10. Frequency of educational computer use as a longitudinal predictor of educational outcome in young people with specific language impairment.

    Directory of Open Access Journals (Sweden)

    Kevin Durkin

    Computer use draws on linguistic abilities. Using this medium thus presents challenges for young people with Specific Language Impairment (SLI) and raises questions of whether computer-based tasks are appropriate for them. We consider theoretical arguments predicting impaired performance and negative outcomes relative to peers without SLI versus the possibility of positive gains. We examine the relationship between frequency of computer use (for leisure and educational purposes) and educational achievement; in particular, examination performance at the end of compulsory education and level of educational progress two years later. Participants were 49 young people with SLI and 56 typically developing (TD) young people. At around age 17, the two groups did not differ in frequency of educational computer use or leisure computer use. There were no associations between computer use and educational outcomes in the TD group. In the SLI group, after PIQ was controlled for, educational computer use at around 17 years of age contributed substantially to the prediction of educational progress at 19 years. The findings suggest that educational uses of computers are conducive to educational progress in young people with SLI.

  11. Ada Integrated Environment II Computer Program Development Specification. Part 1.

    Science.gov (United States)

    1981-12-01

    [Requirements matrix excerpt] "Programmable" access controls (3.2.5.5); provision for privileged user (3.2.5.6); KDBS: capability to archive data base (3.2.5.7); Phase I SOW requirements traced to the A-spec and B5-spec (compiler, linker); MAPSE shall include a mechanism for automatic stub... 3.2.5.5 Process Administrator: The Process Administrator controls the executions of logically concurrent MAPSE processes. The KFW Interface Package

  12. Computer Program Development Specification for Tactical Interface System.

    Science.gov (United States)

    1981-07-31

    [Hardware configuration residue: one console CRT and twelve VT100 video terminals, a card reader, six LA120 hard-copy terminals, a vector graphics terminal, and disk storage.] ...this data into the TIS parameter tables in the TISGBL common area. ICEHANDL will send test interface ICE to PSS in one of two modes: periodically... STOP causes the TIS software to exit. (SDSS-MMP-B1, 31 July 1981) TCL commands authorized

  13. A way forward for the development of an exposure computational model to computed tomography dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, C.C., E-mail: cassio.c.ferreira@gmail.co [Nucleo de Fisica, Universidade Federal de Sergipe, Itabaiana-SE, CEP 49500-000 (Brazil); Galvao, L.A., E-mail: lailagalmeida@gmail.co [Departamento de Fisica, Universidade Federal de Sergipe, Sao Cristovao-SE, CEP 49100-000 (Brazil); Vieira, J.W., E-mail: jose.wilson59@uol.com.b [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco, Recife-PE, CEP 50740-540 (Brazil); Escola Politecnica de Pernambuco, Universidade de Pernambuco, Recife-PE, CEP 50720-001 (Brazil); Maia, A.F., E-mail: afmaia@ufs.b [Departamento de Fisica, Universidade Federal de Sergipe, Sao Cristovao-SE, CEP 49100-000 (Brazil)

    2011-04-15

    A way forward for the development of an exposure computational model for computed tomography dosimetry has been presented. In this way, an exposure computational model (ECM) for computed tomography (CT) dosimetry has been developed and validated through comparison with experimental results. For the development of the ECM, X-ray spectra generator codes have been evaluated and the head bow tie filter has been modelled through a mathematical equation. EGS4 and EGSnrc have been used for simulating the radiation transport by the ECM. Geometrical phantoms, commonly used in CT dosimetry, have been modelled with the IDN software. MAX06 has also been used to simulate an adult male patient submitted to CT examinations. The evaluation of the X-ray spectra generator codes in CT dosimetry showed dependence on tube filtration (or HVL value). More generally, with increasing total filtration (or HVL value), X-raytbc becomes the best X-ray spectra generator code for CT dosimetry. The EGSnrc/X-raytbc combination calculated C100,c in better concordance with the C100,c measured in two different CT scanners. For a Toshiba CT scanner, the average percentage difference between the calculated and measured C100,c values was 8.2%, whilst for a GE CT scanner the average percentage difference was 10.4%. From measurements of air kerma through a prototype head bow tie filter, a third-order exponential decay equation was found. C100,c and C100,p values calculated by the ECM are in good agreement with values measured on a specific CT scanner. A maximum percentage difference of 2% has been found in the PMMA CT head phantoms, demonstrating effective modelling of the head bow tie filter by the equation. The absorbed and effective doses calculated by the ECM developed in this work have been compared to those calculated by the ECM of Jones and Shrimpton for an adult male patient. For a head examination the absorbed dose values calculated by the
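
    The "third-order exponential decay" fitted to the bow tie filter air-kerma measurements presumably takes the standard three-term form, sketched here in LaTeX. The amplitudes A_i, decay constants t_i, and offset K_0 are generic fit parameters rather than the published values, and \theta denotes the fan angle:

        K(\theta) = K_0 + A_1 e^{-\theta/t_1} + A_2 e^{-\theta/t_2} + A_3 e^{-\theta/t_3}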

  14. Preliminary Phase Field Computational Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, Shenyang Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xu, Ke [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suter, Jonathan D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCloy, John S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Bradley R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using the monocrystalline Fe (i.e., ferrite) film as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in
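
    The Landau-Lifshitz-Gilbert equation that these phase-field models solve numerically has the standard Gilbert form, with magnetization M, effective field H_eff, gyromagnetic ratio \gamma, damping constant \alpha, and saturation magnetization M_s:

        \frac{\partial \mathbf{M}}{\partial t}
          = -\gamma\, \mathbf{M} \times \mathbf{H}_{\mathrm{eff}}
            + \frac{\alpha}{M_s}\, \mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t}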

  15. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

    OpenAIRE

    Witt, Hendrik

    2007-01-01

    The research presented in this thesis examines user interfaces for wearable computers. Wearable computers are a special kind of mobile computers that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can. The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting softw...

  16. Impact of Computer Aided Learning on Children with Specific Learning Disabilities

    OpenAIRE

    The Spastic Society of Karnataka, Bangalore

    2004-01-01

    Study conducted by The Spastics Society of Karnataka on behalf of Azim Premji Foundation to assess the effectiveness of computers in enhancing learning for children with specific learning disabilities. Azim Premji Foundation is not liable for any direct or indirect loss or damage whatsoever arising from the use or access of any information, interpretation and conclusions that may be printed in this report.

  17. Exploring gender differences on general and specific computer self-efficacy in mobile learning adoption

    OpenAIRE

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi; Kibelloh, Mboni

    2014-01-01

    Explanations for the contradictory findings regarding the moderating effect of gender on computer self-efficacy in the adoption of e-learning/mobile learning are limited. Recognizing the multilevel nature of computer self-efficacy (CSE), this study attempts to explore gender differences in the adoption of mobile learning by extending the Technology Acceptance Model (TAM) with general and specific CSE. Data collected from 137 university students were tested against the research model using the structur...

  18. Web Program for Development of GUIs for Cluster Computers

    Science.gov (United States)

    Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward

    2003-01-01

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.

  19. GPU-accelerated Lattice Boltzmann method for anatomical extraction in patient-specific computational hemodynamics

    Science.gov (United States)

    Yu, H.; Wang, Z.; Zhang, C.; Chen, N.; Zhao, Y.; Sawchuk, A. P.; Dalsing, M. C.; Teague, S. D.; Cheng, Y.

    2014-11-01

    Existing research in patient-specific computational hemodynamics (PSCH) relies heavily on software for the anatomical extraction of blood arteries. Data reconstruction and mesh generation have to be done using existing commercial software because of the gap between medical image processing and CFD, which increases the computational burden and introduces inaccuracy during data transformation, thus limiting the medical applications of PSCH. We use the lattice Boltzmann method (LBM) to solve the level-set equation over an Eulerian distance field and implicitly and dynamically segment the artery surfaces from radiological CT/MRI imaging data. The segments feed seamlessly into the LBM-based CFD computation of PSCH, so explicit mesh construction and extra data management are avoided. The LBM is ideally suited for GPU (graphics processing unit)-based parallel computing, and the parallel acceleration on GPU achieves excellent performance in PSCH computation. An application study will be presented which segments an aortic artery from a chest CT dataset and models the PSCH of the segmented artery.
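
    A rough sketch of the level-set evolution idea behind this record follows, using a standard first-order upwind finite-difference update on a signed distance field rather than the paper's lattice Boltzmann scheme; the grid, speed field and parameters are illustrative.

```python
import numpy as np

def evolve_level_set(phi, speed, dt=0.4, steps=50):
    """Evolve phi_t + F*|grad phi| = 0 with a first-order Godunov upwind scheme.

    phi   : 2-D signed distance field (negative inside the front)
    speed : 2-D speed field F (in segmentation, derived from image intensity)
    Note: np.roll wraps at the boundary, so keep the front away from edges.
    """
    for _ in range(steps):
        # One-sided differences (grid spacing h = 1).
        dxm = phi - np.roll(phi, 1, axis=1)   # backward x
        dxp = np.roll(phi, -1, axis=1) - phi  # forward  x
        dym = phi - np.roll(phi, 1, axis=0)   # backward y
        dyp = np.roll(phi, -1, axis=0) - phi  # forward  y

        # Godunov upwind gradient magnitudes for F > 0 and F < 0.
        grad_plus = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                            np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        grad_minus = np.sqrt(np.minimum(dxm, 0)**2 + np.maximum(dxp, 0)**2 +
                             np.minimum(dym, 0)**2 + np.maximum(dyp, 0)**2)

        phi = phi - dt * (np.maximum(speed, 0) * grad_plus +
                          np.minimum(speed, 0) * grad_minus)
    return phi

if __name__ == "__main__":
    y, x = np.mgrid[0:64, 0:64]
    phi0 = np.sqrt((x - 32)**2 + (y - 32)**2) - 5.0  # small circle at centre
    F = np.ones((64, 64))                            # uniform outward growth
    phi = evolve_level_set(phi0, F)
    print("expected front radius ~", 5.0 + 0.4 * 50)  # grows at speed F
```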

  20. STEEP4 code for computation of specific thermonuclear reaction rates from pointwise cross sections

    International Nuclear Information System (INIS)

    Harris, D.R.; Dei, D.E.; Husseiny, A.A.; Sabri, Z.A.; Hale, G.M.

    1976-05-01

    A code module, STEEP4, is developed to calculate the fusion reaction rates in terms of the specific reactivity ⟨σv⟩, which is the product of cross section and relative velocity averaged over the actual ion distributions of the interacting particles in the plasma. The module is structured in a way suitable for incorporation in thermonuclear burn codes to provide rapid and yet relatively accurate on-line computation of ⟨σv⟩ as a function of plasma parameters. Ion distributions are modified to include slowing-down contributions which are characterized in terms of plasma parameters. Rapid and accurate algorithms are used for integrating ⟨σv⟩ from cross sections and spectra. The main program solves for ⟨σv⟩ by the method of steepest descent. However, options are provided to use Gauss-Hermite and dense trapezoidal quadrature integration techniques. Options are also provided for rapid calculation of screening effects on specific reaction rates. Although such effects are not significant in cases of plasmas of laboratory interest, the options are included to increase the range of applicability of the code. Gamow penetration form, log-log interpolation, and cubic interpolation routines are included to provide the interpolated values of cross sections
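
    A hedged sketch of the Maxwellian-averaged ⟨σv⟩ integral that codes of this kind evaluate is shown below, using simple trapezoidal quadrature over a pointwise cross-section table; it uses a toy cross-section shape, not STEEP4's steepest-descent method or evaluated nuclear data.

```python
import numpy as np

M_U = 931.494e6  # one atomic mass unit in eV/c^2
C_CM = 2.998e10  # speed of light in cm/s

def sigma_v_maxwellian(e_ev, sigma_cm2, kt_ev, mu_amu):
    """Maxwellian-averaged <sigma*v> from a pointwise cross-section table:

        <sigma*v> = sqrt(8/(pi*mu)) * (kT)^(-3/2) * integral sigma(E)*E*exp(-E/kT) dE

    with E the centre-of-mass energy and mu the reduced mass. Returns cm^3/s.
    """
    mu = mu_amu * M_U / C_CM**2  # reduced mass in eV*s^2/cm^2
    integrand = sigma_cm2 * e_ev * np.exp(-e_ev / kt_ev)
    return np.sqrt(8.0 / (np.pi * mu)) * kt_ev**-1.5 * np.trapz(integrand, e_ev)

if __name__ == "__main__":
    # Toy demonstration only: an artificial Gamow-like cross-section shape
    # for a D-T-like reduced mass of 1.2 amu at kT = 10 keV.
    E = np.linspace(1e3, 1e6, 5000)              # CM energy grid (eV)
    sigma = 1e-24 * np.exp(-np.sqrt(1.0e6 / E))  # cm^2, illustrative
    print(sigma_v_maxwellian(E, sigma, kt_ev=1e4, mu_amu=1.2))
```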

  1. Assessment of CT dose to the fetus and pregnant female patient using patient-specific computational models

    DEFF Research Database (Denmark)

    Xie, Tianwu; Poletti, Pierre-Alexandre; Platon, Alexandra

    2018-01-01

    of pregnant patients and the embedded foetus, we developed a methodology for construction of patient-specific voxel-based computational phantoms based on existing standardised hybrid computational pregnant female phantoms. We estimated the maternal absorbed dose and foetal organ dose for 30 pregnant patients...... for assessment of the radiation risks to pregnant patients and the foetus from various CT scanning protocols, thus guiding the decision-making process. KEY POINTS: • In CT examinations, the absorbed dose is non-uniformly distributed within foetal organs. • This work reports, for the first time, estimates...

  2. Finding-specific display presets for computed radiography soft-copy reading.

    Science.gov (United States)

    Andriole, K P; Gould, R G; Webb, W R

    1999-05-01

    Much work has been done to optimize the display of cross-sectional modality imaging examinations for soft-copy reading (i.e., window/level tissue presets, and format presentations such as tile and stack modes, four-on-one, nine-on-one, etc). Less attention has been paid to the display of digital forms of the conventional projection x-ray. The purpose of this study is to assess the utility of providing presets for computed radiography (CR) soft-copy display, based not on the window/level settings, but on processing applied to the image optimized for visualization of specific findings, pathologies, etc (i.e., pneumothorax, tumor, tube location). It is felt that digital display of CR images based on finding-specific processing presets has the potential to: speed reading of digital projection x-ray examinations on soft copy; improve diagnostic efficacy; standardize display across examination type, clinical scenario, important key findings, and significant negatives; facilitate image comparison; and improve confidence in and acceptance of soft-copy reading. Clinical chest images are acquired using an Agfa-Gevaert (Mortsel, Belgium) ADC 70 CR scanner and Fuji (Stamford, CT) 9000 and AC2 CR scanners. Those demonstrating pertinent findings are transferred over the clinical picture archiving and communications system (PACS) network to a research image processing station (Agfa PS5000), where the optimal image-processing settings per finding, pathologic category, etc, are developed in conjunction with a thoracic radiologist, by manipulating the multiscale image contrast amplification (Agfa MUSICA) algorithm parameters. Soft-copy display of images processed with finding-specific settings are compared with the standard default image presentation for 50 cases of each category. Comparison is scored using a 5-point scale with the positive scale denoting the standard presentation is preferred over the finding-specific processing, the negative scale denoting the finding-specific
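
    As a loose, hypothetical analogy to finding-specific presets, the sketch below bundles named display parameters (window/level plus an edge-enhancement strength) and applies them to an image array; the study itself tuned the proprietary Agfa MUSICA multiscale parameters, for which a simple unsharp mask stands in here.

```python
import numpy as np

# Hypothetical named presets: each bundles a window/level setting with an
# edge-enhancement strength. Values are illustrative, not from the study.
PRESETS = {
    "pneumothorax":  {"window": 1200, "level": 600, "edge_boost": 1.5},
    "tube_location": {"window": 2000, "level": 900, "edge_boost": 2.0},
    "default":       {"window": 1600, "level": 800, "edge_boost": 0.0},
}

def apply_preset(image, name):
    p = PRESETS[name]
    lo = p["level"] - p["window"] / 2.0
    # Window/level: map [lo, lo + window] to [0, 1], clipping outside.
    out = np.clip((image - lo) / p["window"], 0.0, 1.0)
    if p["edge_boost"] > 0:
        # Cheap unsharp mask using a 3x3 box blur as the smoothing kernel.
        pad = np.pad(out, 1, mode="edge")
        blur = sum(np.roll(np.roll(pad, i, 0), j, 1)[1:-1, 1:-1]
                   for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
        out = np.clip(out + p["edge_boost"] * (out - blur), 0.0, 1.0)
    return out

if __name__ == "__main__":
    cr_image = np.random.default_rng(0).uniform(0, 4095, (256, 256))
    display = apply_preset(cr_image, "pneumothorax")
    print(display.min(), display.max())  # normalized display values in [0, 1]
```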

  3. Cloud Computing: Key to IT Development in West Africa | Nwabuonu ...

    African Journals Online (AJOL)

    It has been established that Information Technology (IT) Development in West Africa has faced lots of challenges, ranging from Cyber Threat to inadequate IT Infrastructure. Cloud Computing is a Revolution. It is creating a fundamental change in Computer Architecture, Software and Tools Development in the way we Store, ...

  4. Isolation of developing secondary xylem specific cellulose synthase ...

    Indian Academy of Sciences (India)

    The present study aimed at identifying developing secondary xylem specific cellulose synthase genes from .... the First strand cDNA synthesis kit (Fermentas, Pittsburgh,. USA). .... ing height of the rooted cutting, girth of the stem, leaf area.

  5. Developing criteria for performance-based concrete specifications.

    Science.gov (United States)

    2013-07-01

    For more than 50 years now, concrete technology has advanced, but CDOT specifications for durability have remained mostly unchanged. The minimum cement content for a given strength is derived from mix design guidelines that were developed before ...

  6. ATLAS computing activities and developments in the Italian Grid cloud

    International Nuclear Information System (INIS)

    Rinaldi, L; Ciocca, C; K, M; Annovi, A; Antonelli, M; Martini, A; Barberis, D; Brunengo, A; Corosu, M; Barberis, S; Carminati, L; Campana, S; Di, A; Capone, V; Carlino, G; Doria, A; Esposito, R; Merola, L; De, A; Luminari, L

    2012-01-01

    The large amount of data produced by the ATLAS experiment needs new computing paradigms for data processing and analysis, which involve many computing centres spread around the world. The computing workload is managed by regional federations, called “clouds”. The Italian cloud consists of a main (Tier-1) center, located in Bologna, four secondary (Tier-2) centers, and a few smaller (Tier-3) sites. In this contribution we describe the Italian cloud facilities and the activities of data processing, analysis, simulation and software development performed within the cloud, and we discuss the tests of the new computing technologies contributing to evolution of the ATLAS Computing Model.

  7. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  8. Craniofacial reconstruction using patient-specific implants polyether ether ketone with computer-assisted planning.

    Science.gov (United States)

    Manrique, Oscar J; Lalezarzadeh, Frank; Dayan, Erez; Shin, Joseph; Buchbinder, Daniel; Smith, Mark

    2015-05-01

    Reconstruction of bony craniofacial defects requires precise understanding of the anatomic relationships. The ideal reconstructive technique should be fast as well as economical, with minimal donor-site morbidity, and provide a lasting and aesthetically pleasing result. There are some circumstances in which a patient's own tissue is not sufficient to reconstruct defects. The development of sophisticated software has facilitated the manufacturing of patient-specific implants (PSIs). The aim of this study was to analyze the utility of polyether ether ketone (PEEK) PSIs for craniofacial reconstruction. We performed a retrospective chart review from July 2009 to July 2013 in patients who underwent craniofacial reconstruction using PEEK-PSIs using a virtual process based on computer-aided design and computer-aided manufacturing. A total of 6 patients were identified. The mean age was 46 years (16-68 y). Operative indications included cancer (n = 4), congenital deformities (n = 1), and infection (n = 1). The mean surgical time was 3.7 hours and the mean hospital stay was 1.5 days. The mean surface area of the defect was 93.4 ± 43.26 cm², the mean implant cost was $8493 ± $837.95, and the mean time required to manufacture the implants was 2 weeks. No major or minor complications were seen during the 4-year follow-up. We found PEEK implants to be useful in the reconstruction of complex calvarial defects, demonstrating a low complication rate, good outcomes, and high patient satisfaction in this small series of patients. Polyether ether ketone implants show promising potential and warrant further study to better establish the role of this technology in cranial reconstruction.

  9. Results from Development of Model Specifications for Multifamily Energy Retrofits

    Energy Technology Data Exchange (ETDEWEB)

    Brozyna, K.

    2012-08-01

    Specifications, modeled after CSI MasterFormat, provide trade contractors and builders with requirements and recommendations on specific building materials, components and industry practices that comply with the expectations and intent of the requirements within the various funding programs associated with a project. The goal is to create a greater level of consistency in the execution of energy-efficiency retrofit measures across the multiple regions in which a developer may work. IBACOS and Mercy Housing developed sample model specifications based on a common building construction type that Mercy Housing encounters.

  10. Results From Development of Model Specifications for Multifamily Energy Retrofits

    Energy Technology Data Exchange (ETDEWEB)

    Brozyna, Kevin [IBACOS, Inc., Pittsburgh, PA (United States)

    2012-08-01

    Specifications, modeled after CSI MasterFormat, provide trade contractors and builders with requirements and recommendations on specific building materials, components and industry practices that comply with the expectations and intent of the requirements within the various funding programs associated with a project. The goal is to create a greater level of consistency in the execution of energy-efficiency retrofit measures across the multiple regions in which a developer may work. IBACOS and Mercy Housing developed sample model specifications based on a common building construction type that Mercy Housing encounters.

  11. Development of a small-scale computer cluster

    Science.gov (United States)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

    An increase in demand for computing power in academia has created a need for high-performance machines. The computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers, with the proper software, can multiply the performance of a single computer. Cluster computing has therefore become a much sought-after technology. Typical desktop computers could be used for cluster computing, but they are not intended for constant full-speed operation and take up more space than rack-mount servers. Specialty computers designed for cluster use meet high-availability and space requirements, but can be costly. A market segment exists where custom-built desktop computers can be arranged in a rack-mount configuration, gaining the space savings of traditional rack-mount computers while remaining cost-effective. To explore these possibilities, an experiment was performed to develop a computing cluster from desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components, multiplying the performance of a single desktop machine while minimizing occupied space and remaining cost-effective.

  12. Wide-angle display developments by computer graphics

    Science.gov (United States)

    Fetter, William A.

    1989-01-01

    Computer graphics can now expand its new subset, wide-angle projection, to be as significant a generic capability as computer graphics itself. Some prior work in computer graphics is presented which leads to an attractive further subset of wide-angle projection, called hemispheric projection, as a major communication medium. Hemispheric film systems have long been present, and such computer graphics systems are in use in simulators. This is the leading edge of capabilities which should ultimately be as ubiquitous as CRTs (cathode-ray tubes). These assertions derive not from degrees in science, nor only from a degree in graphic design, but from a history of computer graphics innovations, laying groundwork by demonstration. The author believes that it is timely to look at several development strategies, since hemispheric projection is now at a point comparable to the early stages of computer graphics, requiring similar patterns of development again.

  13. Cloud Computing-An Ultimate Technique to Minimize Computing cost for Developing Countries

    OpenAIRE

    Narendra Kumar; Shikha Jain

    2012-01-01

    The presented paper deals with how remotely managed computing and IT resources can be beneficial in developing countries like India and other countries of the Asian subcontinent. This paper not only defines the architectures and functionalities of cloud computing but also strongly indicates the current demand for cloud computing to achieve organizational and personal levels of IT support at very minimal cost with high flexibility. The power of the cloud can be used to reduce the cost of IT - r...

  14. Formulation, development and evaluation of colon-specific ketorolac ...

    African Journals Online (AJOL)

    The major intention in formulating and developing colon-targeted tablets is to improve therapeutic efficacy by increasing therapeutic drug concentrations in the colon. The present study aimed to develop guar gum compression-coated tablets of ketorolac tromethamine to achieve colon-specific drug release. In this study ...

  15. Computing for Lattice QCD: new developments from the APE experiment

    Energy Technology Data Exchange (ETDEWEB)

    Ammendola, R [INFN, Sezione di Roma Tor Vergata, Roma (Italy); Biagioni, A; De Luca, S [INFN, Sezione di Roma, Roma (Italy)

    2008-06-15

    As Lattice QCD develops improved techniques to shed light on new physics, it demands increasing computing power. The aim of the current APE (Array Processor Experiment) project is to provide the reference computing platform to the Lattice QCD community for the period 2009-2011. We present the project proposal for a petaflops-range supercomputing center with high performance and low maintenance costs, to be delivered starting from 2010.

  16. Computing for Lattice QCD: new developments from the APE experiment

    International Nuclear Information System (INIS)

    Ammendola, R.; Biagioni, A.; De Luca, S.

    2008-01-01

    As Lattice QCD develops improved techniques to shed light on new physics, it demands increasing computing power. The aim of the current APE (Array Processor Experiment) project is to provide the reference computing platform to the Lattice QCD community for the period 2009-2011. We present the project proposal for a petaflops-range supercomputing center with high performance and low maintenance costs, to be delivered starting from 2010.

  17. Disease-Specific Care: Spine Surgery Program Development.

    Science.gov (United States)

    Koerner, Katie; Franker, Lauren; Douglas, Barbara; Medero, Edgardo; Bromeland, Jennifer

    2017-10-01

    Minimal literature exists describing the process for development of a Joint Commission comprehensive spine surgery program within a community hospital health system. Components of a comprehensive program include structured communication across care settings, preoperative education, quality outcomes tracking, and patient follow-up. Organizations obtaining disease-specific certification must have clear knowledge of the planning, time, and overall commitment, essential to developing a successful program. Health systems benefit from disease-specific certification because of their commitment to a higher standard of service. Certification standards establish a framework for organizational structure and management and provide institutions a competitive edge in the marketplace. A framework for the development of a spine surgery program is described to help guide organizations seeking disease-specific certification. In developing a comprehensive program, it is critical to define the program's mission and vision, identify key stakeholders, implement clinical practice guidelines, and evaluate program outcomes.

  18. Tissue-type-specific transcriptome analysis identifies developing xylem-specific promoters in poplar.

    Science.gov (United States)

    Ko, Jae-Heung; Kim, Hyun-Tae; Hwang, Ildoo; Han, Kyung-Hwan

    2012-06-01

    Plant biotechnology offers a means to create novel phenotypes. However, commercial application of biotechnology in crop improvement programmes is severely hindered by the lack of utility promoters (or freedom to operate the existing ones) that can drive gene expression in a tissue-specific or temporally controlled manner. Woody biomass is gaining popularity as a source of fermentable sugars for liquid fuel production. To improve the quantity and quality of woody biomass, developing xylem (DX)-specific modification of the feedstock is highly desirable. To develop utility promoters that can drive transgene expression in a DX-specific manner, we used the Affymetrix Poplar Genome Arrays to obtain tissue-type-specific transcriptomes from poplar stems. Subsequent bioinformatics analysis identified 37 transcripts that are specifically or strongly expressed in DX cells of poplar. After further confirmation of their DX-specific expression using semi-quantitative PCR, we selected four genes (DX5, DX8, DX11 and DX15) for in vivo confirmation of their tissue-specific expression in transgenic poplars. The promoter regions of the selected DX genes were isolated and fused to a β-glucuronidase (GUS) reporter gene in a binary vector. This construct was used to produce transgenic poplars via Agrobacterium-mediated transformation. The GUS expression patterns of the resulting transgenic plants showed that these promoters were active in the xylem cells at early seedling growth and showed the strongest expression in the developing xylem cells at later growth stages of poplar. We conclude that these DX promoters can be used as utility promoters for DX-specific biomass engineering.

  19. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The benefits expected are discussed by a number of recognized, domain-specific experts, with a common theme being computing as solution enabler. This book is the first document providing such a representative up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  20. Computer-aided preparation of specifications for radial fans at VEB Lufttechnische Anlagen Berlin

    Energy Technology Data Exchange (ETDEWEB)

    Kubis, R.; Kull, W.

    1987-01-01

    The specification details the scope of delivery for radial fans on a standard page and also serves as preparation for production. In place of the previous manual preparation, a computer-aided technique for the office computer is presented that derives the technical parameters from data files using only the few input data needed to identify the fan type. The data files and evaluation programs are based on the software tool REDABAS and the SCP operating system. Using this technique it has been possible to cut the preparation time for incoming orders considerably.

  1. Communications, Computers and Automation for Development.

    Science.gov (United States)

    Pool, Ithiel de Sola; And Others

    This paper includes three articles dealing with the application of science and technology to national development. In part, the first article attempts to answer the following questions: 1) what will be the costs and effects of communication technology in the coming decade; 2) how can the elements of communication systems be examined in terms of…

  2. A Computational Model of Spatial Development

    Science.gov (United States)

    Hiraki, Kazuo; Sashima, Akio; Phillips, Steven

    Psychological experiments on children's development of spatial knowledge suggest experience at self-locomotion with visual tracking as important factors. Yet, the mechanism underlying development is unknown. We propose a robot that learns to mentally track a target object (i.e., maintaining a representation of an object's position when outside the field-of-view) as a model for spatial development. Mental tracking is considered as prediction of an object's position given the previous environmental state and motor commands, and the current environment state resulting from movement. Following Jordan & Rumelhart's (1992) forward modeling architecture, the system consists of two components: an inverse model of sensory input to desired motor commands; and a forward model of motor commands to desired sensory input (goals). The robot was tested on the 'three cups' paradigm (where children are required to select the cup containing the hidden object under various movement conditions). Consistent with child development, without the capacity for self-locomotion the robot's errors are self-center based. When given the ability of self-locomotion the robot responds allocentrically.

  3. Development of computer code in PNC, 3

    International Nuclear Information System (INIS)

    Ohtaki, Akira; Ohira, Hiroaki

    1990-01-01

    Super-COPD, a code integrated from calculation modules, has been developed by improving COPD in order to evaluate various kinds of dynamics of LMFBR plants. The code incorporates all models of COPD and their advanced versions in a module structure. The code makes it possible to simulate the system dynamics of LMFBR plants of any configuration and components. (author)

  4. Electronic Mail for Personal Computers: Development Issues.

    Science.gov (United States)

    Tomer, Christinger

    1994-01-01

    Examines competing, commercially developed electronic mail programs and how these technologies will affect the functionality and quality of electronic mail. How new standards for client-server mail systems are likely to enhance messaging capabilities and the use of electronic mail for information retrieval are considered. (Contains eight…

  5. Species-Specific Mechanisms of Neuron Subtype Specification Reveal Evolutionary Plasticity of Amniote Brain Development

    Directory of Open Access Journals (Sweden)

    Tadashi Nomura

    2018-03-01

    Full Text Available Summary: Highly ordered brain architectures in vertebrates consist of multiple neuron subtypes with specific neuronal connections. However, the origin of and evolutionary changes in neuron specification mechanisms remain unclear. Here, we report that regulatory mechanisms of neuron subtype specification are divergent in developing amniote brains. In the mammalian neocortex, the transcription factors (TFs Ctip2 and Satb2 are differentially expressed in layer-specific neurons. In contrast, these TFs are co-localized in reptilian and avian dorsal pallial neurons. Multi-potential progenitors that produce distinct neuronal subtypes commonly exist in the reptilian and avian dorsal pallium, whereas a cis-regulatory element of avian Ctip2 exhibits attenuated transcription suppressive activity. Furthermore, the neuronal subtypes distinguished by these TFs are not tightly associated with conserved neuronal connections among amniotes. Our findings reveal the evolutionary plasticity of regulatory gene functions that contribute to species differences in neuronal heterogeneity and connectivity in developing amniote brains. : Neuronal heterogeneity is essential for assembling intricate neuronal circuits. Nomura et al. find that species-specific transcriptional mechanisms underlie diversities of excitatory neuron subtypes in mammalian and non-mammalian brains. Species differences in neuronal subtypes and connections suggest functional plasticity of regulatory genes for neuronal specification during amniote brain evolution. Keywords: Ctip2, Satb2, multi-potential progenitors, transcriptional regulation, neuronal connectivity

  6. Development Of A Navier-Stokes Computer Code

    Science.gov (United States)

    Yoon, Seokkwan; Kwak, Dochan

    1993-01-01

    Report discusses aspects of development of CENS3D computer code, solving three-dimensional Navier-Stokes equations of compressible, viscous, unsteady flow. Implements implicit finite-difference or finite-volume numerical-integration scheme, called "lower-upper symmetric-Gauss-Seidel" (LU-SGS), offering potential for very low computer time per iteration and for fast convergence.

  7. Experiment-specific analyses in support of code development

    International Nuclear Information System (INIS)

    Ott, L.J.

    1990-01-01

    Experiment-specific models have been developed since 1986 by Oak Ridge National Laboratory Boiling Water Reactor (BWR) severe accident analysis programs for the purpose of BWR experimental planning and optimum interpretation of experimental results. These experiment-specific models have been applied to large integral tests (ergo, experiments) which start from an initial undamaged core state. The tests performed to date in BWR geometry have had significantly different-from-prototypic boundary and experimental conditions because of either normal facility limitations or specific experimental constraints. These experiments (ACRR: DF-4, NRU: FLHT-6, and CORA) were designed to obtain specific phenomenological information such as the degradation and interaction of prototypic components and the effects on melt progression of control-blade materials and channel boxes. Applications of ORNL models specific to the ACRR DF-4 and KfK CORA-16 experiments are discussed and significant findings from the experimental analyses are presented. 32 refs., 16 figs

  8. Development of a computer-aided digital reactivity computer system for PWRs

    International Nuclear Information System (INIS)

    Chung, S.-K.; Sung, K.-Y.; Kim, D.; Cho, D.-Y.

    1993-01-01

    Reactor physics tests at initial startup and after reloading are performed to verify the nuclear design and to ensure safe operation. Two kinds of reactivity computers, analog and digital, have been widely used in pressurized water reactor (PWR) core physics tests. The test data of both reactivity computers are displayed only on a strip chart recorder, and these data are managed by hand, so the accuracy of the test results depends on operator expertise and experience. This paper describes the development of the computer-aided digital reactivity computer system (DRCS), which is enhanced by system management software and an improved system for the application of the PWR core physics test

  9. Development of computer aided engineering system for TRAC applications

    International Nuclear Information System (INIS)

    Arai, Kenji; Itoya, Seihiro; Uematsu, Hitoshi; Tsunoyama, Shigeaki

    1990-01-01

    An advanced best-estimate computer program for nuclear reactor transient analysis, TRAC, has been extensively used to carry out various thermal-hydraulic calculations in the nuclear engineering field because of its versatility. To perform a wide variety of TRAC calculations efficiently, efficient utilization of computers and a convenient environment for input and output processing are necessary. We have applied a computer network comprising a supercomputer, engineering workstations and personal computers to TRAC calculations and have assigned the appropriate functions to each computer. We have also been developing an interactive graphics system for input and output processing on an EWS. This hardware and software environment can improve the effectiveness of TRAC utilization for various thermal-hydraulic calculations. (author)

  10. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2012-01-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but showed improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven's Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
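
    A minimal sketch of the sharp regression-discontinuity estimator this study's design relies on is shown below, run on synthetic data; the variable names and the simulated jump are illustrative, not the Romanian voucher data.

```python
import numpy as np

def rd_estimate(x, y, cutoff=0.0, bandwidth=1.0):
    """Sharp regression-discontinuity estimate: separate local linear fits
    on each side of the cutoff (uniform kernel), differenced at the cutoff."""
    left = (x >= cutoff - bandwidth) & (x < cutoff)
    right = (x >= cutoff) & (x <= cutoff + bandwidth)

    def value_at_cutoff(xs, ys):
        slope, intercept = np.polyfit(xs - cutoff, ys, 1)
        return intercept  # fitted outcome exactly at the cutoff

    return value_at_cutoff(x[right], y[right]) - value_at_cutoff(x[left], y[left])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(-2.0, 2.0, 4000)   # running variable, cutoff at 0
    d = (x >= 0).astype(float)         # treatment assigned above the cutoff
    y = 1.0 + 0.3 * x + 0.4 * d + rng.normal(0.0, 0.2, 4000)
    print(rd_estimate(x, y, bandwidth=0.5))  # recovers a jump of about 0.4
```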

  11. Development of a computational methodology for internal dose calculations

    International Nuclear Information System (INIS)

    Yoriyaz, Helio

    2000-01-01

    A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body and a more precise tool for the radiation transport simulation. The present technique shows the capability to build a patient-specific phantom from tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as the MCNP-4B code. In order to utilize the segmented human anatomy as a computational model for the simulation of radiation transport, an interface program, SCMS, was developed to build the geometric configurations for the phantom from tomographic images. This procedure allows the calculation not only of average dose values but also of the spatial distribution of dose in regions of interest. With the present methodology, absorbed fractions for photons and electrons in various organs of the Zubal segmented phantom were calculated and compared to those reported for the mathematical phantoms of Snyder and Cristy-Eckerman. Although the differences in organ geometry between the phantoms are quite evident, the results generally show small discrepancies; in some cases, however, considerable discrepancies were found, due to two major causes: differences in the organ masses between the phantoms, and the occurrence of organ overlap in the Zubal segmented phantom, which is not considered in the mathematical phantom. This effect was quite evident for organ cross-irradiation from electrons. The determination of the spatial dose distribution demonstrated the possibility of evaluating more detailed dose data than those obtained with conventional methods, providing important information for clinical analysis in therapeutic procedures and in radiobiological studies of the human body. (author)
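
    A deliberately crude sketch of the voxel-phantom Monte Carlo idea follows: straight-line photons with step-wise interaction sampling and purely local absorption, far simpler than the physics MCNP-4B models; the geometry and attenuation values are illustrative.

```python
import numpy as np

def toy_photon_dose(mu, n_photons=20000, e_photon=1.0, step=0.5, rng=None):
    """Crude voxel-phantom Monte Carlo: photons start at the grid centre
    with isotropic directions, ray-march in small steps, and deposit all
    their energy at the sampled first interaction site (no scattering).

    mu : 3-D array of linear attenuation coefficients (1/voxel-unit)
    Returns an energy-deposition tally with the same shape as mu.
    """
    rng = rng or np.random.default_rng(1)
    tally = np.zeros_like(mu)
    origin = (np.array(mu.shape) - 1) / 2.0
    for _ in range(n_photons):
        u = rng.normal(size=3)          # isotropic direction
        u /= np.linalg.norm(u)
        pos = origin.copy()
        while True:
            pos += step * u
            idx = tuple(np.round(pos).astype(int))
            if any(i < 0 or i >= n for i, n in zip(idx, mu.shape)):
                break                   # photon escaped the phantom
            if rng.random() < mu[idx] * step:
                tally[idx] += e_photon  # absorbed here
                break
    return tally

if __name__ == "__main__":
    phantom = np.full((21, 21, 21), 0.02)  # soft-tissue-like background
    phantom[8:13, 8:13, 8:13] = 0.10       # denser "organ" in the middle
    dose = toy_photon_dose(phantom, n_photons=5000)
    print("fraction of energy absorbed:", dose.sum() / 5000.0)
```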

  12. Development of a computer design system for HVAC

    International Nuclear Information System (INIS)

    Miyazaki, Y.; Yotsuya, M.; Hasegawa, M.

    1993-01-01

    The development of a computer design system for HVAC (Heating, Ventilating and Air Conditioning) systems is presented in this paper. It supports the air-conditioning design for a nuclear power plant and a reprocessing plant. This system integrates various computer design systems which were developed separately for the various design phases of HVAC. The purposes include centralizing the HVAC data, optimizing design, and reducing design time. The centralized HVAC data are managed by a DBMS (Data Base Management System). The DBMS separates the computer design system into a calculation module and the data, so the design system can be expanded easily in the future. 2 figs

  13. The role of computer simulation in nuclear technologies development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.

    2001-01-01

    In the report the role and purposes of computer simulation in nuclear technology development are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of operating nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and the design and development of simulators for operating personnel training. For these applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models; simulation of isotope composition change and damage dose accumulation in materials under irradiation; and simulation of reactor control structures. (authors)

  14. The role of computer simulation in nuclear technology development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, VV.; Ryazanov, D.K.; Tellin, A.I.

    2000-01-01

    In the report, the role and purpose of computer simulation in nuclear technology development is discussed. The authors consider such applications of computer simulation as: (a) nuclear safety research; (b) optimization of technical and economic parameters of operating nuclear plants; (c) planning and support of reactor experiments; (d) research and design of new devices and technologies; (e) design and development of simulators for operating personnel training. Among the marked applications, the following aspects of computer simulation are discussed in the report: (f) neutron-physical, thermal and hydrodynamic models; (g) simulation of isotope composition change and damage dose accumulation in materials under irradiation; (h) simulation of reactor control structures. (authors)

  15. PHYSICAL EDUCATION AND INDIVIDUAL CHARACTERISTICS OF THE AGE SPECIFIC DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Е. М. Revenko

    2017-01-01

    Full Text Available The aim of this paper is the scientific substantiation of the importance of the individual characteristics of age-specific youth development, which allows more rational modelling of students' physical education. Methodology and research methods. The methods involve the collection of experimental data by means of evaluation of the motor abilities and general intelligence of students. Motor abilities were studied by measuring strength (dead-lift dynamometry), strength endurance (pull-ups), speed and power abilities (standing jump), speed ability (running 30, 60 or 100 m, depending on age) and aerobic endurance (running 1000 or 3000 m, depending on age). The dynamics of integral physical preparedness (DIPP) of each student was calculated as the arithmetic mean of the growth rates of the development of motor abilities. Assessment of general intelligence (GI) of the 8th-, 10th- and 11th-grade school pupils as well as the 1st- to 3rd-year students was carried out through the test of R. Amthauer in the adaptation of L. A. Yazykova, and school pupils of the 6th grade were assessed through the Intelligence Test (GIT). Results. Discrepancies in the dynamics of the development of the mental and motor spheres of the maturing personality, which are interpreted as individual characteristics of age-specific development, are experimentally revealed. Individual psychological differences leading to different susceptibility to the development of motor and intellectual abilities, appearing in adolescence and early adolescence, are analysed. The leading role of activity in the formation of the individual characteristics of age-specific development is substantiated. It is concluded that, in the course of physical training, students differing in individual characteristics of age-specific development should be given requirements and motor tasks differentiated in complexity. Scientific novelty. For the first time

  16. Specifics of psychomotor development in group of congenital blind children

    Directory of Open Access Journals (Sweden)

    Zbyněk Janečka

    2011-03-01

    Full Text Available The ontogenesis of psychomotor development in a group of congenitally blind children has its own specifics. The visual defect is influenced by many factors. In the period from birth to two years of age, significant changes occur in children's cognitive, psychomotor and social development. Compared with the normally sighted population, the development of congenitally blind children proceeds more slowly in all these areas. Visual deprivation also influences the development of body posture. More important is whether development proceeds in stages that correspond to the development of a normally sighted child. If development proceeds in the right direction, the temporal aspect of this criterion is rather orientational. For blind children it is also important to strengthen the ability to correctly identify their own body through somatognosy. Stereognosy in turn determines the degree of contact with the outer world and orientation in it in relation to the body schema.

  17. COMPUTER MODELING IN THE DEVELOPMENT OF ARTIFICIAL VENTRICLES OF HEART

    Directory of Open Access Journals (Sweden)

    L. V. Belyaev

    2011-01-01

    Full Text Available This article describes modern research into the development of artificial heart ventricles. The advantages of applying computer (CAD/CAE) technologies to the development of artificial heart ventricles are shown. The systems developed with these technologies are presented.

  18. Use of Bloom's Taxonomy in Developing Reading Comprehension Specifications

    Science.gov (United States)

    Luebke, Stephen; Lorie, James

    2013-01-01

    This article is a brief account of the use of Bloom's Taxonomy of Educational Objectives (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956) by staff of the Law School Admission Council in the 1990 development of redesigned specifications for the Reading Comprehension section of the Law School Admission Test. Summary item statistics for the…

  19. Sex-specific asymmetry in eye development in interspecific hybrids ...

    Indian Academy of Sciences (India)

    Sex-specific asymmetry in eye development in interspecific hybrids in the Drosophila bipectinata species complex. Bashisth N. Singh, Parul Banerjee. Research Note, Volume 94, Issue 3, September 2015, pp. 493-495 ...

  20. Teachers' Acceptance of Absenteeism: Towards Developing a Specific Scale

    Science.gov (United States)

    Shapira-Lishchinsky, Orly; Ishan, Gamal

    2013-01-01

    Purpose: This study aims to develop and validate a measure of a specific attitude toward teachers' absenteeism that predicts this behavior more accurately than other general measures of job attitudes. Design/methodology/approach: Participants were 443 teachers from 21 secondary schools in Israel. In the first phase, the teachers answered anonymous…

  1. Evidence-based guidelines for the wise use of computers by children: physical development guidelines.

    Science.gov (United States)

    Straker, L; Maslen, B; Burgess-Limerick, R; Johnson, P; Dennerlein, J

    2010-04-01

    Computer use by children is common and there is concern over the potential impact of this exposure on child physical development. Recently principles for child-specific evidence-based guidelines for wise use of computers have been published and these included one concerning the facilitation of appropriate physical development. This paper reviews the evidence and presents detailed guidelines for this principle. The guidelines include encouraging a mix of sedentary and whole body movement tasks, encouraging reasonable postures during computing tasks through workstation, chair, desk, display and input device selection and adjustment and special issues regarding notebook computer use and carriage, computing skills and responding to discomfort. The evidence limitations highlight opportunities for future research. The guidelines themselves can inform parents and teachers, equipment designers and suppliers and form the basis of content for teaching children the wise use of computers. STATEMENT OF RELEVANCE: Many children use computers and computer-use habits formed in childhood may track into adulthood. Therefore child-computer interaction needs to be carefully managed. These guidelines inform those responsible for children to assist in the wise use of computers.

  2. Refining VHDL Specifications Through Conformance Testing: Case Study of an Adaptive Computing Architecture

    National Research Council Canada - National Science Library

    Duale, Ali

    1999-01-01

    .... Such an integration will allow for the removal of costly mistakes from a specification at an early stage of the development process before they propagate into different implementations, possibly...

  3. Development of Onboard Computer Complex for Russian Segment of ISS

    Science.gov (United States)

    Branets, V.; Brand, G.; Vlasov, R.; Graf, I.; Clubb, J.; Mikrin, E.; Samitov, R.

    1998-01-01

    This report presents a description of the Onboard Computer Complex (CC) developed during the period 1994-1998 for the Russian Segment of the ISS. The system was developed in co-operation with NASA and ESA. ESA developed a new computation system under the RSC Energia Technical Assignment, called DMS-R. The CC also includes elements developed by Russian experts and organizations. A general architecture of the computer system and the characteristics of the primary elements of this system are described. The system was integrated at RSC Energia with the participation of American and European specialists. The report contains information on the software simulators and the verification and debugging facilities which were developed for both stand-alone and integrated tests and verification. This CC serves as the basis for the Russian Segment Onboard Control Complex on the ISS.

  4. Computer Aided Design System for Developing Musical Fountain Programs

    Institute of Scientific and Technical Information of China (English)

    刘丹; 张乃尧; 朱汉城

    2003-01-01

    A computer-aided design system for developing musical fountain programs was developed with multiple functions, such as intelligent design, 3-D animation, manual modification and synchronized motion, to make the development process more efficient. The system first analyzes the musical form and sentiment, using many basic features of the music, to select a basic fountain program. This program is then simulated with 3-D animation and modified manually to achieve the desired results. Finally, the program is transformed into a computer control program to control the musical fountain in time with the music. A prototype system for the musical fountain was also developed. It was tested with many styles of music, and users were quite satisfied with its performance. By integrating various functions, the proposed computer-aided design system for developing musical fountain programs greatly simplifies the design of musical fountain programs.

  5. A Methodology For The Development Of Complex Domain Specific Languages

    CERN Document Server

    Risoldi, Matteo; Falquet, Gilles

    2010-01-01

    The term Domain-Specific Modeling Language (DSML) is used in software development to indicate a modeling (and sometimes programming) language dedicated to a particular problem domain, a particular problem representation technique and/or a particular solution technique. The concept is not new -- special-purpose programming languages and all kinds of modeling/specification languages have always existed, but the term DSML has become more popular due to the rise of domain-specific modeling. Domain-specific languages are considered 4GL programming languages. Domain-specific modeling techniques have been adopted for a number of years now. However, the techniques and frameworks used still suffer from problems of complexity of use and fragmentation. Although some integrated environments have recently seen the light, it is not common to see many concrete use cases in which domain-specific modeling has been put to use. The main goal of this thesis is tackling the domain of interactive systems and applying a DSML-based...

  6. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics over a network connecting a parallel computing server and a client terminal was developed. Using the system, a user at a client terminal can visualize the results of a CFD (Computational Fluid Dynamics) simulation while the actual computation is running on the parallel-computer server. Using a GUI (Graphical User Interface) on the client terminal, the user is also able to change parameters of the analysis and visualization during the calculation in real time. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer and compresses the data. The amount of data sent from the parallel computer to the client is therefore so much smaller than without compression that the user can comfortably enjoy swift image display. Parallelization of image data generation is based on the Owner Computation Rule. The GUI on the client is built on a Java applet, so real-time visualization is possible on the client PC whenever a Web browser is installed on it. (author)
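
    A hedged sketch of the server side of such a pipeline follows: compute a pixel image per simulation step, compress it, and push it to the client with length-prefixed framing. Plain TCP and zlib stand in here for the paper's parallel renderer and Java-applet client; the port, frame size and "simulation" are illustrative.

```python
import socket
import struct
import zlib
import numpy as np

def serve_frames(host="0.0.0.0", port=9999, shape=(480, 640)):
    """Generate one pixel image per step, compress it, and push it to a
    connected client. Each message is a 4-byte big-endian length followed
    by the zlib-compressed frame bytes, so the client knows how much to read."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)
    conn, _ = srv.accept()
    t = 0.0
    try:
        while True:
            # Stand-in for "CFD step + pixel image generation" on the server.
            y, x = np.mgrid[0:shape[0], 0:shape[1]]
            frame = (127 * (1 + np.sin(0.05 * x + t))).astype(np.uint8)
            payload = zlib.compress(frame.tobytes(), level=6)
            conn.sendall(struct.pack("!I", len(payload)) + payload)
            t += 0.1
    finally:
        conn.close()
        srv.close()

if __name__ == "__main__":
    serve_frames()
```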

  7. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  8. Sector specific features of innovative development in the Russian economy

    Directory of Open Access Journals (Sweden)

    Krasyuk I.A.

    2017-01-01

    Full Text Available The present-day development of the Russian Federation is mostly shaped by the innovative development capacity of the national economy, and industry forms the core of such development. The paper presents a study of the current state of innovative development in industry, the source of the basic innovations, and identifies the limiting factors. The authors have found that the low innovative activity of industrial enterprises holds back the implementation of innovations in related sectors, for instance, trade. It has also been found that the course of innovative development in the field of trade is shaped by sector-specific factors, which should be considered when adjusting the innovation policy of a trading enterprise.

  9. Development of technology performance specifications for volatile organic compounds

    International Nuclear Information System (INIS)

    Purdy, C.; Schutte, W.E.

    1993-01-01

    The Office of Technology Development (OTD) within the Office of Environmental Restoration and Waste Management of the Department of Energy has a mission to deliver needed and usable technologies to its customers. The primary customers are individuals and organizations performing environmental characterization and remediation, waste cleanup, and pollution prevention at DOE sites. DOE faces a monumental task in cleaning up the dozen or so major sites and hundreds of smaller sites that were or are used to produce the US nuclear weapons arsenal and to develop nuclear technologies for national defense and for peaceful purposes. Contaminants and waste materials include the radionuclides associated with nuclear weapons, such as plutonium and tritium, and more common pollutants and wastes of industrial activity such as chromium, chlorinated solvents, and polychlorinated biphenyls (PCBs). Quite frequently hazardous wastes regulated by the Environmental Protection Agency are co-mingled with radioactive wastes regulated by the Nuclear Regulatory Commission to yield a "mixed waste," which increases the cleanup challenges from several perspectives. To help OTD and its investigators meet DOE's cleanup goal, technology performance specifications are being implemented for research and development and DT&E projects. Technology performance specifications or "performance goals" describe, quantitatively where possible, the technology development needs being addressed. These specifications are used to establish milestones, evaluate the status of ongoing projects, and determine the success of completed projects

  10. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two units of personal computer were successfully networked together to form a small-scale cluster. Each of the processors involved is a multicore processor with four cores, giving the cluster eight processor cores in total. The cluster runs an Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers were able to pass the required information without any problem, using a simple MPI Hello program written in the C language. Additionally, a performance test was done to show that the cluster's calculation performance is much better than that of a single-CPU computer. In this performance test, four runs were made of the same code using a single node, 2 processors, 4 processors, and 8 processors. The results show that with additional processors, the time required to solve the problem decreases; the time required for the calculation is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small-scale cluster computer from common hardware that is capable of higher computing power than a single-CPU machine, which can benefit research requiring high computing power, especially numerical analyses such as finite element analysis, computational fluid dynamics, and computational physics.
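
    A rough mpi4py analogue of the communication and performance tests described is sketched below (the study's Hello program was written in C against MPICH2; this Python sketch assumes mpi4py is installed and the workload is illustrative).

```python
# Run with, e.g.:  mpiexec -n 8 python cluster_test.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Communication test: every rank reports in.
print(f"hello from rank {rank} of {size}")

# Performance test: time an embarrassingly parallel workload; wall time
# should roughly halve each time the number of ranks doubles.
comm.Barrier()
t0 = MPI.Wtime()
n = 4_000_000 // size
local = np.sum(np.sin(np.linspace(rank, rank + 1, n)))
total = comm.reduce(local, op=MPI.SUM, root=0)
comm.Barrier()
if rank == 0:
    print(f"sum={total:.4f}  elapsed={MPI.Wtime() - t0:.3f}s on {size} ranks")
```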

  11. Computer codes developed in FRG to analyse hypothetical meltdown accidents

    International Nuclear Information System (INIS)

    Hassmann, K.; Hosemann, J.P.; Koerber, H.; Reineke, H.

    1978-01-01

    It is the purpose of this paper to give the status of all significant computer codes developed in the core melt-down project which is incorporated in the light water reactor safety research program of the Federal Ministry of Research and Technology. For standard pressurized water reactors, results of some computer codes will be presented, describing the course and the duration of the hypothetical core meltdown accident. (author)

  12. Development of a computer writing system based on EOG

    OpenAIRE

    López, A.; Ferrero, F.; Yangüela, D.; Álvarez, C.; Postolache, O.

    2017-01-01

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical i...

  13. Development of a UNIX network compatible reactivity computer

    International Nuclear Information System (INIS)

    Sanchez, R.F.; Edwards, R.M.

    1996-01-01

    A state-of-the-art UNIX network compatible controller and UNIX host workstation with MATLAB/SIMULINK software were used to develop, implement, and validate a digital reactivity calculation. An objective of the development was to determine why the reactivity output of a Macintosh-based reactivity computer drifted intolerably.

  14. DEVELOPMENT OF COMPUTER AIDED DESIGN OF CHAIN COUPLING

    Directory of Open Access Journals (Sweden)

    Sergey Aleksandrovich Sergeev

    2015-12-01

    The present paper describes the development stages of computer-aided design of chain couplings. The first stage is the automation of traditional design techniques (intermediate automation). The second is integrated automation, involving the development of automated equipment and production technology, including systems based on flexible manufacturing (a high level of automation).

  15. Young Children's Computer Skills Development from Kindergarten to Third Grade

    Science.gov (United States)

    Sackes, Mesut; Trundle, Kathy Cabe; Bell, Randy L.

    2011-01-01

    This investigation explores young children's computer skills development from kindergarten to third grade using the Early Childhood Longitudinal Study-Kindergarten (ECLS-K) dataset. The sample size of the study was 8642 children. Latent growth curve modeling analysis was used as an analytical tool to examine the development of children's computer…

  16. Computer-Aided Template for Model Reuse, Development and Maintenance

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2014-01-01

    A template-based approach for model development is presented in this work. Based on a model decomposition technique, the computer-aided template concept has been developed. This concept is implemented as a software tool, which provides a user-friendly interface for following the workflow steps...

  17. Development of a solar-powered residential air conditioner: System optimization preliminary specification

    Science.gov (United States)

    Rousseau, J.; Hwang, K. C.

    1975-01-01

    Investigations aimed at the optimization of a baseline Rankine-cycle solar-powered air conditioner and the development of a preliminary system specification were conducted. Efforts encompassed the following: (1) investigations of the use of recuperators/regenerators to enhance the performance of the baseline system, (2) development of an off-design computer program for system performance prediction, (3) optimization of the turbocompressor design to cover a broad range of conditions and permit operation at low heat-source water temperatures, (4) generation of parametric data describing system performance (COP and capacity), (5) development and evaluation of candidate system augmentation concepts and selection of the optimum approach, (6) generation of auxiliary power requirement data, (7) development of a complete solar collector-thermal storage-air conditioner computer program, (8) evaluation of the baseline Rankine air conditioner over a five-day period simulating the NASA solar house operation, and (9) evaluation of the air conditioner as a heat pump.

  18. Development of integrated platform for computational material design

    Energy Technology Data Exchange (ETDEWEB)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato [Center for Computational Science and Engineering, Fuji Research Institute Corporation (Japan); Hideaki, Koike [Advance Soft Corporation (Japan)

    2003-07-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, designed for PSE in the Japanese national project Frontier Simulation Software for Industrial Science, supports the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is a new architecture called TASK FLOW, which integrates computational resources such as hardware and software on the network and supports complex, large-scale simulation. This concept is applied to computational material design and to the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large, complicated software and simulating complex, large-scale phenomena in computational science and engineering. A prototype has already been developed, and the validation and verification of the integrated platform are scheduled for 2003 using this prototype. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As further examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned.

  19. Development of integrated platform for computational material design

    International Nuclear Information System (INIS)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato; Hideaki, Koike

    2003-01-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, designed for PSE in the Japanese national project Frontier Simulation Software for Industrial Science, supports the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is a new architecture called TASK FLOW, which integrates computational resources such as hardware and software on the network and supports complex, large-scale simulation. This concept is applied to computational material design and to the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large, complicated software and simulating complex, large-scale phenomena in computational science and engineering. A prototype has already been developed, and the validation and verification of the integrated platform are scheduled for 2003 using this prototype. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As further examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned.

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools that help distributed sites monitor job and data workflows, and Analysis Support, which collects user experience and feedback during analysis activities and develops tools to increase efficiency. The development plan for DMWM for 2009/2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  1. Development of Organ-Specific Donor Risk Indices

    OpenAIRE

    Akkina, Sanjeev K.; Asrani, Sumeet K.; Peng, Yi; Stock, Peter; Kim, Ray; Israni, Ajay K.

    2012-01-01

    Due to the shortage of deceased donor organs, transplant centers accept organs from marginal deceased donors, including older donors. Organ-specific donor risk indices have been developed to predict graft survival using various combinations of donor and recipient characteristics. We will review the kidney donor risk index (KDRI) and liver donor risk index (LDRI) and compare and contrast their strengths, limitations, and potential uses. The Kidney Donor Risk Index has a potential role in devel...

  2. Age Specifics of Cognitive Activity Development in Preschool Age

    OpenAIRE

    Klopotova E.E.; Samkova I.A.

    2017-01-01

    This paper presents the results of research on the specifics of cognitive activity development in preschool children. The hypothesis tested was that the content and dynamic components of cognitive activity reveal themselves in different ways depending on the stage of preschool childhood. The authors reviewed the diagnostic tools suitable for studying cognitive activity in preschoolers and selected the techniques. The research proved that the content and dynamic components of cognitive activity have t...

  3. Coordination by Using Product Specifications in Product Development

    DEFF Research Database (Denmark)

    Terkelsen, Søren Bendix

    1997-01-01

    This paper is based on a case study. It examines coordination through the generation of product specifications in product development. The paper discusses three very important aspects that create a need for coordination and call attention to coordination mechanisms: task uncertainty, task complexity, and dependencies between activities. If one wants to select coordination mechanisms that improve performance in product development, it is very important to have knowledge about these three aspects. In the following, the aspects are identified in the literature...

  4. Assessment of CT dose to the fetus and pregnant female patient using patient-specific computational models

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Tianwu; Poletti, Pierre-Alexandre; Platon, Alexandra; Becker, Christoph D. [Geneva University Hospital, Department of Medical Imaging and Information Sciences, Geneva (Switzerland); Zaidi, Habib [Geneva University Hospital, Department of Medical Imaging and Information Sciences, Geneva (Switzerland); Geneva University, Geneva Neuroscience Center, Geneva (Switzerland); University Medical Center Groningen, Department of Nuclear Medicine and Molecular Imaging, University of Groningen, Groningen (Netherlands); University of Southern Denmark, Department of Nuclear Medicine, Odense (Denmark); Geneva University Hospital, Division of Nuclear Medicine and Molecular Imaging, Geneva (Switzerland)

    2018-03-15

    This work provides detailed estimates of the foetal dose from diagnostic CT imaging of pregnant patients to enable the assessment of the diagnostic benefits considering the associated radiation risks. To produce realistic biological and physical representations of pregnant patients and the embedded foetus, we developed a methodology for construction of patient-specific voxel-based computational phantoms based on existing standardised hybrid computational pregnant female phantoms. We estimated the maternal absorbed dose and foetal organ dose for 30 pregnant patients referred to the emergency unit of Geneva University Hospital for abdominal CT scans. The effective dose to the mother varied from 1.1 mSv to 2.0 mSv with an average of 1.6 mSv, while commercial dose-tracking software reported an average effective dose of 1.9 mSv (range 1.7-2.3 mSv). The foetal dose normalised to CTDIvol varies between 0.85 and 1.63 with an average of 1.17. The methodology for construction of personalised computational models can be exploited to estimate the patient-specific radiation dose from CT imaging procedures. Likewise, the dosimetric data can be used for assessment of the radiation risks to pregnant patients and the foetus from various CT scanning protocols, thus guiding the decision-making process. (orig.)

  5. Specific features of physical development in extremely premature infants

    Directory of Open Access Journals (Sweden)

    G. A. Alyamovskaya

    2015-01-01

    The literature review deals with the specific features of physical development in extremely premature infants weighing less than 1500 g at birth. It describes the regularities of increment in the basic physical development parameters (weight, height, and head circumference) within the first year of life. Genetic factors, the specific features of the neonatal period, comorbidity, and different feeding types are shown to affect the increment rates of the physical development parameters. Emphasis is placed on the early initiation of enteral feeding and on the long-term use of fortified foods in low-birthweight premature babies for the correction of the energy deficiency resulting from preterm birth. The review shows that there is a relationship between the long-term outcomes of physical and psychomotor development in low-birthweight premature babies.

  6. Seismic Safety Margins Research Program (Phase I). Project VII. Systems analysis specification of computational approach

    International Nuclear Information System (INIS)

    Wall, I.B.; Kaul, M.K.; Post, R.I.; Tagart, S.W. Jr.; Vinson, T.J.

    1979-02-01

    An initial specification is presented of a computational approach for a probabilistic risk assessment model for use in the Seismic Safety Margins Research Program. This model encompasses the entire seismic calculational chain, from seismic input through soil-structure interaction and transfer functions to the probability of component failure, and integrates these failures into a system model to estimate the probability of a release of radioactive material to the environment. It is intended that the primary use of this model will be in sensitivity studies to assess the potential conservatism of different modeling elements in the chain and to provide guidance on priorities for research in the seismic design of nuclear power plants.

  7. Design of Deinococcus radiodurans thioredoxin reductase with altered thioredoxin specificity using computational alanine mutagenesis

    OpenAIRE

    Obiero, Josiah; Sanders, David AR

    2011-01-01

    In this study, the X-ray crystal structure of the complex between Escherichia coli thioredoxin reductase (EC TrxR) and its substrate thioredoxin (Trx) was used as a guide to design a Deinococcus radiodurans TrxR (DR TrxR) mutant with altered Trx specificity. Previous studies have shown that TrxRs have a higher affinity for cognate Trxs (from the same species) than for Trxs from different species. Computational alanine scanning mutagenesis and visual inspection of the EC TrxR–Trx interface suggested...

  8. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    International Nuclear Information System (INIS)

    Glasscock, J.A.

    1995-01-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies

  9. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost-effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20-MFlops (peak), 10-MByte single-board computer. These are plugged into a 16-port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256-node, 5-GFlop system is under construction. 10 refs., 7 figs

  10. ANS main control complex three-dimensional computer model development

    International Nuclear Information System (INIS)

    Cleaves, J.E.; Fletcher, W.M.

    1993-01-01

    A three-dimensional (3-D) computer model of the Advanced Neutron Source (ANS) main control complex is being developed. The main control complex includes the main control room, the technical support center, the materials irradiation control room, computer equipment rooms, communications equipment rooms, cable-spreading rooms, and some support offices and breakroom facilities. The model will be used to provide facility designers and operations personnel with capabilities for fit-up/interference analysis, visual 'walk-throughs' for optimizing maintainability, and human factors and operability analyses. It will be used to determine performance design characteristics, to generate construction drawings, and to integrate control room layout, equipment mounting, grounding equipment, electrical cabling, and utility services into ANS building designs. This paper describes the development of the initial phase of the 3-D computer model for the ANS main control complex and plans for its development and use.

  11. Computer-assisted Particle-in-Cell code development

    International Nuclear Information System (INIS)

    Kawata, S.; Boonmee, C.; Teramoto, T.; Drska, L.; Limpouch, J.; Liska, R.; Sinor, M.

    1997-12-01

    This report presents a new approach to electromagnetic Particle-in-Cell (PIC) code development by computer: PIC codes generally share a common structure, consisting of a particle pusher, a field solver, charge and current density collection, and field interpolation. Because of this common structure, the main part of a PIC code can be generated mechanically on a computer. In this report we use the packages FIDE and GENTRAN of the REDUCE computer algebra system for the discretization of the field equations and the particle equation, and for the automatic generation of Fortran code. The proposed approach is successfully applied to the development of a 1.5-dimensional PIC code. Using the generated PIC code, the Weibel instability in a plasma is simulated. The obtained growth rate agrees well with the theoretical value. (author)
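
    The common PIC structure named in this record can be illustrated with a minimal particle pusher. The following C sketch shows a generic one-dimensional electrostatic leapfrog push; it is illustrative only and does not reproduce the REDUCE-generated code of the report.

```c
/* Illustrative 1-D electrostatic leapfrog particle push: velocity and
 * position are staggered by half a time step. E[i] is the field already
 * interpolated to particle i's position; qm is the charge-to-mass ratio
 * and dt the time step.
 */
void push_particles(int np, double *x, double *v,
                    const double *E, double qm, double dt)
{
    for (int i = 0; i < np; i++) {
        v[i] += qm * E[i] * dt;   /* v(t - dt/2) -> v(t + dt/2) */
        x[i] += v[i] * dt;        /* x(t)        -> x(t + dt)   */
    }
}
```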

  12. Development of a system of computer codes for severe accident analyses and its applications

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Soon Hong; Cheon, Moon Heon; Cho, Nam jin; No, Hui Cheon; Chang, Hyeon Seop; Moon, Sang Kee; Park, Seok Jeong; Chung, Jee Hwan [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1991-12-15

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. Such a system of codes is necessary to conduct individual plant examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, extract the plant-specific vulnerabilities to severe accidents, and at the same time identify ideas for enhancing overall accident resistance. The scope and contents of this study are as follows: development of a system of computer codes for severe accident analyses, and development of a severe accident management strategy.

  13. Development of a system of computer codes for severe accident analyses and its applications

    International Nuclear Information System (INIS)

    Chang, Soon Hong; Cheon, Moon Heon; Cho, Nam jin; No, Hui Cheon; Chang, Hyeon Seop; Moon, Sang Kee; Park, Seok Jeong; Chung, Jee Hwan

    1991-12-01

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. Such a system of codes is necessary to conduct individual plant examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, extract the plant-specific vulnerabilities to severe accidents, and at the same time identify ideas for enhancing overall accident resistance. The scope and contents of this study are as follows: development of a system of computer codes for severe accident analyses, and development of a severe accident management strategy.

  14. Computational model design specification for Phase 1 of the Hanford Environmental Dose Reconstruction Project

    International Nuclear Information System (INIS)

    Napier, B.A.

    1991-07-01

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions from nuclear operations at Hanford since their inception in 1944. The purpose of this report is to outline the basic algorithm and the necessary computer calculations to be used to calculate radiation doses to specific and hypothetical individuals in the vicinity of Hanford. The system design requirements, those things that must be accomplished, are defined. The system design specifications, the techniques by which those requirements are met, are outlined. Included are the basic equations, logic diagrams, and a preliminary definition of the nature of each input distribution. 4 refs., 10 figs., 9 tabs

  15. Computational model design specification for Phase 1 of the Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.

    1991-07-01

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions from nuclear operations at Hanford since their inception in 1944. The purpose of this report is to outline the basic algorithm and the necessary computer calculations to be used to calculate radiation doses to specific and hypothetical individuals in the vicinity of Hanford. The system design requirements, those things that must be accomplished, are defined. The system design specifications, the techniques by which those requirements are met, are outlined. Included are the basic equations, logic diagrams, and a preliminary definition of the nature of each input distribution. 4 refs., 10 figs., 9 tabs.

  16. Software requirements specification document for the AREST code development

    International Nuclear Information System (INIS)

    Engel, D.W.; McGrail, B.P.; Whitney, P.D.; Gray, W.J.; Williford, R.E.; White, M.D.; Eslinger, P.W.; Altenhofen, M.K.

    1993-11-01

    The Analysis of the Repository Source Term (AREST) computer code was selected in 1992 by the U.S. Department of Energy. The AREST code will be used to analyze the performance of an underground high level nuclear waste repository. The AREST code is being modified by the Pacific Northwest Laboratory (PNL) in order to evaluate the engineered barrier and waste package designs, model regulatory compliance, analyze sensitivities, and support total systems performance assessment modeling. The current version of the AREST code was developed to be a very useful tool for analyzing model uncertainties and sensitivities to input parameters. The code has also been used successfully in supplying source-terms that were used in a total systems performance assessment. The current version, however, has been found to be inadequate for the comparison and selection of a design for the waste package. This is due to the assumptions and simplifications made in the selection of the process and system models. Thus, the new version of the AREST code will be designed to focus on the details of the individual processes and implementation of more realistic models. This document describes the requirements of the new models that will be implemented. Included in this document is a section describing the near-field environmental conditions for this waste package modeling, description of the new process models that will be implemented, and a description of the computer requirements for the new version of the AREST code

  17. Development of a Computer Writing System Based on EOG.

    Science.gov (United States)

    López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian

    2017-06-26

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.

  18. Development of a Computer Writing System Based on EOG

    Directory of Open Access Journals (Sweden)

    Alberto López

    2017-06-01

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.
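
    As a rough illustration of subsystem (2) above, the C sketch below smooths an EOG trace with a moving average and classifies a horizontal eye movement from a thresholded first difference. The window length and threshold are hypothetical parameters; the authors' actual signal-processing chain is not reproduced here.

```c
/* Illustrative EOG pre-processing: smooth the horizontal channel with a
 * moving average and classify a saccade from a thresholded first
 * difference. WIN and thresh are hypothetical parameters.
 */
#define WIN 8

typedef enum { EOG_NONE, EOG_LEFT, EOG_RIGHT } eog_event;

eog_event classify_eog(const double *x, int n, double thresh)
{
    double prev = 0.0;
    for (int i = WIN - 1; i < n; i++) {
        double avg = 0.0;
        for (int k = 0; k < WIN; k++)   /* moving-average filter */
            avg += x[i - k];
        avg /= WIN;
        if (i > WIN - 1) {              /* skip the first smoothed sample */
            double diff = avg - prev;   /* first difference */
            if (diff >  thresh) return EOG_RIGHT;
            if (diff < -thresh) return EOG_LEFT;
        }
        prev = avg;
    }
    return EOG_NONE;
}
```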

  19. Methods for the development of large computer codes under LTSS

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1977-06-01

    TRAC is a large computer code being developed by Group Q-6 for the analysis of the transient thermal hydraulic behavior of light-water nuclear reactors. A system designed to assist the development of TRAC is described. The system consists of a central HYDRA dataset, R6LIB, containing files used in the development of TRAC, and a file maintenance program, HORSE, which facilitates the use of this dataset

  20. Bioconductor: open software development for computational biology and bioinformatics

    DEFF Research Database (Denmark)

    Gentleman, R.C.; Carey, V.J.; Bates, D.M.

    2004-01-01

    The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. The goals of the project include: fostering collaborative development and widespread use of innovative software, reducing barriers to entry into interdisciplinary scientific research, and promoting the achievement of remote reproducibility of research results. We describe details of our aims and methods, identify current challenges, compare Bioconductor to other open bioinformatics projects, and provide working examples.

  1. Correction of computed tomography motion artifacts using pixel-specific back-projection

    International Nuclear Information System (INIS)

    Ritchie, C.J.; Crawford, C.R.; Godwin, J.D.; Kim, Y.; King, K.F.

    1996-01-01

    Cardiac and respiratory motion can cause artifacts in computed tomography scans of the chest. The authors describe a new method for reducing these artifacts called pixel-specific back-projection (PSBP). PSBP reduces artifacts caused by in-plane motion by reconstructing each pixel in a frame of reference that moves with the in-plane motion in the volume being scanned. The motion of the frame of reference is specified by constructing maps that describe the motion of each pixel in the image at the time each projection was measured; these maps are based on measurements of the in-plane motion. PSBP has been tested in computer simulations and with volunteer data. In computer simulations, PSBP removed the structured artifacts caused by motion. In scans of two volunteers, PSBP reduced doubling and streaking in chest scans to a level that made the images clinically useful. PSBP corrections of liver scans were less satisfactory because the motion of the liver is predominantly superior-inferior (S-I). PSBP uses a unique set of motion parameters to describe the motion at each point in the chest as opposed to requiring that the motion be described by a single set of parameters. Therefore, PSBP may be more useful in correcting clinical scans than are other correction techniques previously described
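
    A minimal sketch of the PSBP idea might look as follows, assuming a parallel-beam geometry, nearest-neighbour interpolation, and precomputed per-view motion maps; all of these are simplifications for illustration and are not taken from the paper.

```c
/* Sketch of pixel-specific back-projection.
 * filt[v]: filtered projection for view v (nbins detector bins, pitch ds).
 * dx[v][p], dy[v][p]: hypothetical motion maps giving the in-plane
 * displacement of pixel p at the instant view v was acquired.
 */
#include <math.h>

#define PI 3.14159265358979323846

void psbp(int npix, int nviews, int nbins, double ds,
          const double *px, const double *py,
          double **filt, double **dx, double **dy,
          double *image)
{
    for (int p = 0; p < npix; p++) {
        double sum = 0.0;
        for (int v = 0; v < nviews; v++) {
            double th = PI * v / nviews;
            /* back-project from the displaced pixel position, i.e. in a
             * frame of reference that moves with the anatomy */
            double xs = px[p] + dx[v][p];
            double ys = py[p] + dy[v][p];
            double s = xs * cos(th) + ys * sin(th);   /* detector coord */
            int bin = (int)floor(s / ds + 0.5) + nbins / 2;
            if (bin >= 0 && bin < nbins)
                sum += filt[v][bin];
        }
        image[p] = sum * PI / nviews;   /* views span 180 degrees */
    }
}
```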

  2. Development of a lion-specific interferon-gamma assay.

    Science.gov (United States)

    Maas, M; van Kooten, P J S; Schreuder, J; Morar, D; Tijhaar, E; Michel, A L; Rutten, V P M G

    2012-10-15

    The ongoing spread of bovine tuberculosis (BTB) in African free-ranging lion populations, for example in the Kruger National Park, raises the need for diagnostic assays for BTB in lions. These, in addition, would be highly relevant for zoological gardens worldwide that want to determine the BTB status of their lions, e.g. for translocations. The present study concerns the development of a lion-specific IFN-γ assay, following the production and characterization of monoclonal antibodies specific for lion interferon-gamma (IFN-γ). Recombinant lion IFN-γ (rLIFN-γ) was produced in mammalian cells and used to immunize mice to establish hybridoma cell lines producing monoclonal antibodies. These were used to develop a sensitive, lion IFN-γ-specific capture ELISA, able to detect rLIFN-γ to the level of 160 pg/ml. Recognition of native lion IFN-γ was shown in an initial assessment of supernatants of mitogen stimulated whole blood cultures of 11 known BTB-negative lions. In conclusion, the capture ELISA shows potential as a diagnostic assay for bovine tuberculosis in lions. Preliminary results also indicate the possible use of the test for other (feline) species. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Development of the computer network of IFIN-HH

    International Nuclear Information System (INIS)

    Danet, A.; Mirica, M.; Constantinescu, S.

    1998-01-01

    The general computer network of Horia Hulubei National Institute for Physics and Nuclear Engineering (IFIN-HH), as part of RNC (the Romanian National Computer Network for scientific research and technological development), offers the Romanian physics research community an efficient and cost-effective infrastructure to communicate and collaborate with fellow researchers abroad, and to collect and exchange the most up-to-date information in their research areas. RNC is the national project, co-ordinated and established by the Ministry of Research and Technology, targeting the following main objectives: setting up a technical and organizational infrastructure meant to provide national and international electronic services for the Romanian scientific research community; providing a rapid and competitive tool for the exchange of information within the R-D community; using the scientific and technical databases available in the country and those offered by the national networks of other countries through international networks; and providing support for information, documentation, and scientific and technical co-operation. The guiding principle in elaborating the project of the general computer network of IFIN-HH was to implement an open system based on OSI standards, without technical barriers to communication between different communities using different computing hardware and software. The major objectives achieved in 1997 in developing the general computer network of IFIN-HH (over 250 computers connected) were: connecting all the existing and newly installed computer equipment and providing adequate connectivity; providing the usual Internet services (e-mail, ftp, telnet, finger, gopher); providing access to World Wide Web resources; providing on-line statistics of the IP traffic (input and output) of each node of the domain computer network; and improving the performance of the connection with the central RNC node. (authors)

  4. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing, Software and Analysis challenge (CSA06) last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  5. Locally specific measures for employment aimed at regional development

    Directory of Open Access Journals (Sweden)

    Vladimir Cini

    2013-12-01

    The oldest and largest sub-region in the world functioning on the principle of economic union is the European Union. The creation of a single market has initiated the process of conditional adjustment of markets in the EU member states, which has a significant impact on the social welfare of its citizens. It is necessary to tackle this issue by joint efforts within the European Union. As globalization processes push for economic integration and the development of competitive advantage, the regions will have to make some challenging adjustments. Development tends to concentrate in highly competitive regions, while regions in the periphery lag behind. However, this pertains not only to economic lag, but also to a potentially negative political situation. Locally specific active employment policy measures are a continuation of the effort to make these measures more flexible. They refer to the Joint Assessment of Employment Policy Priorities and the IPA Human Resources Development Operational Programme - a regional policy instrument of the European Union. Both documents highlight the issue of disproportional development of regions, which requires special local measures and active labour market policy programmes. To reduce regional differences in development, it is necessary to invest more resources in the regions that lag behind. In this particular case, this means the counties in Croatia with high unemployment rates, a large number of registered unemployed persons, and a low employment rate. Consequently, this paper explains the importance of the adoption of locally specific measures for employment, which unfortunately have not taken hold in the Republic of Croatia, and highlights the need for further decentralization of public services, with the aim of balancing regional development.

  6. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  7. Development of a standard for computer program verification and control

    International Nuclear Information System (INIS)

    Dunn, T.E.; Ozer, O.

    1980-01-01

    It is expected that adherence to the guidelines of ANS 10.4 will: (1) provide confidence that the program conforms to its requirements specification; (2) provide confidence that the computer program has been adequately evaluated and tested; (3) provide confidence that program changes are adequately evaluated, tested, and controlled; and (4) enhance assurance that reliable data will be produced for engineering, scientific, and safety analysis purposes.

  8. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  9. Nuclear reactor pressure vessel-specific flaw distribution development

    International Nuclear Information System (INIS)

    Rosinski, S.T.

    1992-01-01

    Vessel integrity predictions performed through fracture mechanics analysis of a pressurized thermal shock event have been shown to be significantly sensitive to the overall flaw distribution input. It has also been shown that modern vessel in-service inspection (ISI) results can be used for development of vessel flaw distribution(s) that are more representative of US vessels. This paper describes the development and application of a methodology to analyze ISI data for the purpose of flaw distribution determination. The resultant methodology considers detection reliability, flaw sizing accuracy, and flaw detection threshold in its application. Application of the methodology was then demonstrated using four recently acquired US PWR vessel inspection data sets. Throughout the program, new insight was obtained into several key inspection performance and vessel integrity prediction practice issues that will impact future vessel integrity evaluation. For example, the potential application of a vessel-specific flaw distribution now provides at least one method by which a vessel-specific reference flaw size applicable to the determination of pressure-temperature limit curves can be estimated. This paper will discuss the development and application of the methodology and the impact on future vessel integrity analyses.

  10. Laboratory Works Designed for Developing Student Motivation in Computer Architecture

    Directory of Open Access Journals (Sweden)

    Petre Ogrutan

    2017-02-01

    In light of the current difficulties in maintaining students' interest and stimulating their motivation for learning, the authors have developed a range of new laboratory exercises intended for first-year students in Computer Science, as well as for engineering students who have completed at least one course in computers. The educational goal of the proposed laboratory exercises is to enhance the students' motivation and creative thinking by organizing a relaxed yet competitive learning environment. The authors have developed a device including LEDs and switches, which is connected to a computer. By using assembly language, commands can be issued to flash several LEDs and read the states of the switches. The effectiveness of this idea was confirmed by a statistical study.
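
    For illustration, an exercise of this kind could equally be driven from C instead of assembly. In the sketch below, the legacy parallel-port base address 0x378 and the register layout are assumptions, not the course's actual hardware mapping; it requires x86 Linux and root privileges for ioperm.

```c
/* Minimal LED/switch exercise in C via raw port I/O (x86 Linux).
 * PORT_BASE (0x378, a legacy parallel-port address) and PORT_IN are
 * hypothetical; the real lab device may be mapped elsewhere.
 */
#include <stdio.h>
#include <unistd.h>
#include <sys/io.h>

#define PORT_BASE 0x378   /* data register: drives the LEDs   */
#define PORT_IN   0x379   /* status register: reads the switches */

int main(void)
{
    if (ioperm(PORT_BASE, 3, 1) != 0) {   /* request port access */
        perror("ioperm");
        return 1;
    }
    for (int i = 0; i < 10; i++) {        /* flash all LEDs ten times */
        outb(0xFF, PORT_BASE);
        usleep(200000);
        outb(0x00, PORT_BASE);
        usleep(200000);
    }
    printf("switch state: 0x%02x\n", inb(PORT_IN));
    return 0;
}
```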

  11. Computational design, construction, and characterization of a set of specificity determining residues in protein-protein interactions.

    Science.gov (United States)

    Nagao, Chioko; Izako, Nozomi; Soga, Shinji; Khan, Samia Haseeb; Kawabata, Shigeki; Shirai, Hiroki; Mizuguchi, Kenji

    2012-10-01

    Proteins interact with different partners to perform different functions and it is important to elucidate the determinants of partner specificity in protein complex formation. Although methods for detecting specificity determining positions have been developed previously, direct experimental evidence for these amino acid residues is scarce, and the lack of information has prevented further computational studies. In this article, we constructed a dataset that is likely to exhibit specificity in protein complex formation, based on available crystal structures and several intuitive ideas about interaction profiles and functional subclasses. We then defined a "structure-based specificity determining position (sbSDP)" as a set of equivalent residues in a protein family showing a large variation in their interaction energy with different partners. We investigated sequence and structural features of sbSDPs and demonstrated that their amino acid propensities significantly differed from those of other interacting residues and that the importance of many of these residues for determining specificity had been verified experimentally. Copyright © 2012 Wiley Periodicals, Inc.
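
    The definition of an sbSDP given above suggests a simple screening computation: flag alignment positions whose interaction energy varies strongly across complexes with different partners. A minimal C sketch of that idea, with the variance cutoff left as a free parameter (not a value from the paper), might be:

```c
/* Screening sketch for structure-based specificity determining positions.
 * E is row-major: E[r * npart + p] is the interaction energy of position
 * r in the complex with partner p. Positions with high variance across
 * partners are reported as sbSDP candidates.
 */
#include <stdio.h>

void find_sbsdp(int nres, int npart, const double *E, double cutoff)
{
    for (int r = 0; r < nres; r++) {
        double mean = 0.0, var = 0.0;
        for (int p = 0; p < npart; p++)
            mean += E[r * npart + p];
        mean /= npart;
        for (int p = 0; p < npart; p++) {
            double d = E[r * npart + p] - mean;
            var += d * d;
        }
        var /= npart;
        if (var > cutoff)
            printf("position %d: candidate sbSDP (variance %.2f)\n", r, var);
    }
}
```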

  12. The effect of inlet waveforms on computational hemodynamics of patient-specific intracranial aneurysms.

    Science.gov (United States)

    Xiang, J; Siddiqui, A H; Meng, H

    2014-12-18

    Due to the lack of patient-specific inlet flow waveform measurements, most computational fluid dynamics (CFD) simulations of intracranial aneurysms usually employ waveforms that are not patient-specific as inlet boundary conditions for the computational model. The current study examined how this assumption affects the predicted hemodynamics in patient-specific aneurysm geometries. We examined wall shear stress (WSS) and oscillatory shear index (OSI), the two most widely studied hemodynamic quantities that have been shown to predict aneurysm rupture, as well as maximal WSS (MWSS), energy loss (EL) and pressure loss coefficient (PLc). Sixteen pulsatile CFD simulations were carried out on four typical saccular aneurysms using 4 different waveforms and an identical inflow rate as inlet boundary conditions. Our results demonstrated that under the same mean inflow rate, different waveforms produced almost identical WSS distributions and WSS magnitudes, similar OSI distributions but drastically different OSI magnitudes. The OSI magnitude is correlated with the pulsatility index of the waveform. Furthermore, there is a linear relationship between aneurysm-averaged OSI values calculated from one waveform and those calculated from another waveform. In addition, different waveforms produced similar MWSS, EL and PLc in each aneurysm. In conclusion, inlet waveform has minimal effects on WSS, OSI distribution, MWSS, EL and PLc and a strong effect on OSI magnitude, but aneurysm-averaged OSI from different waveforms has a strong linear correlation with each other across different aneurysms, indicating that for the same aneurysm cohort, different waveforms can consistently stratify (rank) OSI of aneurysms. Copyright © 2014 Elsevier Ltd. All rights reserved.
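
    For reference, the OSI studied in this record is conventionally defined as OSI = 0.5 * (1 - |time-averaged WSS vector| / time-average of |WSS|). The C sketch below evaluates this from uniformly sampled WSS components at one wall point; it is a generic illustration, not the authors' code.

```c
/* Oscillatory shear index at one wall point from a sampled wall-shear-
 * stress vector time series over one cardiac cycle:
 *   OSI = 0.5 * (1 - |mean WSS vector| / mean |WSS|)
 * tx, ty, tz hold the WSS components at nt uniform time samples.
 */
#include <math.h>

double osi(const double *tx, const double *ty, const double *tz, int nt)
{
    double sx = 0.0, sy = 0.0, sz = 0.0, smag = 0.0;
    for (int i = 0; i < nt; i++) {
        sx += tx[i];
        sy += ty[i];
        sz += tz[i];
        smag += sqrt(tx[i]*tx[i] + ty[i]*ty[i] + tz[i]*tz[i]);
    }
    double mag_of_mean = sqrt(sx*sx + sy*sy + sz*sz);
    return (smag > 0.0) ? 0.5 * (1.0 - mag_of_mean / smag) : 0.0;
}
```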

  13. SPECIFIC FEATURES OF DEVELOPMENT OF ORGANIC PRODUCTS MARKET IN UKRAINE

    Directory of Open Access Journals (Sweden)

    T. Kharchenko

    2013-08-01

    Full Text Available The article is dedicated to the development of new and improvement of existing theoretical and methodological basis of forming and developing the market of organic products, its correspondence to the present-day situation, determination of problems and ways of their solving, introduction in practical activity of Ukrainian enterprises. The main objective of the article is to determine the specific features of forming and developing organic products market in Ukraine, and the perspective directions of its development based on analysis and practice of functioning of such markets in the world. The environmentally sound products market in the world is being analyzed, some information on the countries with the most commodity turnover of organic products, structure of international market of organic products, volumes of sales of organic products in the European countries is provided. As a result of studying the modern trends of economic development the authors reach a conclusion on problems of standard introduction, investigate the European norms and requirements for organic products. The conducted research allows distinguishing the basic features of Ukrainian market of organic products: it quickly grows, which makes it especially appealing for the participants of market relations, however entry into this market requires considerable capital investments and is characterized by high risk; criteria for qualifying products as environmentally sound products are unstructured and unclear. The potential for growth of organic products market in Ukraine is examined.

  14. Caltech computer scientists develop FAST protocol to speed up Internet

    CERN Multimedia

    2003-01-01

    "Caltech computer scientists have developed a new data transfer protocol for the Internet fast enough to download a full-length DVD movie in less than five seconds. The protocol is called FAST, standing for Fast Active queue management Scalable Transmission Control Protocol" (1 page).

  15. Computer Game-based Learning: Applied Game Development Made Simpler

    NARCIS (Netherlands)

    Nyamsuren, Enkhbold

    2018-01-01

    The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish

  16. Factors Influencing Cloud-Computing Technology Adoption in Developing Countries

    Science.gov (United States)

    Hailu, Alemayehu

    2012-01-01

    Adoption of new technology has complicating components both from the selection, as well as decision-making criteria and process. Although new technology such as cloud computing provides great benefits especially to the developing countries, it has challenges that may complicate the selection decision and subsequent adoption process. This study…

  17. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, and dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  18. Recent development of computational resources for new antibiotics discovery

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Blin, Kai; Lee, Sang Yup

    2017-01-01

    Understanding a complex working mechanism of biosynthetic gene clusters (BGCs) encoding secondary metabolites is a key to discovery of new antibiotics. Computational resources continue to be developed in order to better process increasing volumes of genome and chemistry data, and thereby better...

  19. Recent developments and applications in mathematics and computer science

    International Nuclear Information System (INIS)

    Churchhouse, R.F.; Tahir Shah, K.; Zanella, P.

    1991-01-01

    The book contains 8 invited lectures and 4 short seminars presented at the College on Recent Developments and Applications in Mathematics and Computer Science held in Trieste from 7 May to 1 June 1990. A separate abstract was prepared for each paper. Refs, figs and tabs

  20. Feasibility of replacing patient specific cutouts with a computer-controlled electron multileaf collimator

    International Nuclear Information System (INIS)

    Eldib, Ahmed; Jin Lihui; Li Jinsheng; Ma, C-M Charlie

    2013-01-01

    A motorized electron multileaf collimator (eMLC) was developed as an add-on device to the Varian linac for delivery of advanced electron beam therapy. It has previously been shown that electron beams collimated by an eMLC have very similar penumbra to those collimated by applicators and cutouts. Thus, manufacturing patient specific cutouts would no longer be necessary, resulting in the reduction of time taken in the cutout fabrication process. Moreover, cutout construction involves handling of toxic materials and exposure to toxic fumes that are usually generated during the process, while the eMLC will be a pollution-free device. However, undulation of the isodose lines is expected due to the finite size of the eMLC. Hence, the provided planned target volume (PTV) shape will not exactly follow the beam's-eye-view of the PTV, but instead will make a stepped approximation to the PTV shape. This may be a problem when the field edge is close to a critical structure. Therefore, in this study the capability of the eMLC to achieve the same clinical outcome as an applicator/cutout combination was investigated based on real patient computed tomographies (CTs). An in-house Monte Carlo based treatment planning system was used for dose calculation using ten patient CTs. For each patient, two plans were generated; one with electron beams collimated using the applicator/cutout combination; and the other plan with beams collimated by the eMLC. Treatment plan quality was compared for each patient based on dose distribution and dose–volume histogram. In order to determine the optimal position of the leaves, the impact of the different leaf positioning strategies was investigated. All plans with both eMLC and cutouts were generated such that 100% of the target volume receives at least 90% of the prescribed dose. Then the percentage difference in dose between both delivery techniques was calculated for all the cases. The difference in the dose received by 10% of the volume of the

  1. Feasibility of replacing patient specific cutouts with a computer-controlled electron multileaf collimator

    Science.gov (United States)

    Eldib, Ahmed; Jin, Lihui; Li, Jinsheng; Ma, C.-M. Charlie

    2013-08-01

    A motorized electron multileaf collimator (eMLC) was developed as an add-on device to the Varian linac for delivery of advanced electron beam therapy. It has previously been shown that electron beams collimated by an eMLC have very similar penumbra to those collimated by applicators and cutouts. Thus, manufacturing patient specific cutouts would no longer be necessary, resulting in the reduction of time taken in the cutout fabrication process. Moreover, cutout construction involves handling of toxic materials and exposure to toxic fumes that are usually generated during the process, while the eMLC will be a pollution-free device. However, undulation of the isodose lines is expected due to the finite size of the eMLC. Hence, the provided planned target volume (PTV) shape will not exactly follow the beam's-eye-view of the PTV, but instead will make a stepped approximation to the PTV shape. This may be a problem when the field edge is close to a critical structure. Therefore, in this study the capability of the eMLC to achieve the same clinical outcome as an applicator/cutout combination was investigated based on real patient computed tomographies (CTs). An in-house Monte Carlo based treatment planning system was used for dose calculation using ten patient CTs. For each patient, two plans were generated; one with electron beams collimated using the applicator/cutout combination; and the other plan with beams collimated by the eMLC. Treatment plan quality was compared for each patient based on dose distribution and dose-volume histogram. In order to determine the optimal position of the leaves, the impact of the different leaf positioning strategies was investigated. All plans with both eMLC and cutouts were generated such that 100% of the target volume receives at least 90% of the prescribed dose. Then the percentage difference in dose between both delivery techniques was calculated for all the cases. The difference in the dose received by 10% of the volume of the

  2. Computer-Aided Discovery of Formal Specification Behavioral Requirements and Requirement to Implementation Mappings

    Science.gov (United States)

    2014-01-01

    ...the executable SRM is developed according to the specification and marketing documents. Hence, for example, the Vehicles, Car, and Truck classes in... transitions ternary relation: transitions ⊆ states × transitionIDs × states, such as <"Init", "Tr1", "stP"> • a conditions unary relation bound to...

  3. Development of an EGFRvIII specific recombinant antibody

    Directory of Open Access Journals (Sweden)

    Li Gordon

    2010-10-01

    Full Text Available Abstract Background EGF receptor variant III (EGFRvIII is the most common variant of the EGF receptor observed in human tumors. It results from the in frame deletion of exons 2-7 and the generation of a novel glycine residue at the junction of exons 1 and 8. This novel juxtaposition of amino acids within the extra-cellular domain of the EGF receptor creates a tumor specific and immunogenic epitope. EGFRvIII expression has been seen in many tumor types including glioblastoma multiforme (GBM, breast adenocarcinoma, non-small cell lung carcinoma, ovarian adenocarcinoma and prostate cancer, but has been rarely observed in normal tissue. Because this variant is tumor specific and highly immunogenic, it can be used for both a diagnostic marker as well as a target for immunotherapy. Unfortunately many of the monoclonal and polyclonal antibodies directed against EGFRvIII have cross reactivity to wild type EGFR or other non-specific proteins. Furthermore, a monoclonal antibody to EGFRvIII is not readily available to the scientific community. Results In this study, we have developed a recombinant antibody that is specific for EGFRvIII, has little cross reactivity for the wild type receptor, and which can be easily produced. We initially designed a recombinant antibody with two anti-EGFRvIII single chain Fv's linked together and a human IgG1 Fc component. To enhance the specificity of this antibody for EGFRvIII, we mutated tyrosine H59 of the CDRH2 domain and tyrosine H105 of the CDRH3 domain to phenylalanine for both the anti-EGFRvIII sequence inserts. This mutated recombinant antibody, called RAbDMvIII, specifically detects EGFRvIII expression in EGFRvIII expressing cell lines as well as in EGFRvIII expressing GBM primary tissue by western blot, immunohistochemistry (IHC and immunofluorescence (IF and FACS analysis. It does not recognize wild type EGFR in any of these assays. The affinity of this antibody for EGFRvIII peptide is 1.7 × 107 M-1 as

  4. Approximator: Predicting Interruptibility in Software Development with Commodity Computers

    DEFF Research Database (Denmark)

    Tell, Paolo; Jalaliniya, Shahram; Andersen, Kristian S. M.

    2015-01-01

    Assessing the presence and availability of a remote colleague is key to coordination in global software development but is not easily done using existing computer-mediated channels. Previous research has shown that automated estimation of interruptibility is feasible and can achieve a precision ... These early but promising results represent a starting point for designing tools with support for interruptibility, capable of improving distributed awareness and cooperation, to be used in global software development.

  5. Development of computer-aided auto-ranging technique for a computed radiography system

    International Nuclear Information System (INIS)

    Ishida, M.; Shimura, K.; Nakajima, N.; Kato, H.

    1988-01-01

    For a computed radiography system, the authors developed a computer-aided auto-ranging technique in which the clinically useful image data are automatically mapped to the available display range. The preread image data are inspected to determine the location of collimation. A histogram of the pixels inside the collimation is evaluated for characteristic values such as maxima and minima, and the optimal density and contrast are then derived for the display image. The effect of the auto-ranging technique was investigated at several hospitals in Japan. The average rate of films lost due to undesirable density or contrast was about 0.5%.
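
    The mapping step described here, taking characteristic values from the histogram of in-collimation pixels and rescaling to the display range, might be sketched as follows. The robust-percentile choice and all names are illustrative assumptions, not the published algorithm:

```python
import numpy as np

def auto_range(preread, collimation_mask, display_max=255):
    """Map the clinically useful pixel values inside the collimation
    onto the available display range."""
    inside = preread[collimation_mask]
    lo, hi = np.percentile(inside, [1, 99])   # robust min/max from the histogram
    scaled = (preread.astype(float) - lo) / max(hi - lo, 1e-9)
    return np.clip(scaled * display_max, 0, display_max).astype(np.uint8)

# Synthetic preread image with a central collimated field
img = np.random.default_rng(0).integers(800, 3000, (256, 256))
mask = np.zeros(img.shape, dtype=bool)
mask[32:224, 32:224] = True
display = auto_range(img, mask)
print(display.min(), display.max())
```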

  6. Radiological equipment analyzed by specific developed phantoms and software

    International Nuclear Information System (INIS)

    Soto, M.; Campayo, J. M.; Mayo, P.; Verdu, G.; Rodenas, F.

    2010-10-01

    The use of radiographic phantoms specifically designed to evaluate the operation of radiographic equipment allows the image quality obtained with this equipment to be studied in an objective way. In digital radiographic equipment, the analysis of image quality can be computerized because the image can be acquired with different technologies, namely computed radiography (phosphor plate) or direct radiography (detector). For film-screen equipment this analysis can be applied by digitizing the image with a professional scanner. In this work we present an application to automatically assess the constancy of image quality in the imaging chain of radiographic equipment. The application comprises purpose-designed radiographic phantoms, adapted to conventional and dental equipment, and software developed specifically for the automatic evaluation of phantom image quality. The software is based on digital image processing techniques that allow the automatic detection of the different phantom tests by edge detectors, morphological operators, histogram thresholding, etc. The developed utility is sufficiently sensitive to the operating conditions of the radiographic equipment, such as voltage (kV) and charge (mAs). It is a user-friendly programme connected to a database of the hospital or clinic where it is used. After the phantom image is processed, the user can obtain a report summarizing the state of the imaging system with acceptance and constancy results. (Author)
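
    One of the phantom tests, locating a test object through thresholding and morphological clean-up as named above, might be prototyped like this; the threshold rule and structuring element are illustrative assumptions, not the authors' software:

```python
import numpy as np
from scipy import ndimage

def detect_phantom_objects(image):
    """Locate high-contrast phantom test objects via histogram
    thresholding followed by morphological clean-up."""
    threshold = image.mean() + 2 * image.std()    # histogram-based threshold
    binary = image > threshold
    binary = ndimage.binary_opening(binary, structure=np.ones((3, 3)))
    labels, n = ndimage.label(binary)
    return ndimage.center_of_mass(binary, labels, list(range(1, n + 1)))

img = np.zeros((128, 128))
img[60:68, 60:68] = 100.0                         # simulated test object
img += np.random.default_rng(1).normal(0, 1, img.shape)
print(detect_phantom_objects(img))                # ~[(63.5, 63.5)]
```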

  7. Radiological equipment analyzed by specific developed phantoms and software

    Energy Technology Data Exchange (ETDEWEB)

    Soto, M.; Campayo, J. M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, Local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Mayo, P. [TITANIA Servicios Tecnologicos SL, Sorolla Center, Local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Verdu, G.; Rodenas, F., E-mail: m.soto@lainsa.co [ISIRYIM Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain)

    2010-10-15

    The use of radiographic phantoms specifically designed to evaluate the operation of radiographic equipment allows the image quality obtained with this equipment to be studied in an objective way. In digital radiographic equipment, the analysis of image quality can be computerized because the image can be acquired with different technologies, namely computed radiography (phosphor plate) or direct radiography (detector). For film-screen equipment this analysis can be applied by digitizing the image with a professional scanner. In this work we present an application to automatically assess the constancy of image quality in the imaging chain of radiographic equipment. The application comprises purpose-designed radiographic phantoms, adapted to conventional and dental equipment, and software developed specifically for the automatic evaluation of phantom image quality. The software is based on digital image processing techniques that allow the automatic detection of the different phantom tests by edge detectors, morphological operators, histogram thresholding, etc. The developed utility is sufficiently sensitive to the operating conditions of the radiographic equipment, such as voltage (kV) and charge (mAs). It is a user-friendly programme connected to a database of the hospital or clinic where it is used. After the phantom image is processed, the user can obtain a report summarizing the state of the imaging system with acceptance and constancy results. (Author)

  8. Development of raphe serotonin neurons from specification to guidance.

    Science.gov (United States)

    Kiyasova, Vera; Gaspar, Patricia

    2011-11-01

    The main features of the development of the serotonin (5-HT) raphe neurons have been known for many years but more recent molecular studies, using mouse genetics, have since unveiled several intriguing aspects of the specification of the raphe serotonergic system. These studies indicated that, although all 5-HT neurons in the raphe follow the same general program for their specification, there are also clear regional differences in the way that these neurons are specified and are guided towards different brain targets. Here we overview recent progress made in the understanding of the developmental programming of serotonergic neurons in the mouse raphe, emphasizing data showing how heterogeneous subsets of 5-HT neurons may be generated. Serotonergic progenitors are produced in the brainstem in different rhombomeres under the influence of a set of secreted factors, sonic hedgehog and fibroblast growth factors, which determine their position in the neural tube. Two main transcriptional gene networks are involved in the specification of 5-HT identity, with Lmx1b and Pet1 transcription factors as main players. A differential requirement for Pet1 was, however, revealed, which underlies an anatomical and functional diversity. Transcriptional programs controlling 5-HT identity could also impact axon guidance mechanisms directing 5-HT neurons to their targets. Although no direct links have yet been established, a large set of molecular determinants have already been shown to be involved in the growth, axon guidance and targeting of 5-HT raphe neurons, particularly within the forebrain. Alterations in the molecular mechanisms involved in 5-HT development are likely to have significant roles in mood disease predisposition. © 2011 The Authors. European Journal of Neuroscience © 2011 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.

  9. Developing ontological model of computational linear algebra - preliminary considerations

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Lirkov, I.

    2013-10-01

    The aim of this paper is to propose a method for application of ontologically represented domain knowledge to support Grid users. The work is presented in the context provided by the Agents in Grid system, which aims at development of an agent-semantic infrastructure for efficient resource management in the Grid. Decision support within the system should provide functionality beyond the existing Grid middleware, specifically, help the user to choose optimal algorithm and/or resource to solve a problem from a given domain. The system assists the user in at least two situations. First, for users without in-depth knowledge about the domain, it should help them to select the method and the resource that (together) would best fit the problem to be solved (and match the available resources). Second, if the user explicitly indicates the method and the resource configuration, it should "verify" if her choice is consistent with the expert recommendations (encapsulated in the knowledge base). Furthermore, one of the goals is to simplify the use of the selected resource to execute the job; i.e., provide a user-friendly method of submitting jobs, without required technical knowledge about the Grid middleware. To achieve the mentioned goals, an adaptable method of expert knowledge representation for the decision support system has to be implemented. The selected approach is to utilize ontologies and semantic data processing, supported by multicriterial decision making. As a starting point, an area of computational linear algebra was selected to be modeled, however, the paper presents a general approach that shall be easily extendable to other domains.

  10. Development of the Shimadzu computed tomographic scanner SCT-200N

    International Nuclear Information System (INIS)

    Ishihara, Hiroshi; Yamaoka, Nobuyuki; Saito, Masahiro

    1982-01-01

    The Shimadzu Computed Tomographic Scanner SCT-200N has been developed as an ideal CT scanner for diagnosing the head and spine. Due to the large aperture, moderate scan time and the Zoom Scan Mode, any part of the body can be scanned. High-quality images can be obtained by adopting a precisely stabilized X-ray unit and a densely packed array of 64 detectors. As for its operation, the capability of computed radiography (CR) prior to patient positioning and real-time reconstruction ensure efficient patient throughput. Details of the SCT-200N are described in this paper. (author)

  11. Developing Decision-Making Skill: Experiential Learning in Computer Games

    OpenAIRE

    Kurt A. April; Katja M. J. Goebel; Eddie Blass; Jonathan Foster-Pedley

    2012-01-01

    This paper explores the value that computer and video games bring to learning and leadership, examining how games work as learning environments and the impact they have on personal development. The study looks at decisiveness, decision-making ability and styles, and at how this leadership-related skill is learnt through different paradigms. The paper compares the learning from a lecture to the learning from a designed computer game, both of which have the same content through the use of a s...

  12. Computational modeling of the mathematical phantoms of the Brazilian woman to internal dosimetry calculations and for comparison of the absorbed fractions with specific reference women

    International Nuclear Information System (INIS)

    Ximenes, Edmir; Guimaraes, Maria Ines C. C.

    2008-01-01

    The theme of this work is the study of the concept of mathematical dummies - also called phantoms - used in internal dosimetry and radiation protection, from the perspective of computer simulations. In this work the mathematical phantom of the Brazilian woman was developed, to be used as the basis for calculations of Specific Absorbed Fractions (SAFs) in the body's organs and skeleton for diagnostic or therapeutic purposes in nuclear medicine. The phantom developed here is similar in form to the Snyder phantom, making it more realistic for the anthropomorphic conditions of Brazilian women. To this end the Monte Carlo formalism was used, through computer modeling. As a contribution to the objectives of this study, the computer system cFAE (consultation of Specific Absorbed Fractions) was developed and implemented, making queries versatile for the researcher user.

  13. Utilizing Computational Probabilistic Methods to Derive Shock Specifications in a Nondeterministic Environment

    Energy Technology Data Exchange (ETDEWEB)

    FIELD JR.,RICHARD V.; RED-HORSE,JOHN R.; PAEZ,THOMAS L.

    2000-10-25

    One of the key elements of the Stochastic Finite Element Method, namely the polynomial chaos expansion, has been utilized in a nonlinear shock and vibration application. As a result, the computed response was expressed as a random process, which is an approximation to the true solution process and can be thought of as a generalization of solutions given as statistics only. This approximation to the response process was then used to derive an analytically-based design specification for component shock response that guarantees a balanced level of marginal reliability. Hence, this analytically-based reference SRS might lead to an improvement over the somewhat ad hoc test-based reference in the sense that it will not exhibit regions of conservativeness, nor lead to overtesting of the design.
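
    For intuition, the "response expressed as a random process" is, in the polynomial chaos setting, an expansion u(ξ) = Σᵢ cᵢΨᵢ(ξ) in polynomials orthogonal under the input distribution, so response statistics follow directly from the coefficients. A minimal one-dimensional Hermite-chaos sketch with illustrative coefficients, not the authors' implementation:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval

# Response as a 1-D Hermite (probabilists') chaos expansion:
#   u(xi) = sum_i c_i He_i(xi),  xi ~ N(0, 1)
coeffs = np.array([2.0, 0.5, 0.1])    # illustrative chaos coefficients

mean = coeffs[0]                      # E[He_i] = 0 for i >= 1
# Orthogonality of He_i under N(0,1): Var[u] = sum_{i>=1} c_i^2 * i!
var = sum(c**2 * math.factorial(i) for i, c in enumerate(coeffs) if i >= 1)

# Monte Carlo check of the derived statistics
xi = np.random.default_rng(1).standard_normal(200_000)
samples = hermeval(xi, coeffs)
print(mean, var, samples.mean().round(3), samples.var().round(3))
```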

  14. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  15. Left fronto-temporal dynamics during agreement processing: evidence for feature-specific computations.

    Science.gov (United States)

    Molinaro, Nicola; Barber, Horacio A; Pérez, Alejandro; Parkkonen, Lauri; Carreiras, Manuel

    2013-09-01

    Grammatical agreement is a widespread language phenomenon that indicates formal syntactic relations between words; however, it also conveys basic lexical (e.g. grammatical gender) or semantic (e.g. numerosity) information about a discourse referent. In this study, we focus on the reading of Spanish noun phrases, violating either number or gender determiner-noun agreement compared to grammatical controls. Magnetoencephalographic activity time-locked to the onset of the noun in both types of violation revealed a left-lateralized brain network involving anterior temporal regions (~220 ms) and, later in time, ventro-lateral prefrontal regions (>300 ms). These activations coexist with dependency-specific effects: in an initial step (~170 ms), occipito-temporal regions are employed for fine-grained analysis of the number marking (in Spanish, presence or absence of the suffix '-s'), while anterior temporal regions show increased activation for gender mismatches compared to grammatical controls. The semantic relevance of number agreement dependencies was mainly reflected by left superior temporal increased activity around 340 ms. These findings offer a detailed perspective on the multi-level analyses involved in the initial computation of agreement dependencies, and theoretically support a derivational approach to agreement computation. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Development of computational algorithms for quantification of pulmonary structures

    International Nuclear Information System (INIS)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A.; Pina, Diana R.

    2012-01-01

    High-resolution computed tomography (HRCT) has become the imaging diagnostic exam most commonly used for the evaluation of the sequelae of paracoccidioidomycosis. Subjective evaluation of the radiological abnormalities found on HRCT images does not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. Thus, this research proposes the development of algorithms in the MATLAB® computing environment that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists of selecting a region of interest (ROI) and, through the use of masks, density filters and morphological operators, obtaining a quantification of the injured area relative to the area of a healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, reaching an agreement of 80% for emphysema and 58% for fibrosis. (author)
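
    A common way to express the emphysema quantification within a lung ROI is a density mask: the fraction of pixels below a Hounsfield-unit threshold. A minimal sketch under that assumption; the -950 HU cutoff is a standard literature value, not necessarily the authors' criterion:

```python
import numpy as np

def emphysema_fraction(hu_slice, lung_mask, threshold=-950):
    """Density mask: fraction of lung-ROI pixels below a HU threshold."""
    lung_pixels = hu_slice[lung_mask]
    return np.count_nonzero(lung_pixels < threshold) / lung_pixels.size

# Synthetic slice: healthy lung ~ -800 HU with an emphysematous patch
hu = np.full((128, 128), -800.0)
hu[40:60, 40:60] = -970.0
mask = np.ones(hu.shape, dtype=bool)
print(f"emphysema fraction: {emphysema_fraction(hu, mask):.3f}")
```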

  17. Computer science teacher professional development in the United States: a review of studies published between 2004 and 2014

    Science.gov (United States)

    Menekse, Muhsin

    2015-10-01

    While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer science courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher professional development. In this study, the main goal was to systematically review the studies regarding computer science professional development to understand the scope, context, and effectiveness of these programs in the past decade (2004-2014). Based on 21 journal articles and conference proceedings, this study explored: (1) the type of professional development organization and source of funding, (2) the professional development structure and participants, (3) the goal of professional development and type of evaluation used, (4) the specific computer science concepts and training tools used, and (5) their effectiveness in improving teacher practice and student learning.

  18. Is Neural Activity Detected by ERP-Based Brain-Computer Interfaces Task Specific?

    Directory of Open Access Journals (Sweden)

    Markus A Wenzel

    Full Text Available Brain-computer interfaces (BCIs) that are based on event-related potentials (ERPs) can estimate to which stimulus a user pays particular attention. In typical BCIs, the user silently counts the selected stimulus (which is repeatedly presented among other stimuli) in order to focus the attention. The stimulus of interest is then inferred from the electroencephalogram (EEG). Detecting attention allocation implicitly could also be beneficial for human-computer interaction (HCI), because it would allow software to adapt to the user's interest. However, a counting task would be inappropriate for the envisaged implicit application in HCI. Therefore, the question was addressed whether the detectable neural activity is specific to silent counting, or whether it can also be evoked by other tasks that direct the attention to certain stimuli. Thirteen people performed a silent counting, an arithmetic and a memory task. The tasks required the subjects to pay particular attention to target stimuli of a random color. The stimulus presentation was the same in all three tasks, which allowed a direct comparison of the experimental conditions. Classifiers that were trained to detect the targets in one task, according to patterns present in the EEG signal, could detect targets in all other tasks (irrespective of some task-related differences in the EEG). The neural activity detected by the classifiers is not strictly task-specific but can be generalized over tasks and is presumably a result of the attention allocation or of the augmented workload. The results may hold promise for the transfer of classification algorithms from BCI research to implicit relevance detection in HCI.
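
    The cross-task transfer test described, training a target detector on one task and applying it to the others, might be set up as follows with a shrinkage-regularized linear discriminant on epoch features; the data here are synthetic and all names are illustrative:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def make_task(n=200, d=64, shift=0.4):
    """Synthetic epochs: target trials carry a small additive pattern."""
    y = rng.integers(0, 2, n)
    X = rng.normal(0, 1, (n, d)) + shift * y[:, None]
    return X, y

X_count, y_count = make_task()     # e.g. silent-counting task
X_arith, y_arith = make_task()     # e.g. arithmetic task

clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
clf.fit(X_count, y_count)          # train on one task ...
auc = roc_auc_score(y_arith, clf.decision_function(X_arith))
print(f"cross-task AUC: {auc:.2f}")   # ... detect targets in another
```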

  19. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort for ER process optimization is made using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and oftentimes problematic, the simulation results from these models are inherently uncertain.

  20. Developing a personal computer based expert system for radionuclide identification

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Hakulinen, T.T.

    1990-01-01

    Several expert system development tools are available for personal computers today. We have used one of the LISP-based high-end tools for nearly two years in developing an expert system for identification of gamma sources. The system contains a radionuclide database of 2055 nuclides and 48000 gamma transitions, with a knowledge base of about sixty rules. This application combines a LISP-based inference engine with database management and relatively heavy numerical calculations performed in the C language. The most important feature needed has been the possibility to use LISP and C together with the more advanced object-oriented features of the development tool. The main difficulties have been long response times and the large amount (10-16 MB) of computer memory required.

  1. Development of computed tomography system and image reconstruction algorithm

    International Nuclear Information System (INIS)

    Khairiah Yazid; Mohd Ashhar Khalid; Azaman Ahmad; Khairul Anuar Mohd Salleh; Ab Razak Hamzah

    2006-01-01

    Computed tomography is one of the most advanced and powerful nondestructive inspection techniques and is currently used in many different industries. In several CT systems, detection has been performed by a combination of an X-ray image intensifier and a charge-coupled device (CCD) camera, or by using a line-array detector. The recent development of X-ray flat panel detectors has made fast CT imaging feasible and practical. This paper therefore describes the arrangement of a new detection system, using the existing high-resolution (127 μm pixel size) flat panel detector at MINT, and the image reconstruction technique developed. The aim of the project is to develop a prototype flat-panel-detector-based CT imaging system for NDE. The prototype consisted of an X-ray tube, a flat panel detector system, a rotation table and a computer system to control the sample motion and image acquisition. Hence this project is divided into two major tasks: firstly, to develop the image reconstruction algorithm, and secondly, to integrate the X-ray imaging components into one CT system. An image reconstruction algorithm using the filtered back-projection method was developed and compared to other techniques. MATLAB was the tool used for the simulations and computations in this project. (Author)
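
    The filtered back-projection step named above can be prototyped in a few lines; a sketch using scikit-image's Radon-transform utilities in place of the authors' MATLAB code:

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Simulate projections of a test object and reconstruct by FBP
image = rescale(shepp_logan_phantom(), 0.25)
angles = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
sinogram = radon(image, theta=angles)               # forward projection
reconstruction = iradon(sinogram, theta=angles,
                        filter_name="ramp")         # filtered back-projection
print(f"RMS error: {np.sqrt(np.mean((reconstruction - image) ** 2)):.4f}")
```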

  2. BrEPS: a flexible and automatic protocol to compute enzyme-specific sequence profiles for functional annotation

    Directory of Open Access Journals (Sweden)

    Schomburg D

    2010-12-01

    Full Text Available Abstract Background Models for the simulation of metabolic networks require the accurate prediction of enzyme function. Based on a genomic sequence, enzymatic functions of gene products are today mainly predicted by sequence database searching and operon analysis. Other methods can support these techniques: We have developed an automatic method "BrEPS" that creates highly specific sequence patterns for the functional annotation of enzymes. Results The enzymes in the UniprotKB are identified and their sequences compared against each other with BLAST. The enzymes are then clustered into a number of trees, where each tree node is associated with a set of EC-numbers. The enzyme sequences in the tree nodes are aligned with ClustalW. The conserved columns of the resulting multiple alignments are used to construct sequence patterns. In the last step, we verify the quality of the patterns by computing their specificity. Patterns with low specificity are omitted and recomputed further down in the tree. The final high-quality patterns can be used for functional annotation. We ran our protocol on a recent Swiss-Prot release and show statistics, as well as a comparison to PRIAM, a probabilistic method that is also specialized on the functional annotation of enzymes. We determine the amount of true positive annotations for five common microorganisms with data from BRENDA and AMENDA serving as standard of truth. BrEPS is almost on par with PRIAM, a fact which we discuss in the context of five manually investigated cases. Conclusions Our protocol computes highly specific sequence patterns that can be used to support the functional annotation of enzymes. The main advantages of our method are that it is automatic and unsupervised, and quite fast once the patterns are evaluated. The results show that BrEPS can be a valuable addition to the reconstruction of metabolic networks.
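
    The pattern-construction step, turning the conserved columns of a multiple alignment into a sequence pattern, might look like the following simplified sketch (toy sequences, not BrEPS itself, which additionally scores pattern specificity against UniProtKB):

```python
import re

alignment = [           # toy multiple alignment (one enzyme cluster)
    "GDSLH-KTE",
    "GDSLHAKTE",
    "GDSAH-KSE",
]

def column_pattern(aln):
    """Build a regex from fully conserved columns; others become wildcards."""
    pattern = []
    for col in zip(*aln):
        residues = {c for c in col if c != "-"}
        if len(residues) == 1 and "-" not in col:
            pattern.append(residues.pop())     # conserved column
        else:
            pattern.append(".")                # variable or gapped column
    return "".join(pattern)

pat = column_pattern(alignment)
print(pat)                                     # GDS.H.K.E
print(bool(re.search(pat, "MKGDSLHAKTEW")))    # True: the pattern matches
```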

  3. A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions.

    Science.gov (United States)

    Kashihara, Koji

    2014-01-01

    Unlike assistive technology for verbal communication, the brain-machine or brain-computer interface (BMI/BCI) has not been established as a non-verbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, therefore, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based non-verbal communication tools. To these ends, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified cortical activities. The conditioned neutral face-triggered event-related potentials that originated from the posterior temporal lobe statistically significantly changed during late face processing (600-700 ms) after stimulus, rather than in early face processing activities, such as P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus (FG). This study also developed an efficient method for detecting implicit negative emotional responses to specific faces by using EEG signals. A classification method based on a support vector machine enables the easy classification of neutral faces that trigger specific individual emotions. In
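
    The classification step mentioned at the end, a support vector machine separating neutral faces that trigger a conditioned emotion from ordinary neutral faces, might be prototyped as below; the features stand in for late-window ERP amplitudes, and everything is synthetic and illustrative:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
# Synthetic late-window ERP features (e.g. 600-700 ms amplitudes per channel)
X = rng.normal(0, 1, (120, 32))
y = rng.integers(0, 2, 120)          # conditioned vs. unconditioned faces
X[y == 1] += 0.5                     # conditioned faces shift the response

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
print(cross_val_score(clf, X, y, cv=5).mean())
```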

  4. Development of Organ-Specific Donor Risk Indices

    Science.gov (United States)

    Akkina, Sanjeev K.; Asrani, Sumeet K.; Peng, Yi; Stock, Peter; Kim, Ray; Israni, Ajay K.

    2012-01-01

    Due to the shortage of deceased donor organs, transplant centers accept organs from marginal deceased donors, including older donors. Organ-specific donor risk indices have been developed to predict graft survival using various combinations of donor and recipient characteristics. We will review the kidney donor risk index (KDRI) and liver donor risk index (LDRI) and compare and contrast their strengths, limitations, and potential uses. The Kidney Donor Risk Index has a potential role in developing new kidney allocation algorithms. The Liver Donor Risk Index allows for greater appreciation of the importance of donor factors, particularly for hepatitis C-positive recipients; as the donor risk index increases, rates of allograft and patient survival among these recipients decrease disproportionately. Use of livers with high donor risk index is associated with increased hospital costs independent of recipient risk factors, and transplanting livers with high donor risk index into patients with Model for End-Stage Liver Disease scores Donor Risk Index has limited this practice. Significant regional variation in donor quality, as measured by the Liver Donor Risk Index, remains in the United States. We also review other potential indices for liver transplant, including donor-recipient matching and the retransplant donor risk index. While substantial progress has been made in developing donor risk indices to objectively assess donor variables that affect transplant outcomes, continued efforts are warranted to improve these indices to enhance organ allocation policies and optimize allograft survival. PMID:22287036

  5. Development of organ-specific donor risk indices.

    Science.gov (United States)

    Akkina, Sanjeev K; Asrani, Sumeet K; Peng, Yi; Stock, Peter; Kim, W Ray; Israni, Ajay K

    2012-04-01

    Because of the shortage of deceased donor organs, transplant centers accept organs from marginal deceased donors, including older donors. Organ-specific donor risk indices have been developed to predict graft survival with various combinations of donor and recipient characteristics. Here we review the kidney donor risk index (KDRI) and the liver donor risk index (LDRI) and compare and contrast their strengths, limitations, and potential uses. The KDRI has a potential role in developing new kidney allocation algorithms. The LDRI allows a greater appreciation of the importance of donor factors, particularly for hepatitis C virus-positive recipients; as the donor risk index increases, the rates of allograft and patient survival among these recipients decrease disproportionately. The use of livers with high donor risk indices is associated with increased hospital costs that are independent of recipient risk factors, and the transplantation of livers with high donor risk indices into patients with Model for End-Stage Liver Disease scores indices for liver transplantation, including donor-recipient matching and the retransplant donor risk index. Although substantial progress has been made in developing donor risk indices to objectively assess donor variables that affect transplant outcomes, continued efforts are warranted to improve these indices to enhance organ allocation policies and optimize allograft survival. Copyright © 2012 American Association for the Study of Liver Diseases.

  6. The Experiment Method for Manufacturing Grid Development on Single Computer

    Institute of Scientific and Technical Information of China (English)

    XIAO Youan; ZHOU Zude

    2006-01-01

    In this paper, an experimental method for Manufacturing Grid application system development in a single-personal-computer environment is proposed. The characteristic of the proposed method is that it constructs a full prototype Manufacturing Grid application system hosted on a single personal computer using virtual machine technology. Firstly, all the Manufacturing Grid physical resource nodes are built on an abstraction layer of a single personal computer with virtual machine technology. Secondly, all the virtual Manufacturing Grid resource nodes are connected with a virtual network and the application software is deployed on each Manufacturing Grid node. We can then obtain a prototype Manufacturing Grid application system working on a single personal computer, and carry out experiments on this foundation. Compared with the known experimental methods for Manufacturing Grid application system development, the proposed method retains their advantages, such as low cost, simple operation, and easily obtained, trustworthy experimental results. The Manufacturing Grid application system constructed with the proposed method has high scalability, stability and reliability, and can be migrated to the real application environment rapidly.

  7. Beyond computer literacy: supporting youth's positive development through technology.

    Science.gov (United States)

    Bers, Marina Umaschi

    2010-01-01

    In a digital era in which technology plays a role in most aspects of a child's life, having the competence and confidence to use computers might be a necessary step, but not a goal in itself. Developing character traits that will serve children to use technology in a safe way to communicate and connect with others, and providing opportunities for children to make a better world through the use of their computational skills, is just as important. The Positive Technological Development framework (PTD), a natural extension of the computer literacy and the technological fluency movements that have influenced the world of educational technology, adds psychosocial, civic, and ethical components to the cognitive ones. PTD examines the developmental tasks of a child growing up in our digital era and provides a model for developing and evaluating technology-rich youth programs. The explicit goal of PTD programs is to support children in the positive uses of technology to lead more fulfilling lives and make the world a better place. This article introduces the concept of PTD and presents examples of the Zora virtual world program for young people that the author developed following this framework.

  8. Computational dissection of human episodic memory reveals mental process-specific genetic profiles

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G.; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J.-F.

    2015-01-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory. PMID:26261317

  9. Computational dissection of human episodic memory reveals mental process-specific genetic profiles.

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J-F

    2015-09-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory.

  10. The level 1 and 2 specification for parallel benchmark and a benchmark test of scalar-parallel computer SP2 based on the specifications

    International Nuclear Information System (INIS)

    Orii, Shigeo

    1998-06-01

    A benchmark specification for the performance evaluation of parallel computers for numerical analysis is proposed. The Level 1 benchmark, a conventional benchmark based on processing time, measures the performance of computers running a code. The Level 2 benchmark proposed in this report is intended to explain the reasons for that performance. As an example, the scalar-parallel computer SP2 is evaluated with this benchmark specification for a molecular dynamics code. As a result, the main causes suppressing parallel performance are the maximum bandwidth and the start-up time of communication between nodes. The start-up time in particular is proportional not only to the number of processors but also to the number of particles. (author)
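
    The two causes identified, start-up time and maximum bandwidth, are the parameters of the standard linear communication-cost model t(m) = t_startup + m / bandwidth, which also explains why sending one message per particle scales badly. A small illustration with assumed hardware numbers:

```python
def comm_time(message_bytes, t_startup=40e-6, bandwidth=35e6):
    """Linear model: time = start-up latency + size / bandwidth."""
    return t_startup + message_bytes / bandwidth

# Many small messages (one per particle) vs. one aggregated message
n_particles = 10_000
per_particle = n_particles * comm_time(8)   # start-up latency dominates
aggregated = comm_time(8 * n_particles)     # bandwidth dominates
print(f"{per_particle * 1e3:.1f} ms vs {aggregated * 1e3:.2f} ms")
```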

  11. Services for domain specific developments in the Cloud

    Science.gov (United States)

    Schwichtenberg, Horst; Gemuend, André

    2015-04-01

    We will discuss and demonstrate the possibilities of new Cloud services in which the complete development of code takes place in the Cloud, covering the whole cycle from programming to testing. This can also be combined with dedicated, research-domain-specific services, hiding the burden of accessing the available infrastructures. As an example, we will show a service intended to complement the VERCE project's infrastructure: a service that utilizes Cloud resources to offer simplified execution of data pre- and post-processing scripts. It offers users access to the ObsPy seismological toolbox for processing data with the Python programming language, executed on virtual Cloud resources in a secured sandbox. The solution encompasses a frontend with a modern graphical user interface, a messaging infrastructure, as well as Python worker nodes for background processing. All components are deployable in the Cloud and have been tested on different environments based on OpenStack and OpenNebula. Deployments on commercial, public Clouds will be tested in the future.

  12. Development of a computer-based pulsed NMR thermometer

    International Nuclear Information System (INIS)

    Hobeika, Alexandre; Haard, T.M.; Hoskinson, E.M.; Packard, R.E.

    2003-01-01

    We have designed a fully computer-controlled pulsed NMR system, using the National Instruments PCI-6115 data acquisition board. We use it for millikelvin thermometry and have developed a special control program, written in LabVIEW, for this purpose. It can perform measurements of temperature via the susceptibility or the τ1 dependence. This system requires little hardware, which makes it very versatile, easily reproducible and customizable.
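
    For the susceptibility route, millikelvin NMR thermometry typically rests on the Curie law χ(T) = C/T: calibrate the constant at one known temperature, then invert the measured signal. A minimal sketch under the assumption that the NMR signal is proportional to χ (not the authors' LabVIEW program):

```python
def curie_constant(signal_ref, temp_ref_mk):
    """Calibrate the Curie constant C from one known point: chi = C / T."""
    return signal_ref * temp_ref_mk

def temperature_mk(signal, curie_const):
    """Invert the Curie law for a susceptibility-proportional NMR signal."""
    return curie_const / signal

C = curie_constant(signal_ref=1.0, temp_ref_mk=100.0)   # calibrated at 100 mK
print(temperature_mk(signal=4.0, curie_const=C))        # -> 25.0 mK
```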

  13. Automating Commercial Video Game Development using Computational Intelligence

    OpenAIRE

    Tse G. Tan; Jason Teo; Patricia Anthony

    2011-01-01

    Problem statement: The retail sales of computer and video games have grown enormously during the last few years, not just in the United States (US), but all over the world. This is the reason a lot of game developers and academic researchers have focused on game-related technologies, such as graphics, audio, physics and Artificial Intelligence (AI), with the goal of creating newer and more fun games. In recent years, there has been an increasing interest in game AI for pro...

  14. Development of distributed computer systems for future nuclear power plants

    International Nuclear Information System (INIS)

    Yan, G.; L'Archeveque, J.V.R.

    1978-01-01

    Dual computers have been used for direct digital control in CANDU power reactors since 1963. However, as reactor plants have grown in size and complexity, some drawbacks to centralized control have appeared, such as the surprisingly large amount of cabling required for information transmission. Dramatic changes in component costs and a desire to improve system performance have stimulated a broad-based research and development effort in distributed systems. This paper outlines work in this area.

  15. Multilink manipulator computer control: experience in development and commissioning

    International Nuclear Information System (INIS)

    Holt, J.E.

    1988-11-01

    This report describes development which has been carried out on the multilink manipulator computer control system. The system allows the manipulator to be driven using only two joysticks. The leading link is controlled and the other links follow its path into the reactor, thus avoiding any potential obstacles. The system has been fully commissioned and used with the Sizewell 'A' reactor 2 Multilink TV manipulator. Experience of the use of the system is presented, together with recommendations for future improvements. (author)

  16. Development of a Very Dense Liquid Cooled Compute Platform

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Phillip N.; Lipp, Robert J.

    2013-12-10

    The objective of this project was to design and develop a prototype very energy efficient high density compute platform with 100% pumped refrigerant liquid cooling using commodity components and high volume manufacturing techniques. Testing at SLAC has indicated that we achieved a DCIE of 0.93 against our original goal of 0.85. This number includes both cooling and power supply and was achieved employing some of the highest wattage processors available.

  17. The influence of playing computer games on pupil's development

    OpenAIRE

    Pospíšilová, Lenka

    2008-01-01

    This thesis is about the effects of playing computer games on pupils' and students' behavior. It is divided into a theoretical and an investigative part. The theoretical part is dedicated to the historical development of technologies and the principles of game systems in relation to technical progress. It addresses the psychological, social and biological effects of long-term, intensive playing of games, and shows the positive and negative effects of this activity. The work analyses typical pathological eve...

  18. Present status of computational tools for maglev development

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z.; Chen, S.S.; Rote, D.M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report. 25 refs.

  19. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    Full Text Available A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource to developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and to follow their spatial evolution over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that better matches experimental data. Our computational analysis produces for the first time a two dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axis. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulations taking tissue movement into account and to investigate PD patterning hypothesis.

  20. Development of Nuclear Plant Specific Analysis Simulators with ATLAS

    International Nuclear Information System (INIS)

    Jakubowski, Z.; Draeger, P.; Horche, W.; Pointner, W.

    2006-01-01

    The simulation software ATLAS, based on the best-estimate code ATHLET, has been developed by GRS for a range of applications in the field of nuclear plant safety analysis. Through the application of versatile simulation tools and graphical interfaces, the user should be able to analyse all essential accident scenarios with ATLAS. Detailed analysis simulators for several German and Russian NPPs are being constructed on the basis of ATLAS. An overview of ATLAS is presented in the paper, describing its configuration, the functions performed by the main components, and the relationships among them. A significant part of any power plant simulator is the balance-of-plant (BOP) models, not only because all plant transients and non-LOCA accidents can be initiated by the operation of BOP systems, but also because the response of the plant to transients or accidents is strongly influenced by the automatic operation of BOP systems. Modelling aspects of BOP systems are shown in detail, as is the interface between the process model and the BOP systems. Special emphasis has been put on the BOP model builder, based on a methodology developed at GRS. The BOP modeller, called the GCSM-Generator, is an object-oriented tool which runs on the online expert system G2. It is equipped with utilities to edit the BOP models, to verify them, and to generate GCSM code specific to ATLAS. The communication system of ATLAS presents the results of the simulation graphically and allows the execution of the simulation process to be influenced interactively (malfunctions, manual control). Displays for communication with the simulated processes and presentation of calculation results are also presented. In the framework of the verification of simulation models, different tools are used, e.g. the PC code MATHCAD for calculation and documentation, the ATHLET-Input-Graphic for control of geometry data, and the expert system G2 for the development of BOP models. The validation procedure and selected analysis results

  1. The Computational Development of Reinforcement Learning during Adolescence.

    Directory of Open Access Journals (Sweden)

    Stefano Palminteri

    2016-06-01

    Full Text Available Adolescence is a period of life characterised by changes in learning and decision-making. Learning and decision-making do not rely on a unitary system, but instead require the coordination of different cognitive processes that can be mathematically formalised as dissociable computational modules. Here, we aimed to trace the developmental time-course of the computational modules responsible for learning from reward or punishment, and learning from counterfactual feedback. Adolescents and adults carried out a novel reinforcement learning paradigm in which participants learned the association between cues and probabilistic outcomes, where the outcomes differed in valence (reward versus punishment) and feedback was either partial or complete (either the outcome of the chosen option only, or the outcomes of both the chosen and unchosen option, were displayed). Computational strategies changed during development: whereas adolescents' behaviour was better explained by a basic reinforcement learning algorithm, adults' behaviour integrated increasingly complex computational features, namely a counterfactual learning module (enabling enhanced performance in the presence of complete feedback) and a value contextualisation module (enabling symmetrical reward and punishment learning). Unlike adults, adolescent performance did not benefit from counterfactual (complete) feedback. In addition, while adults learned symmetrically from both reward and punishment, adolescents learned from reward but were less likely to learn from punishment. This tendency to rely on rewards and not to consider alternative consequences of actions might contribute to our understanding of decision-making in adolescence.
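
    The two learner types contrasted here differ in one update rule: a basic learner updates only the chosen option's value, while the counterfactual module also updates the unchosen option from the foregone outcome. A minimal sketch with illustrative learning rates, not the paper's fitted model:

```python
import numpy as np

def q_update(q, choice, outcome, alpha=0.3):
    """Basic Rescorla-Wagner update of the chosen option only."""
    q[choice] += alpha * (outcome - q[choice])

def q_update_counterfactual(q, choice, outcome, foregone, alpha=0.3):
    """Counterfactual module: also learn from the unchosen option."""
    q_update(q, choice, outcome, alpha)
    q[1 - choice] += alpha * (foregone - q[1 - choice])

q = np.zeros(2)
# Complete-feedback trial: chose option 0, got 0 points; option 1 would have paid 1
q_update_counterfactual(q, choice=0, outcome=0.0, foregone=1.0)
print(q)   # [0.0, 0.3] -- the unchosen option's value has been updated
```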

  2. The Goal Specificity Effect on Strategy Use and Instructional Efficiency during Computer-Based Scientific Discovery Learning

    Science.gov (United States)

    Kunsting, Josef; Wirth, Joachim; Paas, Fred

    2011-01-01

    Using a computer-based scientific discovery learning environment on buoyancy in fluids we investigated the "effects of goal specificity" (nonspecific goals vs. specific goals) for two goal types (problem solving goals vs. learning goals) on "strategy use" and "instructional efficiency". Our empirical findings close an important research gap,…

  3. An Overview of Recent Developments in Cognitive Diagnostic Computer Adaptive Assessments

    Directory of Open Access Journals (Sweden)

    Alan Huebner

    2010-01-01

    Full Text Available Cognitive diagnostic modeling has become an exciting new field of psychometric research. These models aim to diagnose examinees' mastery status of a group of discretely defined skills, or attributes, thereby providing them with detailed information regarding their specific strengths and weaknesses. Combining cognitive diagnosis with computer adaptive assessments has emerged as an important part of this new field. This article aims to provide practitioners and researchers with an introduction to and overview of recent developments in cognitive diagnostic computer adaptive assessments.

  4. Development of antifertility vaccine using sperm specific proteins

    Directory of Open Access Journals (Sweden)

    A H Bandivdekar

    2014-01-01

    Full Text Available Sperm proteins are known to be associated with normal fertilization, as auto- or iso-antibodies to these proteins may cause infertility. Therefore, sperm proteins have been considered potential candidates for the development of an antifertility vaccine. Sperm proteins that have proved to be promising antigens for a contraceptive vaccine include lactate dehydrogenase (LDH-C4), protein hyaluronidase (PH-20), and Eppin. Immunization with LDH-C4 reduced fertility in female baboons but not in female cynomolgus macaques. Active immunization with PH-20 resulted in 100 per cent inhibition of fertility in male guinea pigs, but it induced autoimmune orchitis. Immunization with Eppin elicited high antibody titres in 78 per cent of immunized monkeys and induced infertility, but the immunopathological effect of immunization was not examined. Human sperm antigen (80kDa HSA) is a sperm-specific, highly immunogenic and conserved sperm protein. Active immunization with 80kDa HSA induced immunological infertility in male and female rats. The partial N-terminal amino acid sequence of 80kDa HSA (Peptide NT) and its peptides (Peptides 1, 2, 3 and 4) obtained by enzymatic digestion did not show homology with any of the known proteins in the gene bank. Peptides NT, 1, 2 and 4 were found to mimic the immunobiological activity of the native protein. Passive administration of antibodies to peptides NT, 1, 2 and 4 induced infertility in male and female rats, and peptide 1 was found to be most effective in suppressing fertility. Active immunization with keyhole limpet haemocyanin (KLH)-conjugated synthetic peptide 1 impaired fertility in all the male rabbits and six of the seven male marmosets. The fertility was restored following a decline in antibody titre. All these findings on 80kDa HSA suggest that the synthetic Peptide 1 of 80kDa HSA is a promising candidate for the development of a male contraceptive vaccine.

  5. The development of AR book for computer learning

    Science.gov (United States)

    Phadung, Muneeroh; Wani, Najela; Tongmnee, Nur-aiynee

    2017-08-01

    Educators need to provide alternative educational tools to foster students' learning outcomes. By using AR technology to create exciting edutainment experiences, this paper presents how augmented reality (AR) can be applied in education. This study aims to develop an AR book for tenth grade students (age 15-16) and evaluate its quality. The AR book was developed following the ADDIE framework processes to provide computer learning on software knowledge. The content accorded with the current Thai education curriculum. The AR book had 10 pages in three topics (the first was "Introduction," the second was "System Software" and the third was "Application Software"). Each page contained markers that placed virtual objects (2D animation and video clips). The obtained data were analyzed in terms of average and standard deviation. The validity of the multimedia design of the AR book was assessed by three experts in multimedia design. A five-point Likert scale was used and the values were x̄ = 4.84, S.D. = 1.27, which referred to very high. Moreover, three content experts, who specialize in computer teaching, evaluated the AR book's validity. The values determined by the experts were x̄ = 4.69, S.D. = 0.29, which referred to very high. Implications for future study and education are discussed.

  6. Development of Wagle Health-Specific Religiousness scale.

    Science.gov (United States)

    Wagle, Ann M; Champion, Victoria L; Russell, Kathleen M; Rawl, Susan M

    2009-01-01

    African American women have a lower rate of regular mammography screening, resulting in a higher incidence of advanced-stage breast cancer at diagnosis and a lower 5-year survival rate as compared with white women. Researchers have demonstrated that several health beliefs relate to mammography screening in African American women, but little attention has been paid to the importance of religiousness. Although some authors have attempted to determine a link between religiousness and health, we lack a valid and reliable instrument to measure religiousness in the context of health behaviors. The purpose of this article is to describe the development and psychometric testing of the Wagle Health-Specific Religiousness (WHSR) scale, an instrument used to measure religious beliefs and the influence of those beliefs on mammography screening for African American women. A sample of 344 low-income African American women who were nonadherent to mammography at accrual and were participating in a randomized trial completed the WHSR. Data from this trial were used to determine the validity and reliability of the WHSR. The 19-item WHSR scale had a Cronbach alpha of .94. Construct validity was supported via factor analysis and analysis of theoretical relationships. Although further testing is warranted, this analysis indicates that the concept of religiousness is an important component of mammography behavior in African American women.

  7. Development of a specific geological mapping software under MAPGIS

    International Nuclear Information System (INIS)

    Zhang Wenkai

    2010-01-01

    The most commonly used mapping software in geological exploration is the MAPGIS system, and the relevant mapping standards are based on it. The software offers flexible functions but has several shortcomings: many parameters must be selected, it is difficult to master, different users choose different parameters, and efficiency is low. To address this, a specific geological mapping software was developed using VC++ on the MAPGIS platform. In accordance with the standards, toolbars were built for strata, rock, geographic information, materials, etc. Pressing a toolbar button selects the corresponding parameters; the toolbar menus can be modified to select parameters for each working area, and legends can be sorted automatically. As a result, mapping speed is greatly improved and parameter usage is kept consistent. The software can convert between Gauss coordinates and longitude-latitude coordinates, draw points and map frames by longitude-latitude, and produce responsibility forms, plan diagrams and profiles. The software also improves the clipping, topologizing and node-catching methods. Application of the software indicates that it can greatly improve the speed of geological mapping and raise the standardization level of the final maps. (authors)
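
    The Gauss-to-geographic conversion mentioned above is a standard transverse Mercator (Gauss-Krüger) projection; outside MAPGIS it can be sketched, for example, with the pyproj library (the ellipsoid, central meridian and false easting below are illustrative assumptions and must match the survey zone actually used):

        from pyproj import Transformer

        # Illustrative Gauss-Krüger definition: Krassovsky ellipsoid, central
        # meridian 117°E, 500 km false easting (typical of Chinese 6-degree zones).
        gk = "+proj=tmerc +lat_0=0 +lon_0=117 +k=1 +x_0=500000 +y_0=0 +ellps=krass +units=m"
        to_gk = Transformer.from_crs("EPSG:4326", gk, always_xy=True)

        x, y = to_gk.transform(116.5, 39.9)                    # lon/lat -> easting/northing
        lon, lat = to_gk.transform(x, y, direction="INVERSE")  # and back again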

  8. Subject-specific computational modeling of DBS in the PPTg area

    Directory of Open Access Journals (Sweden)

    Laura M. Zitella

    2015-07-01

    Full Text Available Deep brain stimulation (DBS) in the pedunculopontine tegmental nucleus (PPTg) has been proposed to alleviate medically intractable gait difficulties associated with Parkinson's disease. Clinical trials have shown somewhat variable outcomes, stemming in part from surgical targeting variability, modulating fiber pathways implicated in side effects, and a general lack of mechanistic understanding of DBS in this brain region. Subject-specific computational models of DBS are a promising tool to investigate the underlying therapy and side effects. In this study, a parkinsonian rhesus macaque was implanted unilaterally with an 8-contact DBS lead in the PPTg region. Fiber tracts adjacent to PPTg, including the oculomotor nerve, central tegmental tract, and superior cerebellar peduncle, were reconstructed from a combination of pre-implant 7T MRI, post-implant CT, and post-mortem histology. These structures were populated with axon models and coupled with a finite element model simulating the voltage distribution in the surrounding neural tissue during stimulation. This study introduces two empirical approaches to evaluate model parameters. First, incremental monopolar cathodic stimulation (20 Hz, 90 µs pulse width) was evaluated for each electrode, during which a right eyelid flutter was observed at the proximal four contacts (-1.0 to -1.4 mA). These current amplitudes followed closely with model-predicted activation of the oculomotor nerve when assuming an anisotropic conduction medium. Second, PET imaging was collected OFF-DBS and twice during DBS (two different contacts), which supported the model-predicted activation of the central tegmental tract and superior cerebellar peduncle. Together, subject-specific models provide a framework to more precisely predict pathways modulated by DBS.

  9. Development of a Computer Application to Simulate Porous Structures

    Directory of Open Access Journals (Sweden)

    S.C. Reis

    2002-09-01

    Full Text Available Geometric modeling is an important tool to evaluate structural parameters as well as to follow the application of stereological relationships. The acquisition, visualization and analysis of volumetric images of the structure of materials, using computational geometric modeling, facilitates the determination of structural parameters of difficult experimental access, such as topological and morphological parameters. In this work, we developed a geometrical model, implemented in computer software, that simulates random pore structures. The number of nodes, the number of branches (connections between nodes) and the number of isolated parts are obtained. The connectivity (C) is also obtained by this application. Using a list of elements, nodes and branches generated by the software in AutoCAD® command line format, the obtained structure can be viewed and analyzed.
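
    The connectivity reported by such an application is conventionally the first Betti number of the branch-node network; assuming that standard stereological definition (the function name below is ours), the computation from the three reported counts is a one-liner:

        def connectivity(num_nodes: int, num_branches: int, num_isolated_parts: int) -> int:
            """First Betti number of a branch-node network: C = b - n + p.

            This is the standard connectivity (genus) measure used for pore
            networks; it counts the independent closed loops in the structure.
            """
            return num_branches - num_nodes + num_isolated_parts

        # e.g., a single cubic cell of 8 nodes and 12 branches: C = 12 - 8 + 1 = 5
        print(connectivity(8, 12, 1))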

  10. Development of a proton Computed Tomography Detector System

    Energy Technology Data Exchange (ETDEWEB)

    Naimuddin, Md. [Delhi U.; Coutrakon, G. [Northern Illinois U.; Blazey, G. [Northern Illinois U.; Boi, S. [Northern Illinois U.; Dyshkant, A. [Northern Illinois U.; Erdelyi, B. [Northern Illinois U.; Hedin, D. [Northern Illinois U.; Johnson, E. [Northern Illinois U.; Krider, J. [Northern Illinois U.; Rukalin, V. [Northern Illinois U.; Uzunyan, S. A. [Northern Illinois U.; Zutshi, V. [Northern Illinois U.; Fordt, R. [Fermilab; Sellberg, G. [Fermilab; Rauch, J. E. [Fermilab; Roman, M. [Fermilab; Rubinov, P. [Fermilab; Wilson, P. [Fermilab

    2016-02-04

    Computed tomography is one of the most promising new methods to image abnormal tissues inside the human body. Tomography is also used to position the patient accurately before radiation therapy. Hadron therapy for treating cancer has become one of the most advantageous and safe options. In order to fully utilize the advantages of hadron therapy, it is necessary to perform radiography with hadrons as well. In this paper we present the development of a proton computed tomography system. Our second-generation proton tomography system consists of two upstream and two downstream trackers made up of fibers as active material, and a range detector consisting of plastic scintillators. We present details of the detector system, readout electronics, and data acquisition system, as well as the commissioning of the entire system. We also present preliminary results from the test beam of the range detector.

  11. Developing Activities for Teaching Cloud Computing and Virtualization

    Directory of Open Access Journals (Sweden)

    E. Erturk

    2014-10-01

    Full Text Available Cloud computing and virtualization are new but indispensable components of computer engineering and information systems curricula for universities and higher education institutions. Learning about these topics is important for students preparing to work in the IT industry. In many companies, information technology operates under tight financial constraints. Virtualization (for example storage, desktop, and server virtualization) reduces overall IT costs through the consolidation of systems. It also results in reduced loads and energy savings in terms of the power and cooling infrastructure. Therefore it is important to investigate the practical aspects of this topic, both for industry practice and for teaching purposes. This paper demonstrates some activities undertaken recently by students at the Eastern Institute of Technology New Zealand and concludes with general recommendations for IT educators, software developers, and other IT professionals.

  12. Inference of Cancer-specific Gene Regulatory Networks Using Soft Computing Rules

    Directory of Open Access Journals (Sweden)

    Xiaosheng Wang

    2010-03-01

    Full Text Available Perturbations of gene regulatory networks are essentially responsible for oncogenesis. Therefore, inferring the gene regulatory networks is a key step to overcoming cancer. In this work, we propose a method for inferring directed gene regulatory networks based on soft computing rules, which can identify important cause-effect regulatory relations of gene expression. First, we identify important genes associated with a specific cancer (colon cancer) using a supervised learning approach. Next, we reconstruct the gene regulatory networks by inferring the regulatory relations among the identified genes, and their regulated relations by other genes within the genome. We obtain two meaningful findings. One is that upregulated genes are regulated by more genes than downregulated ones, while downregulated genes regulate more genes than upregulated ones. The other one is that tumor suppressors suppress tumor activators and activate other tumor suppressors strongly, while tumor activators activate other tumor activators and suppress tumor suppressors weakly, indicating the robustness of biological systems. These findings provide valuable insights into the pathogenesis of cancer.

  13. Inference of cancer-specific gene regulatory networks using soft computing rules.

    Science.gov (United States)

    Wang, Xiaosheng; Gotoh, Osamu

    2010-03-24

    Perturbations of gene regulatory networks are essentially responsible for oncogenesis. Therefore, inferring the gene regulatory networks is a key step to overcoming cancer. In this work, we propose a method for inferring directed gene regulatory networks based on soft computing rules, which can identify important cause-effect regulatory relations of gene expression. First, we identify important genes associated with a specific cancer (colon cancer) using a supervised learning approach. Next, we reconstruct the gene regulatory networks by inferring the regulatory relations among the identified genes, and their regulated relations by other genes within the genome. We obtain two meaningful findings. One is that upregulated genes are regulated by more genes than downregulated ones, while downregulated genes regulate more genes than upregulated ones. The other one is that tumor suppressors suppress tumor activators and activate other tumor suppressors strongly, while tumor activators activate other tumor activators and suppress tumor suppressors weakly, indicating the robustness of biological systems. These findings provide valuable insights into the pathogenesis of cancer.

  14. GSTARS computer models and their applications, part I: theoretical development

    Science.gov (United States)

    Yang, C.T.; Simoes, F.J.M.

    2008-01-01

    GSTARS is a series of computer models developed by the U.S. Bureau of Reclamation for alluvial river and reservoir sedimentation studies while the authors were employed by that agency. The first version of GSTARS was released in 1986 using Fortran IV for mainframe computers. GSTARS 2.0 was released in 1998 for personal computer application with most of the code in the original GSTARS revised, improved, and expanded using Fortran IV/77. GSTARS 2.1 is an improved and revised GSTARS 2.0 with a graphical user interface. The unique features of all GSTARS models are the conjunctive use of the stream tube concept and of the minimum stream power theory. The application of minimum stream power theory allows the determination of optimum channel geometry with variable channel width and cross-sectional shape. The use of the stream tube concept enables the simulation of river hydraulics using one-dimensional numerical solutions to obtain a semi-two-dimensional presentation of the hydraulic conditions along and across an alluvial channel. According to the stream tube concept, no water or sediment particles can cross the walls of stream tubes, which is valid for many natural rivers. At and near sharp bends, however, sediment particles may cross the boundaries of stream tubes. GSTARS3, based on FORTRAN 90/95, addresses this phenomenon and further expands the capabilities of GSTARS 2.1 for cohesive and non-cohesive sediment transport in rivers and reservoirs. This paper presents the concepts, methods, and techniques used to develop the GSTARS series of computer models, especially GSTARS3. © 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.

  15. Computable general equilibrium model fiscal year 2013 capability development report

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rivera, Michael Kelly [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Boero, Riccardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-17

    This report documents progress made on the continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine its simulation properties in more detail.
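
    For readers unfamiliar with such experiments, the toy single-sector calculation below (plain Python; deliberately far simpler than NCGEM, with made-up factor inputs) illustrates what a baseline-versus-alternative comparison under a 20-percent productivity reduction means:

        # Toy illustration (not NCGEM): output under a Cobb-Douglas technology
        # Y = A * K^alpha * L^(1 - alpha), comparing a baseline with a
        # counterfactual 20-percent reduction in total factor productivity A.
        def output(A, K, L, alpha=0.3):
            return A * K**alpha * L**(1 - alpha)

        base = output(A=1.0, K=100.0, L=200.0)
        shock = output(A=0.8, K=100.0, L=200.0)   # 20-percent productivity cut
        print(f"impact: {100 * (shock - base) / base:.1f}% of baseline output")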

  16. The computer code SEURBNUK/EURDYN (Release 1). Input and output specification

    International Nuclear Information System (INIS)

    Broadhouse, B.J.; Yerkess, A.

    1986-05-01

    SEURBNUK/EURDYN is an extension of SEURBNUK-2, a two-dimensional, axisymmetric, Eulerian, finite element containment code, in which the finite difference thin shell treatment is replaced by a finite element calculation for both thin and thick structures. These codes are designed to model the hydrodynamic development in time of a hypothetical core disruptive accident (HCDA) in a fast breeder reactor. This manual describes the input data specifications needed for the execution of SEURBNUK/EURDYN calculations, with information on output facilities and advice to help users avoid some common difficulties. (UK)

  17. Towards personalised management of atherosclerosis via computational models in vascular clinics: technology based on patient-specific simulation approach

    Science.gov (United States)

    Di Tomaso, Giulia; Agu, Obiekezie; Pichardo-Almarza, Cesar

    2014-01-01

    The development of a new technology based on patient-specific modelling for personalised healthcare in the case of atherosclerosis is presented. Atherosclerosis is the main cause of death in the world and it has become a burden on clinical services as it manifests itself in many diverse forms, such as coronary artery disease, cerebrovascular disease/stroke and peripheral arterial disease. It is also a multifactorial, chronic and systemic process that lasts for a lifetime, putting enormous financial and clinical pressure on national health systems. In this Letter, the postulate is that new technologies for healthcare using computer simulations can, in the future, be developed into in-silico management and support systems. These new technologies will be based on predictive models (including the integration of observations, theories and predictions across a range of temporal and spatial scales, scientific disciplines, key risk factors and anatomical sub-systems) combined with digital patient data and visualisation tools. Although the problem is extremely complex, a simulation workflow and an exemplar application of this type of technology for clinical use is presented, which is currently being developed by a multidisciplinary team following the requirements and constraints of the Vascular Service Unit at the University College Hospital, London. PMID:26609369

  18. Development of Student Information Management System based on Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    Ibrahim A. ALAMERI

    2017-10-01

    Full Text Available The management and provision of information about the educational process is an essential part of effective management of the educational process in institutes of higher education. In this paper, the requirements of a reliable student management system are analyzed, a use-case model of the student information management system is formed, and the architecture of the application is designed and implemented. Regarding the implementation process, modern approaches were used to develop and deploy a reliable online application specifically for cloud computing environments.

  19. Development of an On-Line Surgeon-Specific Operating Room Time Prediction System (Experience with the Michigan Surgical Monitors)

    OpenAIRE

    Brown, Allan C.D.; Schmidt, Nancy M.

    1984-01-01

    The development of a micro-computer application for the on-line prediction of surgeon-specific operating room time using an IBM PC-XT is described. The reasons leading to the project, together with an assessment of the Condor 20 relational database management system as the basis for the application, are discussed.

  20. A Combined Experimental and Computational Approach to Subject-Specific Analysis of Knee Joint Laxity

    Science.gov (United States)

    Harris, Michael D.; Cyr, Adam J.; Ali, Azhar A.; Fitzpatrick, Clare K.; Rullkoetter, Paul J.; Maletsky, Lorin P.; Shelburne, Kevin B.

    2016-01-01

    Modeling complex knee biomechanics is a continual challenge, which has resulted in many models of varying levels of quality, complexity, and validation. Beyond modeling healthy knees, accurately mimicking pathologic knee mechanics, such as after cruciate rupture or meniscectomy, is difficult. Experimental tests of knee laxity can provide important information about ligament engagement and overall contributions to knee stability for the development of subject-specific models that accurately simulate knee motion and loading. Our objective was to provide combined experimental tests and finite-element (FE) models of natural knee laxity that are subject-specific, have one-to-one experiment-to-model calibration, simulate ligament engagement in agreement with the literature, and are adaptable for a variety of biomechanical investigations (e.g., cartilage contact, ligament strain, in vivo kinematics). Calibration involved perturbing ligament stiffness, initial ligament strain, and attachment location until model-predicted kinematics and ligament engagement matched experimental reports. Errors between model-predicted and experimental kinematics were small, and ligament engagement agreed with literature descriptions. These results demonstrate the ability of our constraint models to be customized for multiple individuals and simultaneously call attention to the need to verify that ligament engagement is in good general agreement with the literature. To facilitate further investigations of subject-specific or population-based knee joint biomechanics, data collected during the experimental and modeling phases of this study are available for download by the research community. PMID:27306137
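
    The calibration step described above is, in essence, an optimization over ligament parameters against measured kinematics; a minimal sketch of that loop is given below (Python/SciPy, with a synthetic stand-in for the finite-element model, since the real objective requires running the FE solver; all values are invented):

        import numpy as np
        from scipy.optimize import minimize

        measured = np.array([2.1, 4.3, 1.7])   # hypothetical laxity kinematics (mm / deg)

        def run_model(params):
            """Stand-in for the FE laxity simulation: a smooth synthetic map from
            ligament parameters (stiffness scale, reference strain, attachment
            offset) to predicted kinematics, so the loop below is runnable."""
            k, eps0, dx = params
            return np.array([2.0 * k, 4.0 + eps0, 1.5 + dx])

        def cost(params):
            """Sum of squared errors between predicted and measured kinematics."""
            return np.sum((run_model(params) - measured) ** 2)

        # Perturb ligament properties until model kinematics match the experiment.
        fit = minimize(cost, x0=[1.0, 0.0, 0.0], method="Nelder-Mead")
        print(fit.x)   # calibrated (stiffness, strain, attachment) estimates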

  1. Computational techniques used in the development of coprocessing flowsheets

    International Nuclear Information System (INIS)

    Groenier, W.S.; Mitchell, A.D.; Jubin, R.T.

    1979-01-01

    The computer program SEPHIS, developed to aid in determining optimum solvent extraction conditions for the reprocessing of nuclear power reactor fuels by the Purex method, is described. The program employs a combination of approximate mathematical equilibrium expressions and a transient, stagewise-process calculational method to allow stage and product-stream concentrations to be predicted with accuracy and reliability. The possible applications to inventory control for nuclear material safeguards, nuclear criticality analysis, and process analysis and control are of special interest. The method is also applicable to other countercurrent liquid-liquid solvent extraction processes with known chemical kinetics that may involve multiple solutes and are performed in conventional contacting equipment.
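
    A stripped-down illustration of the transient, stagewise calculational idea (not SEPHIS itself: a single solute, ideal stages, and a constant distribution coefficient standing in for the Purex equilibrium correlations) might look like this:

        import numpy as np

        N, D = 8, 2.0                          # stages; distribution coefficient y = D * x
        aq_flow, org_flow = 1.0, 1.5           # relative aqueous / organic flows
        x = np.zeros(N)                        # aqueous solute concentration per stage
        y = np.zeros(N)                        # organic solute concentration per stage

        for step in range(500):                # march the cascade toward steady state
            x_in = np.concatenate(([1.0], x[:-1]))   # aqueous moves 0 -> N-1; feed x = 1
            y_in = np.concatenate((y[1:], [0.0]))    # solvent moves N-1 -> 0; barren feed
            total = aq_flow * x_in + org_flow * y_in # solute entering each stage
            x = total / (aq_flow + org_flow * D)     # ideal-stage equilibrium split
            y = D * x

        print(f"raffinate x = {x[-1]:.4f}, loaded solvent y = {y[0]:.4f}")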

  2. Development and validation of Monte Carlo dose computations for contrast-enhanced stereotactic synchrotron radiation therapy

    International Nuclear Information System (INIS)

    Vautrin, M.

    2011-01-01

    Contrast-enhanced stereotactic synchrotron radiation therapy (SSRT) is an innovative technique based on localized dose-enhancement effects obtained by reinforced photoelectric absorption in the tumor. Medium energy monochromatic X-rays (50-100 keV) are used for irradiating tumors previously loaded with a high-Z element. Clinical trials of SSRT are being prepared at the European Synchrotron Radiation Facility (ESRF); an iodinated contrast agent will be used. In order to compute the energy deposited in the patient (dose), a dedicated treatment planning system (TPS) has been developed for the clinical trials, based on the ISOgray TPS. This work focuses on the SSRT-specific modifications of the TPS, especially to the PENELOPE-based Monte Carlo dose engine. The TPS uses a dedicated Monte Carlo simulation of medium energy polarized photons to compute the deposited energy in the patient. Simulations are performed considering the synchrotron source, the modeled beamline geometry and finally the patient. Specific materials were also implemented in the voxelized geometry of the patient to account for iodine concentrations in the tumor. The computation process has been optimized and parallelized. Finally, a specific computation of absolute doses and associated irradiation times (instead of monitor units) was implemented. The dedicated TPS was validated with depth dose curves, dose profiles and absolute dose measurements performed at the ESRF in a water tank and in solid water phantoms with or without bone slabs. (author) [fr]
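
    As a flavor of Monte Carlo dose computation at these medium energies, the sketch below (Python; not PENELOPE or the ISOgray engine, and using an approximate linear attenuation coefficient for water near 80 keV) samples photon interaction depths from the exponential attenuation law and tallies them along the beam axis:

        import numpy as np

        rng = np.random.default_rng(42)
        mu = 0.184                      # 1/cm, approx. attenuation of water near 80 keV
        depth_bins = np.zeros(20)       # 1 cm voxels along the beam axis

        for _ in range(100_000):
            depth = rng.exponential(1.0 / mu)     # free path from exponential attenuation
            if depth < len(depth_bins):
                depth_bins[int(depth)] += 1       # score the interaction in its voxel

        profile = depth_bins / depth_bins.max()   # normalized depth-deposition curve
        print(profile[:5])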

  3. Development of a computational technique to measure cartilage contact area.

    Science.gov (United States)

    Willing, Ryan; Lapner, Michael; Lalone, Emily A; King, Graham J W; Johnson, James A

    2014-03-21

    Computational measurement of joint contact distributions offers the benefit of non-invasive measurements of joint contact without the use of interpositional sensors or casting materials. This paper describes a technique for indirectly measuring joint contact based on overlapping of articular cartilage computer models derived from CT images and positioned using in vitro motion capture data. The accuracy of this technique when using the physiological nonuniform cartilage thickness distribution, or simplified uniform cartilage thickness distributions, is quantified through comparison with direct measurements of contact area made using a casting technique. The efficacy of using indirect contact measurement techniques for measuring the changes in contact area resulting from hemiarthroplasty at the elbow is also quantified. Using the physiological nonuniform cartilage thickness distribution reliably measured contact area (ICC=0.727), but not better than assumed bone-specific uniform cartilage thicknesses (ICC=0.673). When a contact pattern agreement score (s(agree)) was used to assess the accuracy of cartilage contact measurements made using physiological nonuniform or simplified uniform cartilage thickness distributions in terms of size, shape and location, their accuracies were not significantly different (p>0.05). The results of this study demonstrate that cartilage contact can be measured indirectly based on the overlapping of cartilage contact models. However, the results also suggest that in some situations, inter-bone distance measurement and an assumed cartilage thickness may suffice for predicting joint contact patterns. Copyright © 2014 Elsevier Ltd. All rights reserved.
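
    The overlap criterion at the heart of the technique can be sketched in a few lines (Python; the array names are ours, and the per-facet distances and areas are assumed to be precomputed from the CT-derived, motion-capture-posed meshes):

        import numpy as np

        def contact_area(tri_areas, inter_surface_dist, thickness_a, thickness_b):
            """A facet counts as 'in contact' when the distance to the opposing
            surface is smaller than the combined cartilage thickness, i.e. the
            two cartilage models overlap there; the contact area is the sum of
            the overlapping facet areas."""
            overlap = inter_surface_dist < (thickness_a + thickness_b)
            return float(np.sum(tri_areas[overlap]))

        # thickness_a / thickness_b may be per-facet arrays (the physiological,
        # nonuniform distribution) or scalars (the simplified bone-specific
        # uniform assumption the paper compares against).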

  4. Transit safety retrofit package development : architecture and design specifications.

    Science.gov (United States)

    2014-05-01

    The Architecture and Design Specifications capture the TRP system architecture and design that fulfills the technical : objectives stated in the TRP requirements document. : The document begins with an architectural overview that identifies and descr...

  5. Development of computational small animal models and their applications in preclinical imaging and therapy research

    NARCIS (Netherlands)

    Xie, Tianwu; Zaidi, Habib

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been developed.

  6. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    Science.gov (United States)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  7. Risk-based technical specifications: Development and application of an approach to the generation of a plant specific real-time risk model

    International Nuclear Information System (INIS)

    Puglia, B.; Gallagher, D.; Amico, P.; Atefi, B.

    1992-10-01

    This report describes a process developed to convert an existing PRA into a model amenable to real-time, risk-based technical specification calculations. In earlier studies (culminating in NUREG/CR-5742), several risk-based approaches to technical specifications were evaluated. A real-time approach using a plant-specific PRA capable of modeling plant configurations as they change was identified as the most comprehensive approach to controlling plant risk. A master fault tree logic model representative of all of the core damage sequences was developed. Portions of the system fault trees were modularized, and supercomponents comprised of component failures with similar effects were developed to reduce the size of the model and the quantification times. Modifications to the master fault tree logic were made to properly model the effect of maintenance and recovery actions. Fault trees representing several actuation systems not modeled in detail in the existing PRA were added to the master fault tree logic. This process was applied to the Surry NUREG-1150 Level 1 PRA. The master logic model was confirmed. The model was then used to evaluate the frequency associated with several plant configurations using the IRRAS code. For all cases analyzed, computational time was less than three minutes. This document, Volume 2, contains Appendices A, B, and C. These provide, respectively: the Surry Technical Specifications Model Database, the Surry Technical Specifications Model, and a list of supercomponents used in the Surry Technical Specifications Model.
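
    The configuration-dependent quantification that such a real-time model performs can be illustrated with a toy rare-event cut-set sum (Python; the basic events, probabilities and cut sets are invented for illustration and bear no relation to the Surry model):

        import math

        basic_events = {"PUMP_A": 1e-3, "PUMP_B": 1e-3, "DG_1": 5e-2, "VALVE_C": 2e-4}
        cut_sets = [("PUMP_A", "PUMP_B"), ("DG_1", "VALVE_C")]

        def top_event_probability(probs, cut_sets, out_for_maintenance=()):
            """Rare-event approximation: sum over minimal cut sets of the product
            of basic-event probabilities. A component out for maintenance is
            treated as failed (probability 1), which is how a changed plant
            configuration raises the computed risk."""
            p = dict(probs)
            for comp in out_for_maintenance:
                p[comp] = 1.0
            return sum(math.prod(p[e] for e in cs) for cs in cut_sets)

        print(top_event_probability(basic_events, cut_sets))                 # baseline
        print(top_event_probability(basic_events, cut_sets, ["PUMP_A"]))     # degraded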

  8. New Developments and Geoscience Applications of Synchrotron Computed Microtomography (Invited)

    Science.gov (United States)

    Rivers, M. L.; Wang, Y.; Newville, M.; Sutton, S. R.; Yu, T.; Lanzirotti, A.

    2013-12-01

    Computed microtomography is the extension to micron spatial resolution of the CAT scanning technique developed for medical imaging. Synchrotron sources are ideal for the method, since they provide a monochromatic, parallel beam with high intensity. High energy storage rings such as the Advanced Photon Source at Argonne National Laboratory produce x-rays with high energy, high brilliance, and high coherence. All of these factors combine to produce an extremely powerful imaging tool for earth science research. Techniques that have been developed include:
    - Absorption and phase contrast computed tomography with spatial resolution below one micron.
    - Differential contrast computed tomography, imaging above and below the absorption edge of a particular element.
    - High-pressure tomography, imaging inside a pressure cell at pressures above 10 GPa.
    - High speed radiography and tomography, with 100 microsecond temporal resolution.
    - Fluorescence tomography, imaging the 3-D distribution of elements present at ppm concentrations.
    - Radiographic strain measurements during deformation at high confining pressure, combined with precise x-ray diffraction measurements to determine stress.
    These techniques have been applied to important problems in earth and environmental sciences, including:
    - The 3-D distribution of aqueous and organic liquids in porous media, with applications in contaminated groundwater and petroleum recovery.
    - The kinetics of bubble formation in magma chambers, which control explosive volcanism.
    - Studies of the evolution of the early solar system from 3-D textures in meteorites.
    - Accurate crystal size distributions in volcanic systems, important for understanding the evolution of magma chambers.
    - The equation-of-state of amorphous materials at high pressure, using both direct measurements of volume as a function of pressure and measurements of the change in x-ray absorption coefficient as a function of pressure.
    - The location and chemical speciation of toxic

  9. Development of a fast running accident analysis computer program for use in a simulator

    International Nuclear Information System (INIS)

    Cacciabue, P.C.

    1985-01-01

    This paper describes how a reactor safety nuclear computer program can be modified and improved with the aim of producing a very fast-running tool to be used as a physical model in a plant simulator, without penalizing the accuracy of results. It also discusses some ideas on how the physical theoretical model can be combined with a driving statistical tool to build up the entire software package to be implemented in the simulator for risk and reliability analysis. The approach to the problem, although applied to a specific computer program, can be considered quite general if an already existing and well tested code is used for the purpose. The computer program considered is ALMOD, originally developed for the analysis of the thermohydraulic and neutronic behaviour of the reactor core, primary circuit and steam generator during operational and special transients. (author)

  10. Individual Stochastic Screening for the Development of Computer Graphics

    Directory of Open Access Journals (Sweden)

    Maja Turčić

    2012-12-01

    Full Text Available With the emergence of new tools and media, art and design have developed into digital computer-generated works. This article presents a sequence for creating art graphics, since the original authors did not publish their procedures. The goal is to discover the mathematics of an image and its programming libretto, with the purpose of organizing a structural base of computer graphics. We elaborate the procedures used to produce graphics known throughout the history of art, which nowadays are also found in design and security graphics. The results are closely related graphics obtained by changing the parameters that initiate them. The aim is to control the graphics, i.e. to use controlled stochastics to achieve desired solutions. Since the artists of the past never published the procedures of their screening methods, their ideas have remained "only" works of art. In this article we present the development of an algorithm that, more or less successfully, simulates those screening solutions. It has been proven that mathematically defined graphical elements can serve as screening elements. New technological and mathematical solutions are introduced into reproduction with individual screening elements to be used in printing.
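
    A minimal example of stochastic screening (Python/NumPy; a bare random-threshold scheme, far simpler than the article's algorithm) shows how controlled randomness turns a grayscale wedge into a printable binary pattern:

        import numpy as np

        def stochastic_screen(gray, seed=1):
            """gray: 2-D array of values in [0, 1]; returns a binary image.
            A fixed seed makes the random screening pattern reproducible,
            i.e. 'controlled' stochastics."""
            rng = np.random.default_rng(seed)
            thresholds = rng.random(gray.shape)     # one random threshold per pixel
            return (gray > thresholds).astype(np.uint8)

        ramp = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))   # grayscale test wedge
        halftone = stochastic_screen(ramp)                    # dot density follows tone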

  11. Fluorescent x-ray computed tomography to visualize specific material distribution

    Science.gov (United States)

    Takeda, Tohoru; Yuasa, Tetsuya; Hoshino, Atsunori; Akiba, Masahiro; Uchida, Akira; Kazama, Masahiro; Hyodo, Kazuyuki; Dilmanian, F. Avraham; Akatsuka, Takao; Itai, Yuji

    1997-10-01

    Fluorescent x-ray computed tomography (FXCT) is being developed to detect non-radioactive contrast materials in living specimens. The FXCT system consists of a silicon channel-cut monochromator, an x-ray slit and a collimator for detection, a scanning table for the target organ, and an x-ray detector for fluorescent and transmitted x-rays. To reduce the Compton scattering overlapping the K(alpha) line, the incident monochromatic x-ray energy was set at 37 keV. At 37 keV, Monte Carlo simulation showed almost complete separation between Compton scattering and the K(alpha) line, while actual experiments revealed a small contamination of Compton scattering on the K(alpha) line. A clear FXCT image of a phantom was obtained. Using this system, the minimal detectable dose of iodine was 30 ng in a volume of 1 mm³, and a linear relationship was demonstrated between the photon counts of fluorescent x-rays and the concentration of the iodine contrast material. The use of high incident x-ray energy increases the signal-to-noise ratio by reducing the Compton scattering on the K(alpha) line.

  12. An X-Ray computed tomography/positron emission tomography system designed specifically for breast imaging.

    Science.gov (United States)

    Boone, John M; Yang, Kai; Burkett, George W; Packard, Nathan J; Huang, Shih-ying; Bowen, Spencer; Badawi, Ramsey D; Lindfors, Karen K

    2010-02-01

    Mammography has served the population of women who are at risk for breast cancer well over the past 30 years. While mammography has undergone a number of changes as digital detector technology has advanced, other modalities such as computed tomography have experienced technological sophistication over this same time frame as well. The advent of large field-of-view flat panel detector systems enables the development of breast CT and several other niche CT applications that rely on cone beam geometry. The breast, it turns out, is well suited to cone beam CT imaging because the lack of bones reduces artifacts, and the natural tapering of the breast anteriorly reduces the x-ray path lengths through the breast at large cone angles, reducing cone beam artifacts as well. We are in the process of designing a third prototype system that will enable the use of breast CT for image-guided interventional procedures. This system will have several copies fabricated so that several breast CT scanners can be used in a multi-institutional clinical trial to better understand the role that this technology can bring to breast imaging.

  13. Analysis and synthesis of digital circuits for a computer of specific purposes

    International Nuclear Information System (INIS)

    Marchand Rosales, E.E.

    1975-01-01

    The circuits described in this paper are part of a computer system designed for the automation of plasma diagnostics using electrostatic probes. The automated system is designed to give: (a) The density of the plasma (state variable) every ten microseconds in binary digits; (b) Probe data, stored for subsequent diagnostics; (c) A graphic and digital display of results; (d) Presentation of numerical diagnostics results in floating point format and in the decimal system for convenience of interpretation. The project is aimed, furthermore, at the development of techniques for the design, construction and adjustment of digital circuits, and at the training of personnel who will apply these techniques in digital instrumentation. A block diagram of the system is discussed in general terms. Methods for analysis and synthesis of the sequential circuits applied to the circuit for aligning and normalizing the floating point format, the format circuit and the operational sequence circuit are also described. Recommendations are made and precautions suggested which it is thought advisable to follow at the stages of design, construction and adjustment of the digital circuits, and these apply also to the equipment and techniques (wire wrapping) used for building the circuits. The adjustment of the digital circuits proved to be satisfactory and a definition panel was thus obtained for the decimal point alignment circuit. It is concluded that the method of synthesis need not always be applied; the cases in which the method is recommended are mentioned, as are those in which the non-formal method of synthesis can be used. (author)

  14. An Examination of Job-Specific Communication in the Computer Industry.

    Science.gov (United States)

    Kidwell, Michael E.

    Enhancing the awareness of (1) the organization of "typical" computer projects, (2) the communication that emerges from those structures, and (3) the problems that technical communications inherently hold is the purpose of this paper. It begins by presenting the organization of the working group of a computer project (a college that is going to…

  15. Communicative Language Testing: Implications for Computer Based Language Testing in French for Specific Purposes

    Science.gov (United States)

    García Laborda, Jesús; López Santiago, Mercedes; Otero de Juan, Nuria; Álvarez Álvarez, Alfredo

    2014-01-01

    Current developments in language testing have led to the integration of computers in FSP assessments, in both oral and written communicative tasks. This paper deals with two main issues: learners' expectations about the types of questions in FSP computer-based assessments, and their relation to the learners' own experience. This paper describes the experience of 23…

  16. A Functional Specification for a Programming Language for Computer Aided Learning Applications.

    Science.gov (United States)

    National Research Council of Canada, Ottawa (Ontario).

    In 1972 there were at least six different course authoring languages in use in Canada with little exchange of course materials between Computer Assisted Learning (CAL) centers. In order to improve facilities for producing "transportable" computer based course materials, a working panel undertook the definition of functional requirements of a user…

  17. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanics analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of ductile failure behaviour of cracked structures. (author)

  18. Software Development Processes Applied to Computational Icing Simulation

    Science.gov (United States)

    Levinson, Laurie H.; Potapczuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  19. Development of a computational model for astronaut reorientation.

    Science.gov (United States)

    Stirling, Leia; Willcox, Karen; Newman, Dava

    2010-08-26

    The ability to model astronaut reorientations computationally provides a simple way to develop and study human motion control strategies. Since the cost of experimenting in microgravity is high, and underwater training can lead to motions inappropriate for microgravity, these techniques allow for motions to be developed and well-understood prior to any microgravity exposure. By including a model of the current space suit, we have the ability to study both intravehicular and extravehicular activities. We present several techniques for rotating about the axes of the body and show that motions performed by the legs create a greater net rotation than those performed by the arms. Adding a space suit to the motions was seen to increase the resistance torque and limit the available range of motion. While rotations about the body axes can be performed in the current space suit, the resulting motions generated a reduced rotation when compared to the unsuited configuration. © 2010 Elsevier Ltd. All rights reserved.

  20. Recent Developments in Computed Tomography for Urolithiasis: Diagnosis and Characterization

    Directory of Open Access Journals (Sweden)

    P. D. McLaughlin

    2012-01-01

    Full Text Available Objective. To critically evaluate the current literature in an effort to establish the current role of radiologic imaging, advances in computed tomography (CT), and standard film radiography in the diagnosis and characterization of urinary tract calculi. Conclusion. CT has a valuable role when utilized prudently during surveillance of patients following endourological therapy. In this paper, we outline the basic principles relating to the effects of exposure to ionizing radiation as a result of CT scanning. We discuss the current developments in low-dose CT technology, which have resulted in significant reductions in CT radiation doses (to approximately one-third of what they were a decade ago) while preserving image quality. Finally, we discuss an important recent development now commercially available on the latest generation of CT scanners, namely, dual energy imaging, which is showing promise in urinary tract imaging as a means of characterizing the composition of urinary tract calculi.

  1. Development and numerical analysis of low specific speed mixed-flow pump

    International Nuclear Information System (INIS)

    Li, H F; Huo, Y W; Pan, Z B; Zhou, W C; He, M H

    2012-01-01

    With the development of cities, the market for mixed flow pumps with large flow rate and high head is promising. The KSB Shanghai Pump Co., Ltd. decided to develop a low specific speed mixed flow pump to meet the market requirements. Based on centrifugal pump and axial flow pump models, and aiming at the characteristics of large flow rate and high head, a new type of guide vane mixed flow pump was designed. The computational fluid dynamics method was adopted to analyze the internal flow of the new model and predict its performance. The time-averaged Navier-Stokes equations were closed by the SST k-ω turbulence model to suit the internal flow of guide vanes with larger curvatures. The multi-reference frame (MRF) method was used to deal with the coupling of the rotating impeller and the static guide vane, and the SIMPLEC method was adopted to achieve the coupled solution of velocity and pressure. The computational results show that there is a strong flow impact on the leading edge of the vanes and considerable flow separation at the trailing edge of the guide vanes at different working conditions, both of which affect the performance of the pump. Based on the computational results, optimizations were carried out to reduce the impact on the vanes and the flow separation at the trailing edge of the guide vanes. The optimized model was simulated and its performance predicted. The computational results show that the impact on the vanes and the separation at the trailing edge of the guide vanes disappeared. The high-efficiency region of the optimized pump is wide, and it fits the original design target. The newly designed mixed flow pump is now being modeled, and its experimental performance will be obtained soon.

  2. Development and numerical analysis of low specific speed mixed-flow pump

    Science.gov (United States)

    Li, H. F.; Huo, Y. W.; Pan, Z. B.; Zhou, W. C.; He, M. H.

    2012-11-01

    With the development of cities, the market for mixed flow pumps with large flow rate and high head is promising. The KSB Shanghai Pump Co., Ltd. decided to develop a low specific speed mixed flow pump to meet the market requirements. Based on centrifugal pump and axial flow pump models, and aiming at the characteristics of large flow rate and high head, a new type of guide vane mixed flow pump was designed. The computational fluid dynamics method was adopted to analyze the internal flow of the new model and predict its performance. The time-averaged Navier-Stokes equations were closed by the SST k-ω turbulence model to suit the internal flow of guide vanes with larger curvatures. The multi-reference frame (MRF) method was used to deal with the coupling of the rotating impeller and the static guide vane, and the SIMPLEC method was adopted to achieve the coupled solution of velocity and pressure. The computational results show that there is a strong flow impact on the leading edge of the vanes and considerable flow separation at the trailing edge of the guide vanes at different working conditions, both of which affect the performance of the pump. Based on the computational results, optimizations were carried out to reduce the impact on the vanes and the flow separation at the trailing edge of the guide vanes. The optimized model was simulated and its performance predicted. The computational results show that the impact on the vanes and the separation at the trailing edge of the guide vanes disappeared. The high-efficiency region of the optimized pump is wide, and it fits the original design target. The newly designed mixed flow pump is now being modeled, and its experimental performance will be obtained soon.

  3. Computational assessment of effective dose and patient specific doses for kilovoltage stereotactic radiosurgery of wet age-related macular degeneration

    Science.gov (United States)

    Hanlon, Justin Mitchell

    Age-related macular degeneration (AMD) is a leading cause of vision loss and a major health problem for people over the age of 50 in industrialized nations. The current standard of care, ranibizumab, is used to help slow and in some cases stabilize the process of AMD, but requires frequent invasive injections into the eye. Interest continues for stereotactic radiosurgery (SRS), an option that provides a non-invasive treatment for the wet form of AMD, through the development of the IRay(TM) (Oraya Therapeutics, Inc., Newark, CA). The goal of this modality is to destroy choroidal neovascularization beneath the pigment epithelium via delivery of three 100 kVp photon beams entering through the sclera and overlapping on the macula, delivering up to 24 Gy of therapeutic dose over a span of approximately 5 minutes. The divergent x-ray beams targeting the fovea are robotically positioned, and the eye is gently immobilized by a suction-enabled contact lens. Device development requires assessment of patient effective dose, reference patient mean absorbed doses to radiosensitive tissues, and patient-specific doses to the lens and optic nerve. A series of head phantoms, including both reference and patient-specific phantoms, was derived from CT data and employed in conjunction with the MCNPX 2.5.0 radiation transport code to simulate treatment and evaluate absorbed doses to potential tissues-at-risk. The reference phantoms were used to evaluate effective dose and mean absorbed doses to several radiosensitive tissues. The optic nerve was modeled with changeable positions based on individual patient variability seen in a review of the head CT scans gathered. Patient-specific phantoms were used to determine the effect of varying anatomy and gaze. The results showed that absorbed doses to the non-targeted tissues were below the threshold levels for serious complications; specifically, the development of radiogenic cataracts and radiation-induced optic neuropathy (RON). The effective dose

  4. China Refrigerator Information Label: Specification Development and Potential Impact

    Energy Technology Data Exchange (ETDEWEB)

    Fridley, David; Fridley, David; Zheng, Nina; Zhou, Nan; Aden, Nathaniel; Lin, Jiang; Jianhong, Cheng; Sakamoto, Tomoyuki

    2008-02-01

    In the last five years, China's refrigerator market has grown rapidly, and now urban markets are showing signs of saturation, with ownership rates in urban households reaching 92%. Rural markets continue to grow from a much lower base. As a result of this growth, the Chinese government in 2006 decided to revise the refrigerator standards and its associated efficiency grades for the mandatory energy information label. In the Chinese standards process, the efficiency grades for the information label are tied to the minimum standards. Work on the minimum standards revision began in 2006 and continued through the first half of 2007, when the draft standard was completed under the direction of the China National Institute of Standardization (CNIS). Development of the information label grades required consideration of stakeholder input, continuity with the previous grade classification, ease of implementation, and potential impacts on the market. In this process, CLASP, with the support of METI/IEEJ, collaborated with CNIS to develop the efficiency grades, providing technical input to the process, comment and advice on particular technical issues, and evaluation of the results. After three months of effort and three drafts of the final grade specifications, this work was completed. In addition, in order to effectively evaluate the impact of the label on China's market, CLASP further provided assistance to CNIS to collect data on both the efficiency distribution and product volume distribution of refrigerators on the market. The new information label thresholds to be implemented in 2008 maintain the approach first adopted in 2005 of establishing efficiency levels relative to the minimum standard, but increased the related required efficiency levels by 20% over those established in 2003 and implemented in 2005. The focus of improvement was on the standard refrigerator/freezer (class 5), which constitutes the bulk of the Chinese market. Indeed, the new

  5. Development of algorithm for continuous generation of a computer game in terms of usability and optimization of developed code in computer science

    Directory of Open Access Journals (Sweden)

    Tibor Skala

    2018-03-01

    Full Text Available As both hardware and software have become increasingly available and are constantly developed, they contribute globally to improvements in every field of technology and the arts. Digital tools for the creation and processing of graphical content are highly developed and have been designed to shorten the time required for content creation, in this case animation. Since contemporary animation has experienced a surge in various visual styles and visualization methods, programming is built into everything that is currently in use. There is no doubt that a variety of algorithms and software are the brain and moving force behind any idea created for a specific purpose and applicability in society. Art and technology combined make a direct and oriented medium for publishing and marketing in every industry, including those that do not rely heavily on the visual aspect of work. Additionally, the quality and consistency of an algorithm depend on its proper integration into the system it powers, as well as on the way the algorithm is designed. The development of an endless algorithm and its effective use are shown through the use of a computer game. In order to present the effect of various parameters, in the final phase of the computer game's development the endless algorithm was tested with a varying number of key input parameters (achieved time, score reached, pace of the game).
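
    A minimal sketch of such a continuous generation loop (Python; all names and difficulty weights are illustrative assumptions, not the article's code) shows how the named input parameters can steer endless content:

        import random

        def difficulty(elapsed_s, score, pace):
            """Map the key input parameters to a 0..1 difficulty level."""
            return min(1.0, 0.1 + 0.01 * elapsed_s + 0.001 * score + 0.2 * pace)

        def next_segment(elapsed_s, score, pace, rng=random.Random(7)):
            """Generate the next world segment on demand; harder as the run progresses."""
            d = difficulty(elapsed_s, score, pace)
            return {
                "length": rng.randint(8, 16),
                "obstacles": rng.randint(0, int(10 * d)),  # denser obstacles as d grows
                "gap_chance": 0.1 + 0.4 * d,               # riskier jumps later in the run
            }

        # The game keeps appending segments as the player nears the end of the
        # generated world, so play never terminates.
        print(next_segment(elapsed_s=30.0, score=450, pace=0.5))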

  6. Functional requirement specification in the packaging development chain

    NARCIS (Netherlands)

    Lutters, Diederick; ten Klooster, Roland

    2008-01-01

    As it is clear that the full packaging life cycle – at least partially – coincides with the product life cycle, both cycles are interwoven. Each has a network of functional requirements, with specific hierarchic propensities. These networks overlap, with prevailing hierarchies playing important

  7. Isolation of developing secondary xylem specific cellulose synthase ...

    Indian Academy of Sciences (India)

    Total RNA was treated with RNase-free DNase I (Fermentas, USA) ... Twenty-five μL of qRT-PCR reaction included 11.5 μL Milli-Q water, 200 ng of cDNA, ... grey shading, class-specific region (CSR I and II), plant conserved region (CRP) and.

  8. Developing French for Specific Purposes in the Nigerian University ...

    African Journals Online (AJOL)

    Ada

    Following the changing role of the French language from a vehicle of French culture to a vehicle of science, technology, commerce and diplomacy, the teaching of French as a foreign language (FFL) around the world has shifted emphasis from literary studies to what has come to be known as French for Specific Purposes ...

  9. Recent development of fluorescent imaging for specific detection of tumors

    International Nuclear Information System (INIS)

    Nakata, Eiji; Morii, Takashi; Uto, Yoshihiro; Hori, Hitoshi

    2011-01-01

    Recent studies on fluorescent imaging for the specific detection of tumors are described here in terms of three strategies: molecular targeting, metabolic specificity and the hypoxic environment. One instance described is a conjugate of an antibody and a pH-activatable fluorescent ligand, which specifically binds to tumor cells, is internalized in the cellular lysosomes where the pH is low, and is then activated to become fluorescent only in viable tumor cells. As a case of metabolic specificity, excessive loading of the precursor (5-aminolevulinic acid) of protoporphyrin IX (ppIX) makes tumors observable in red, owing to their low activity in converting ppIX to heme B: ppIX emits red fluorescence (585 nm) when excited by blue light at 410 nm. Similarly, imaging with indocyanine green, which accumulates in hepatoma cells, has been reported to successfully detect small lesions and metastases when the dye is administered during operation. Reductive reactions predominate under tumor hypoxic conditions, a feature that is usable for imaging; conjugates of nitroimidazole and fluorescent dyes have been reported to successfully image tumors via nitro reduction. The authors' UTX-12 is a non-fluorescent nitroaromatic derivative of the pH-sensitive fluorescent dye seminaphtharhodafluor (SNARF), designed so that the nitro group, the hypoxia-responding sensor, is reduced under tumor hypoxic conditions and the aromatic moiety is then cleaved to release free SNARF. Use of hypoxia-inducible factor-1 (HIF-1) for imaging has also been widely reported. Overall, studies on fluorescent imaging for specific detection of tumors are mostly at the fundamental stage, but their future is conceivably promising, along with advances in other technologies such as fluorescent endoscopy and multimodal imaging. (author)

  10. BCILAB: a platform for brain-computer interface development

    Science.gov (United States)

    Kothe, Christian Andreas; Makeig, Scott

    2013-10-01

    Objective. The past two decades have seen dramatic progress in our ability to model brain signals recorded by electroencephalography, functional near-infrared spectroscopy, etc., and to derive real-time estimates of user cognitive state, response, or intent for a variety of purposes: to restore communication by the severely disabled, to effect brain-actuated control and, more recently, to augment human-computer interaction. Continuing these advances, largely achieved through increases in computational power and methods, requires software tools to streamline the creation, testing, evaluation and deployment of new data analysis methods. Approach. Here we present BCILAB, an open-source MATLAB-based toolbox built to address the need for the development and testing of brain-computer interface (BCI) methods by providing an organized collection of over 100 pre-implemented methods and method variants, an easily extensible framework for the rapid prototyping of new methods, and a highly automated framework for systematic testing and evaluation of new implementations. Main results. To validate and illustrate the use of the framework, we present two sample analyses of publicly available data sets from recent BCI competitions and from a rapid serial visual presentation task. We demonstrate the straightforward use of BCILAB to obtain results compatible with the current BCI literature. Significance. The aim of the BCILAB toolbox is to provide the BCI community a powerful toolkit for methods research and evaluation, thereby helping to accelerate the pace of innovation in the field, while complementing the existing spectrum of tools for real-time BCI experimentation, deployment and use.

  11. Development of superconductor electronics technology for high-end computing

    Energy Technology Data Exchange (ETDEWEB)

    Silver, A [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Kleinsasser, A [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Kerber, G [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Herr, Q [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Dorojevets, M [Department of Electrical and Computer Engineering, SUNY-Stony Brook, NY 11794-2350 (United States); Bunyk, P [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Abelson, L [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States)

    2003-12-01

    This paper describes our programme to develop and demonstrate ultra-high performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to an 8 kA cm⁻², 1.25 µm junction process, incorporated new CAD tools into our methodology, and demonstrated methods for recycling the bias current and data communication at speeds up to 60 Gb s⁻¹, both on and between chips through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date: a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high data rate cryogenic-to-ambient input/output technology, and improving Nb VLSI to increase gate density.

  13. Patient-specific lean body mass can be estimated from limited-coverage computed tomography images.

    Science.gov (United States)

    Devriese, Joke; Beels, Laurence; Maes, Alex; van de Wiele, Christophe; Pottel, Hans

    2018-06-01

    In PET/CT, quantitative evaluation of tumour metabolic activity is possible through standardized uptake values, usually normalized for body weight (BW) or lean body mass (LBM). Patient-specific LBM can be estimated from whole-body (WB) CT images. As most clinical indications only warrant PET/CT examinations covering head to midthigh, the aim of this study was to develop a simple and reliable method to estimate LBM from limited-coverage (LC) CT images and test its validity. Head-to-toe PET/CT examinations were retrospectively retrieved and semiautomatically segmented into tissue types based on thresholding of CT Hounsfield units. LC was obtained by omitting image slices. Image segmentation was validated on the WB CT examinations by comparing CT-estimated BW with actual BW, and LBM estimated from LC images were compared with LBM estimated from WB images. A direct method and an indirect method were developed and validated on an independent data set. Comparing LBM estimated from LC examinations with estimates from WB examinations (LBMWB) showed a significant but limited bias of 1.2 kg (direct method) and nonsignificant bias of 0.05 kg (indirect method). This study demonstrates that LBM can be estimated from LC CT images with no significant difference from LBMWB.
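
    The segmentation step described here, thresholding CT Hounsfield units into tissue types and summing tissue masses, can be sketched as follows. This is a minimal illustration assuming textbook HU windows and densities, not the calibrated values validated in the study.

    ```python
    import numpy as np

    # Illustrative HU windows and densities (assumptions, not the paper's values)
    HU_RANGES = {"adipose": (-190, -30), "lean": (-29, 150), "bone": (151, 2000)}
    DENSITY = {"adipose": 0.95, "lean": 1.05, "bone": 1.92}  # g/cm^3, textbook values

    def estimate_masses(ct_volume_hu, voxel_volume_cm3):
        """Classify voxels by HU range and return tissue masses in kg."""
        masses = {}
        for tissue, (lo, hi) in HU_RANGES.items():
            n_voxels = np.count_nonzero((ct_volume_hu >= lo) & (ct_volume_hu <= hi))
            masses[tissue] = n_voxels * voxel_volume_cm3 * DENSITY[tissue] / 1000.0
        return masses

    # LBM ~ non-adipose soft tissue plus bone, here on a synthetic test volume
    ct = np.random.randint(-1000, 1500, size=(50, 512, 512), dtype=np.int16)
    m = estimate_masses(ct, voxel_volume_cm3=0.001)
    print(m, m["lean"] + m["bone"])
    ```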

  14. Development Of The Computer Code For Comparative Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Purwadi, Mohammad Dhandhang

    2001-01-01

    Qualitative and quantitative chemical analysis with Neutron Activation Analysis (NAA) is an important utilization of a nuclear research reactor, and its application and development should be accelerated and promoted to raise the utilization of the reactor. The application of the comparative NAA technique in the GA Siwabessy Multi Purpose Reactor (RSG-GAS) requires special (not yet commercially available) software for analyzing the spectra of multiple elements in a single analysis. Previously, the analysis was carried out using single-spectrum analyzer software, with each result compared manually; this method significantly degrades the quality of the analysis. To solve the problem, a computer code was designed and developed for comparative NAA. Spectrum analysis in the code is carried out using a non-linear fitting method. Before a spectrum is analyzed, it is passed through a numerical filter, which improves the signal-to-noise ratio, before the deconvolution operation. The software was developed using the G language and named PASAN-K. The developed software was benchmarked against the IAEA spectra and operated well, with less than 10% deviation.
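
    The comparative (relative) NAA principle that such a code implements can be shown with a worked sketch: the element concentration in a sample follows from the ratio of decay-corrected specific peak activities of sample and standard. A minimal Python sketch, not PASAN-K itself; all numbers are arbitrary illustrations.

    ```python
    import math

    def comparator_concentration(a_sample, m_sample, a_std, m_std, c_std,
                                 half_life_s, dt_sample_s, dt_std_s):
        """Comparative NAA: concentration in the sample from the ratio of
        decay-corrected specific peak activities, sample vs. standard."""
        lam = math.log(2) / half_life_s
        a_sample0 = a_sample * math.exp(lam * dt_sample_s)  # correct to end of irradiation
        a_std0 = a_std * math.exp(lam * dt_std_s)
        return c_std * (a_sample0 / m_sample) / (a_std0 / m_std)

    # Example with arbitrary peak areas, masses (g), and a 15 h half-life
    print(comparator_concentration(1.2e4, 0.10, 2.5e4, 0.10, c_std=50.0,
                                   half_life_s=15 * 3600,
                                   dt_sample_s=3600, dt_std_s=7200))
    ```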

  15. The Development of University Computing in Sweden 1965-1985

    Science.gov (United States)

    Dahlstrand, Ingemar

    In 1965-70 the government agency, Statskontoret, set up five university computing centers, as service bureaux financed by grants earmarked for computer use. The centers were well equipped and staffed and caused a surge in computer use. When the yearly flow of grant money stagnated at 25 million Swedish crowns, the centers had to find external income to survive and acquire time-sharing. But the charging system led to the computers not being fully used. The computer scientists lacked equipment for laboratory use. The centers were decentralized and the earmarking abolished. Eventually they got new tasks like running computers owned by the departments, and serving the university administration.

  16. Hydrochemical characterization and water quality monitoring by means of specific computer systems

    International Nuclear Information System (INIS)

    Alvarez, E.; Fagundo, J.R.

    1998-01-01

    The computer systems SAPHIQ and SIMUCIN, developed for processing hydrochemical data with the aim of characterizing and controlling water quality, as well as simulating water-rock interaction processes, are presented. (author)

  17. Development of computer-aided diagnosis systems in radiology

    International Nuclear Information System (INIS)

    Higashida, Yoshiharu; Arimura, Hidetaka; Kumazawa, Seiji; Morishita, Junji; Sakai, Shuji

    2006-01-01

    Computer-aided diagnosis (CAD) is the practice in which medical doctors use computer-based image analysis as a second opinion, and CAD studies have been adopted as government projects. CAD is already in common use for cancers of the breast (mammography), the lung (plain radiographs and CT images), and the large bowel (CT colonography). This paper describes four examples of the authors' actual CAD investigations. First, temporal subtraction image analysis is used for detecting abnormalities in chest radiographs taken at different times; examples are shown for cases of interstitial pneumonia and lung cancer out of 34 patients with diffuse lung diseases. Second, the development of a CAD system for the detection of aneurysms in brain MR angiography (MRA) is described. Third is the CAD detection of fascicles in the cerebral white matter by diffusion tensor MRI, which will aid surgery for brain tumors. Final is an automated patient recognition scheme based on an image-matching technique using previous chest radiographs in picture archiving and communication systems, in which the radiograph serves as a biological fingerprint of the patient. CAD will be applied in wider fields of medical care, not only in imaging technology. (T.I)

  18. A Brain–Computer Interface for Potential Nonverbal Facial Communication Based on EEG Signals Related to Specific Emotions

    Directory of Open Access Journals (Sweden)

    Koji eKashihara

    2014-08-01

    Full Text Available Unlike assistive technology for verbal communication, the brain–machine or brain–computer interface (BMI/BCI) has not been established as a nonverbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, therefore, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based nonverbal communication tools. To these ends, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified cortical activities. The conditioned neutral face-triggered event-related potentials that originated from the posterior temporal lobe statistically significantly changed during late face processing (600–700 ms after stimulus), rather than in early face processing activities, such as P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus. This study also developed an efficient method for detecting implicit negative emotional responses to specific faces by using EEG signals.

  19. Unilateral hearing during development: hemispheric specificity in plastic reorganizations

    Directory of Open Access Journals (Sweden)

    Andrej eKral

    2013-11-01

    Full Text Available The present study investigates the hemispheric contributions of neuronal reorganization following early single-sided hearing (unilateral deafness). The experiments were performed on ten cats from our colony of deaf white cats. Two were identified in early hearing screening as unilaterally congenitally deaf. The remaining eight were bilaterally congenitally deaf, unilaterally implanted at different ages with a cochlear implant. Implanted animals were chronically stimulated using a single-channel portable signal processor for two to five months. Microelectrode recordings were performed at the primary auditory cortex under stimulation at the hearing and deaf ear with bilateral cochlear implants. Local field potentials (LFPs) were compared at the cortex ipsilateral and contralateral to the hearing ear. The focus of the study was on the morphology and the onset latency of the LFPs. The data revealed that effects of hearing experience were more pronounced when stimulating the hearing ear. With respect to morphology of LFPs, pronounced hemisphere-specific effects were observed. Morphology of amplitude-normalized LFPs for stimulation of the deaf and the hearing ear was similar for responses recorded at the same hemisphere. However, when comparisons were performed between the hemispheres, the morphology was more dissimilar even though the same ear was stimulated. This demonstrates hemispheric specificity of some cortical adaptations irrespective of the ear stimulated. The results suggest a specific adaptation process at the hemisphere ipsilateral to the hearing ear, involving specific (down-regulated inhibitory) mechanisms not found in the contralateral hemisphere. Finally, onset latencies revealed that the sensitive period for the cortex ipsilateral to the hearing ear is shorter than that for the contralateral cortex. Unilateral hearing experience leads to a functionally-asymmetric brain with different neuronal reorganizations and different sensitive periods involved.

  20. Unilateral hearing during development: hemispheric specificity in plastic reorganizations.

    Science.gov (United States)

    Kral, Andrej; Heid, Silvia; Hubka, Peter; Tillein, Jochen

    2013-01-01

    The present study investigates the hemispheric contributions of neuronal reorganization following early single-sided hearing (unilateral deafness). The experiments were performed on ten cats from our colony of deaf white cats. Two were identified in early hearing screening as unilaterally congenitally deaf. The remaining eight were bilaterally congenitally deaf, unilaterally implanted at different ages with a cochlear implant. Implanted animals were chronically stimulated using a single-channel portable signal processor for two to five months. Microelectrode recordings were performed at the primary auditory cortex under stimulation at the hearing and deaf ear with bilateral cochlear implants. Local field potentials (LFPs) were compared at the cortex ipsilateral and contralateral to the hearing ear. The focus of the study was on the morphology and the onset latency of the LFPs. With respect to morphology of LFPs, pronounced hemisphere-specific effects were observed. Morphology of amplitude-normalized LFPs for stimulation of the deaf and the hearing ear was similar for responses recorded at the same hemisphere. However, when comparisons were performed between the hemispheres, the morphology was more dissimilar even though the same ear was stimulated. This demonstrates hemispheric specificity of some cortical adaptations irrespective of the ear stimulated. The results suggest a specific adaptation process at the hemisphere ipsilateral to the hearing ear, involving specific (down-regulated inhibitory) mechanisms not found in the contralateral hemisphere. Finally, onset latencies revealed that the sensitive period for the cortex ipsilateral to the hearing ear is shorter than that for the contralateral cortex. Unilateral hearing experience leads to a functionally-asymmetric brain with different neuronal reorganizations and different sensitive periods involved.

  1. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Science.gov (United States)

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with an amazing reduction in the size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, store information for ≥30 frames, and provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict the 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for understanding the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system, whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible period of time
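
    The per-spermatozoon kinematic values a CASA system reports can be illustrated from first principles. Below is a minimal sketch of three standard parameters (VCL, VSL, LIN) computed from one tracked position sequence; the synthetic track and frame rate are assumptions for illustration, not any vendor's algorithm.

    ```python
    import numpy as np

    def motility_metrics(track_xy_um, frame_rate_hz):
        """Standard CASA kinematics from one sperm track (N x 2 positions, um):
        VCL = curvilinear velocity along the sampled path,
        VSL = straight-line velocity from first to last point,
        LIN = linearity, VSL/VCL."""
        track = np.asarray(track_xy_um, dtype=float)
        dt = (len(track) - 1) / frame_rate_hz
        path_len = np.sum(np.linalg.norm(np.diff(track, axis=0), axis=1))
        straight = np.linalg.norm(track[-1] - track[0])
        vcl, vsl = path_len / dt, straight / dt
        return {"VCL": vcl, "VSL": vsl, "LIN": vsl / vcl if vcl else 0.0}

    # Example: a gently oscillating 30-frame track imaged at 60 frames per second
    t = np.linspace(0, 1, 30)
    track = np.column_stack([50 * t, 5 * np.sin(8 * t)])
    print(motility_metrics(track, frame_rate_hz=60))
    ```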

  2. Brain systems for probabilistic and dynamic prediction: computational specificity and integration.

    Directory of Open Access Journals (Sweden)

    Jill X O'Reilly

    2013-09-01

    Full Text Available A computational approach to functional specialization suggests that brain systems can be characterized in terms of the types of computations they perform, rather than their sensory or behavioral domains. We contrasted the neural systems associated with two computationally distinct forms of predictive model: a reinforcement-learning model of the environment obtained through experience with discrete events, and continuous dynamic forward modeling. By manipulating the precision with which each type of prediction could be used, we caused participants to shift computational strategies within a single spatial prediction task. Hence (using fMRI) we showed that activity in two brain systems (typically associated with reward learning and motor control) could be dissociated in terms of the forms of computations that were performed there, even when both systems were used to make parallel predictions of the same event. A region in parietal cortex, which was sensitive to the divergence between the predictions of the models and anatomically connected to both computational networks, is proposed to mediate integration of the two predictive modes to produce a single behavioral output.
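
    The two computations contrasted in the study can be caricatured in a few lines: a delta-rule estimate learned from discrete events versus a forward model extrapolating continuous motion. A toy sketch only, with arbitrary parameters, not the authors' task models.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Probabilistic prediction: a delta-rule (reinforcement-learning style)
    # estimate of where a discrete event tends to occur, learned from experience.
    def rl_update(estimate, observation, learning_rate=0.2):
        return estimate + learning_rate * (observation - estimate)

    # Dynamic prediction: a forward model extrapolating continuous motion.
    def forward_model(position, velocity, dt=0.1):
        return position + velocity * dt

    estimate = 0.0
    for _ in range(50):                      # discrete events scattered around 1.0
        estimate = rl_update(estimate, rng.normal(1.0, 0.3))
    print(estimate)                          # converges toward the event statistics
    print(forward_model(position=1.0, velocity=0.5))  # one-step extrapolation
    ```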

  3. Age- and sex-specific thorax finite element model development and simulation.

    Science.gov (United States)

    Schoell, Samantha L; Weaver, Ashley A; Vavalle, Nicholas A; Stitzel, Joel D

    2015-01-01

    The shape, size, bone density, and cortical thickness of the thoracic skeleton vary significantly with age and sex, which can affect injury tolerance, especially in at-risk populations such as the elderly. Computational modeling has emerged as a powerful and versatile tool to assess injury risk. However, current computational models only represent certain ages and sexes in the population. The purpose of this study was to morph an existing finite element (FE) model of the thorax to depict thorax morphology for males and females of ages 30 and 70 years old (YO) and to investigate the effect on injury risk. Age- and sex-specific FE models were developed using thin-plate spline interpolation, which requires homologous landmarks on the reference, target, and FE model. An image segmentation and registration algorithm was used to collect homologous rib and sternum landmark data from males and females aged 0-100 years. Generalized Procrustes Analysis was applied to the homologous landmark data to quantify age- and sex-specific isolated shape changes in the thorax. The Global Human Body Models Consortium (GHBMC) 50th percentile male occupant model was morphed to create age- and sex-specific thoracic shape change models (scaled to a 50th percentile male size). To evaluate the thoracic response, two loading cases (frontal hub impact and lateral impact) were simulated to assess the importance of geometric and material property changes with age and sex. Due to the geometric and material property changes with age and sex, differences were observed in the response of the thorax in both the frontal and lateral impacts. Material property changes alone had little to no effect on the maximum thoracic force or the maximum percent compression. With age, the thorax becomes stiffer due to superior rotation of the ribs, which can result in increased bone strain and thus an increased risk of fracture. For the 70-YO models
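
    The landmark-alignment step named here, Procrustes analysis of homologous landmarks, can be illustrated with SciPy's ordinary Procrustes routine; the study applied Generalized Procrustes Analysis across all subjects, which iterates this pairwise alignment toward a mean shape. The toy landmark sets below are assumptions standing in for segmented rib and sternum landmarks.

    ```python
    import numpy as np
    from scipy.spatial import procrustes

    # Two toy homologous landmark sets (rows = landmarks, columns = x, y)
    reference = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
    rotation = np.array([[0.96, -0.28], [0.28, 0.96]])   # a pure 2-D rotation
    subject = 1.3 * reference @ rotation + 0.5           # scaled, rotated, shifted

    # Ordinary Procrustes removes translation, scale, and rotation, leaving
    # only true shape difference between the two landmark configurations.
    mtx_ref, mtx_subj, disparity = procrustes(reference, subject)
    print(f"residual shape disparity: {disparity:.3e}")  # ~0 for a similarity transform
    ```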

  4. Thermodynamic Molecular Switch in Sequence-Specific Hydrophobic Interaction: Two Computational Models Compared

    Directory of Open Access Journals (Sweden)

    Paul Chun

    2003-01-01

    Full Text Available We have shown in our published work the existence of a thermodynamic switch in biological systems wherein a change of sign in ΔCp°(T) of reaction leads to a true negative minimum in the Gibbs free energy change of reaction and, hence, a maximum in the related Keq. We have examined 35 pair-wise, sequence-specific hydrophobic interactions over the temperature range of 273–333 K, based on data reported by Nemethy and Scheraga in 1962. A closer look at a single example, the pair-wise hydrophobic interaction of leucine-isoleucine, demonstrates the significant differences when the data are analyzed using the Nemethy-Scheraga model or treated by the Planck-Benzinger methodology which we have developed. The change in inherent chemical bond energy at 0 K, ΔH°(T0), is 7.53 kcal mol⁻¹ compared with 2.4 kcal mol⁻¹, while ⟨Ts⟩ is 365 K as compared with 355 K, for the Nemethy-Scheraga and Planck-Benzinger models, respectively. At ⟨Tm⟩, the thermal agitation energy is about five times greater than ΔH°(T0); ⟨Tm⟩ is 465 K in the Planck-Benzinger model compared to 497 K in the Nemethy-Scheraga model. The results imply that the negative Gibbs free energy minimum at a well-defined ⟨Ts⟩, where TΔS° = 0 at about 355 K, has its origin in the sequence-specific hydrophobic interactions, which are highly dependent on details of molecular structure. The Nemethy-Scheraga model shows no evidence of the thermodynamic molecular switch that we have found to be a universal feature of biological interactions. The Planck-Benzinger method is the best known for evaluating the innate temperature-invariant enthalpy, ΔH°(T0), and provides for better understanding of the heat of reaction of biological molecules.
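
    The thermodynamic switch described, a true minimum in ΔG°(T) arising from a nonzero ΔCp°, can be reproduced with the standard constant-ΔCp expressions ΔH°(T) = ΔH°(T_ref) + ΔCp°(T − T_ref) and ΔS°(T) = ΔS°(T_ref) + ΔCp° ln(T/T_ref). A numerical sketch with illustrative parameters, not the paper's fitted values.

    ```python
    import numpy as np

    def delta_g(T, T_ref, dH_ref, dS_ref, dCp):
        """Gibbs free energy change with a constant heat-capacity change."""
        dH = dH_ref + dCp * (T - T_ref)
        dS = dS_ref + dCp * np.log(T / T_ref)
        return dH - T * dS

    T = np.linspace(273, 373, 1001)
    # Illustrative parameters only (kcal/mol and kcal/mol/K), not the paper's fits
    g = delta_g(T, T_ref=298.0, dH_ref=-1.0, dS_ref=0.01, dCp=-0.1)
    ts = T[np.argmin(g)]   # at <Ts>, dS(T) = 0 and dG reaches its true minimum
    print(f"<Ts> ~ {ts:.1f} K, dG(<Ts>) = {g.min():.2f} kcal/mol")
    ```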

  5. Implications of Bilingual Development for Specific Language Impairments in Turkey

    Science.gov (United States)

    Topbas, Seyhun

    2011-01-01

    The potential impact of bilingualism on children's language development has emerged as a crucial concern for Turkey, but so far it has not been addressed from the point of view of language disorders. This short review examines the potential impact of bilingual language development for language impairments in Turkey, with special emphasis on the…

  6. Development of Purification Protocol Specific for Bacteriocin 105B

    Science.gov (United States)

    2017-02-09

    Bacillus anthracis. As the current application of broad-spectrum antimicrobials promotes the development of multi-drug resistant microorganisms ... through the purification procedure. The widespread use of broad-spectrum antimicrobial agents has led to the development of drug-resistant

  7. Application and development of Industrial Computed Tomography in China

    International Nuclear Information System (INIS)

    Kong Fangeng; Xian Wu

    1996-01-01

    Compared with traditional projection radiography, ICT (Industrial Computed Tomography) is able to acquire tomographic images without the image overlapping and blurring inherent in traditional projection radiography. By acquiring 2D tomographic images of the object at as many positions as needed, it is possible to obtain a 3D tomographic image. In China, the first γ-ray ICT equipment was completed at Chongqing University in May 1993. This equipment uses ⁶⁰Co radiation sources of 1 Ci and 30 Ci; its spatial resolution is about 0.5 mm, its density resolution about 0.5%, and the diameter of the test object can be up to 300 mm, while the price of the Chinese ICT equipment is only about half that of the same type of ICT equipment produced outside China. Besides γ-ray ICT, Chinese researchers are developing X-ray ICT to meet foreign and domestic needs. (author)

  8. A computational model predicting disruption of blood vessel development.

    Directory of Open Access Journals (Sweden)

    Nicole Kleinstreuer

    2013-04-01

    Full Text Available Vascular development is a complex process regulated by dynamic biological networks that vary in topology and state across different tissues and developmental stages. Signals regulating de novo blood vessel formation (vasculogenesis) and remodeling (angiogenesis) come from a variety of biological pathways linked to endothelial cell (EC) behavior, extracellular matrix (ECM) remodeling and the local generation of chemokines and growth factors. Simulating these interactions at a systems level requires sufficient biological detail about the relevant molecular pathways and associated cellular behaviors, and tractable computational models that offset mathematical and biological complexity. Here, we describe a novel multicellular agent-based model of vasculogenesis using the CompuCell3D (http://www.compucell3d.org/) modeling environment supplemented with semi-automatic knowledgebase creation. The model incorporates vascular endothelial growth factor signals, pro- and anti-angiogenic inflammatory chemokine signals, and the plasminogen activating system of enzymes and proteases linked to ECM interactions, to simulate nascent EC organization, growth and remodeling. The model was shown to recapitulate stereotypical capillary plexus formation and structural emergence of non-coded cellular behaviors, such as a heterologous bridging phenomenon linking endothelial tip cells together during formation of polygonal endothelial cords. Molecular targets in the computational model were mapped to signatures of vascular disruption derived from in vitro chemical profiling using the EPA's ToxCast high-throughput screening (HTS) dataset. Simulating the HTS data with the cell-agent based model of vascular development predicted adverse effects of a reference anti-angiogenic thalidomide analog, 5HPP-33, on in vitro angiogenesis with respect to both concentration-response and morphological consequences. These findings support the utility of cell agent-based models for simulating a

  9. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    Energy Technology Data Exchange (ETDEWEB)

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  10. Cloud Computing in Higher Education Sector for Sustainable Development

    Science.gov (United States)

    Duan, Yuchao

    2016-01-01

    Cloud computing is considered a new frontier in the field of computing, as this technology comprises three major entities namely: software, hardware and network. The collective nature of all these entities is known as the Cloud. This research aims to examine the impacts of various aspects namely: cloud computing, sustainability, performance…

  11. The development of model generators for specific reactors

    Energy Technology Data Exchange (ETDEWEB)

    Chow, J.C. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    2012-07-01

    Authoring reactor models is a routine task for practitioners in nuclear engineering for reactor design, safety analysis, and code validation. The conventional approach is to use a text-editor to either manually manipulate an existing model or to assemble a new model by copying and pasting or direct typing. This approach is error-prone and substantial effort is required for verification. Alternatively, models can be generated programmatically for a specific system via a centralized data source and with rigid algorithms to generate models consistently and efficiently. This approach is demonstrated here for model generators for MCNP and KENO for the ZED-2 reactor. (author)

  12. Protein adsorption on nanoparticles: model development using computer simulation

    International Nuclear Information System (INIS)

    Shao, Qing; Hall, Carol K

    2016-01-01

    The adsorption of proteins on nanoparticles results in the formation of the protein corona, the composition of which determines how nanoparticles influence their biological surroundings. We seek to better understand corona formation by developing models that describe protein adsorption on nanoparticles using computer simulation results as data. Using a coarse-grained protein model, discontinuous molecular dynamics simulations are conducted to investigate the adsorption of two small proteins (Trp-cage and WW domain) on a model nanoparticle of diameter 10.0 nm at protein concentrations ranging from 0.5 to 5 mM. The resulting adsorption isotherms are well described by the Langmuir, Freundlich, Temkin and Kiselev models, but not by the Elovich, Fowler–Guggenheim and Hill–de Boer models. We also try to develop a generalized model that can describe protein adsorption equilibrium on nanoparticles of different diameters in terms of dimensionless size parameters. The simulation results for three proteins (Trp-cage, WW domain, and GB3) on four nanoparticles (diameter  =  5.0, 10.0, 15.0, and 20.0 nm) illustrate both the promise and the challenge associated with developing generalized models of protein adsorption on nanoparticles. (paper)
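
    Fitting one of the named isotherm models to simulated adsorption data is a short exercise with SciPy; the sketch below fits the Langmuir model to synthetic points standing in for the simulation results (the data values and initial guesses are assumptions for illustration).

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(c, q_max, k):
        """Langmuir isotherm: adsorbed amount q as a function of free protein
        concentration c, with capacity q_max and affinity constant k."""
        return q_max * k * c / (1.0 + k * c)

    # Synthetic adsorption data standing in for the simulated isotherms
    # (protein concentration in mM, adsorbed amount in proteins per particle)
    c = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
    q = np.array([12.0, 20.0, 29.5, 34.8, 38.1, 40.2])

    (q_max, k), _ = curve_fit(langmuir, c, q, p0=[50.0, 1.0])
    print(f"q_max = {q_max:.1f} proteins/particle, K = {k:.2f} per mM")
    ```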

  13. Registration of an enterprise information system development by formal specifications

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2006-01-01

    Full Text Available That the economic view of enterprise process sets (ERP, SCM, CRM, BI, …) maps onto the functionality and structure of an Enterprise Information System is, for informaticians, a demonstrable reality. Comprehensive Enterprise Information System software solutions offered by large software firms respect this economic platform and possess the required attributes of data, process and communication integrity, but they are not financially sustainable for small enterprises. These enterprises are predominantly oriented toward progressive computerization of enterprise processes and tend to gradually buy application packages for individual process sets. Large and small software firms provide the needed partial solutions; nevertheless, the solutions of small firms suffer from data, process and communication disintegration. Since the compatibility requirement is not generally accepted, finding an EAI (Enterprise Application Integration) solution has become one of the main tasks of system integration. This article presents one specific style for a complex or partial Enterprise Information System solution. The solution is founded on formal and descriptive specifications that can sustain the required data, process and communication integration among application packages. As a result, this style provides a new view of the effectiveness of the associated information modeling process.

  14. Development of an educational partnership for enhancement of a computer risk assessment model

    International Nuclear Information System (INIS)

    Topper, K.

    1995-02-01

    The Multimedia Environmental Pollutant Assessment System (MEPAS) is a computer program which evaluates exposure pathways for chemical and radioactive releases according to their potential human health impacts. MEPAS simulates the exposure pathways through standard source-to-receptor transport principles using, a multimedia approach (air, groundwater, overland flow, soil, surface water) in conjunction with specific chemical exposure considerations. This model was originally developed by Pacific Northwest Laboratory (PNL) to prioritize environmental concerns at potentially contaminated US Department of Energy (DOE) sites. Currently MEPAS is being used to evaluate a range of environmental problems which are not restricted to DOE sites. A partnership was developed between PNL and Mesa State College during 1991. This partnership involves the use of undergraduate students, faculty, and PNL personnel to complete enhancements to MEPAS. This has led to major refinements to the original MEPAS shell for DOE in a very cost-effective manner. PNL was awarded a 1993 Federal Laboratory Consortium Award and Mesa State College was awarded an Environmental Restoration and Waste Management Distinguished Faculty Award from DOE in 1993 as a result of this collaboration. The college has benefited through the use of MEPAS within laboratories and through the applied experience gained by the students. Development of this partnership will be presented with the goal of allowing other DOE facilities to replicate this program. It is specifically recommended that DOE establish funded programs which support this type of a relationship on an ongoing basis. Additionally, specific enhancements to MEPAS will be presented through computer display of the program

  15. Magnetic fusion energy and computers: the role of computing in magnetic fusion energy research and development

    International Nuclear Information System (INIS)

    1979-10-01

    This report examines the role of computing in the Department of Energy magnetic confinement fusion program. The present status of the MFECC and its associated network is described. The third part of this report examines the role of computer models in the main elements of the fusion program and discusses their dependence on the most advanced scientific computers. A review of requirements at the National MFE Computer Center was conducted in the spring of 1976. The results of this review led to the procurement of the CRAY 1, the most advanced scientific computer available, in the spring of 1978. The utilization of this computer in the MFE program has been very successful and is also described in the third part of the report. A new study of computer requirements for the MFE program was conducted during the spring of 1979, and the results of this analysis are presented in the fourth part of this report

  16. Development of a computer system at La Hague center

    International Nuclear Information System (INIS)

    Mimaud, Robert; Malet, Georges; Ollivier, Francis; Fabre, J.-C.; Valois, Philippe; Desgranges, Patrick; Anfossi, Gilbert; Gentizon, Michel; Serpollet, Roger.

    1977-01-01

    The U.P.2 plant, built at the La Hague Center, is intended mainly for the reprocessing of spent fuels coming from graphite-gas reactors (as metal) and from light-water, heavy-water and breeder reactors (as oxide). In each of the five large nuclear units, the digital processing of measurements was handled until 1974 by CAE 3030 data processors. During 1974-1975 a modern industrial computer system was set up. This system, equipped with T 2000/20 hardware from the Telemecanique company, consists of five measurement acquisition devices (for a total of 1500 lines processed) and two central processing units (CPUs). The connection of these two CPUs (hardware and software) enables automatic switching of the system to either the first or the second CPU. The system covers, at present, data processing, threshold monitoring, alarm systems, display devices, periodic listing, and specific calculations concerning the process (balances, etc.), and, at a later stage, automatic control of certain units of the process [fr

  17. [Specific disturbances of psychomotor development in children with thymomegaly].

    Science.gov (United States)

    Ignat'eva, O N; Kuz'menko, L G; Kozlovskaia, G V; Kliushnik, T P

    2008-01-01

    Ninety children, aged from 2 months to 3 years, with thymomegaly and 25 age-matched controls were studied. Most children with thymomegaly had disturbances of psychomotor development. Depending on their types, the cohort of children was stratified into 4 subgroups: the 1st comprised 36 patients (40%) with schizotypal signs; the 2nd, 30 hyperactive children (33%); the 3rd, 19 children with signs of hyperthymia (21%); and the 4th, 5 normally developing children (6%). The deviations of locomotor and psychiatric development were correlated with the extent of thymus enlargement and the activation of innate and adaptive immunity.

  18. European strategic culture: specifics of formation and prospects for development

    Directory of Open Access Journals (Sweden)

    I. V. Stakhurskyi

    2016-10-01

    It has also been examined in the article whether the EU has developed a strong strategic culture, by applying four criteria: the level of public approval of CSDP, acceptance of the EU as an appropriate tool for security and defense policy, attitude towards the use of force, and the authorization requirement. It has been argued that since the establishment of CSDP the differences between national strategic cultures have narrowed, but the EU is still far from constructing a strong strategic culture. Finally, it is concluded that the slow development of a European strategic culture prevents the CSDP from being an effective mechanism for EU crisis management.

  19. Organ localization: Toward prospective patient-specific organ dosimetry in computed tomography

    International Nuclear Information System (INIS)

    Segars, W. P.; Rybicki, K.; Norris, Hannah; Samei, E.; Frush, D.

    2014-01-01

    Purpose: With increased focus on radiation dose from medical imaging, prospective radiation dose estimates are becoming increasingly desired. Using available populations of adult and pediatric patient phantoms, radiation dose calculations can be catalogued and prospectively applied to individual patients that best match certain anatomical characteristics. In doing so, knowledge of organ size and location is a required element. Here, the authors develop a predictive model of organ locations and volumes based on an analysis of adult and pediatric computed tomography (CT) data. Methods: Fifty-eight adult and 69 pediatric CT datasets were segmented and utilized in the study. The maximum and minimum points of the organs were recorded with respect to the axial distance from the tip of the sacrum. The axial width, midpoint, and volume of each organ were calculated. Linear correlations between these three organ parameters and patient age, BMI, weight, and height were determined. Results: No statistically significant correlations were found in adult patients between the axial width, midpoint, and volume of the organs versus the patient age or BMI. Slight, positive linear trends were found for organ midpoint versus patient weight (max r² = 0.382, mean r² = 0.236). Similar trends were found for organ midpoint versus height (max r² = 0.439, mean r² = 0.200) and for organ volume versus height (max r² = 0.410, mean r² = 0.153). Gaussian fits performed on probability density functions of the adult organs resulted in r²-values ranging from 0.96 to 0.996. The pediatric patients showed much stronger correlations overall. Strong correlations were observed between organ axial midpoint versus age, height, and weight (max r² = 0.842, mean r² = 0.790; max r² = 0.949, mean r² = 0.894; and max r² = 0.870, mean r² = 0.847, respectively). Moderate linear correlations were also observed for organ axial width versus height (max r² = 0.772, mean r² = 0.562) and for organ
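
    The linear correlations reported here are ordinary least-squares fits with coefficients of determination. A minimal sketch of that computation on synthetic stand-in data follows; the numbers are illustrative, not the study's measurements.

    ```python
    import numpy as np

    def linear_r2(x, y):
        """Least-squares slope/intercept and coefficient of determination r^2,
        as used for the organ-midpoint vs. patient-height correlations."""
        slope, intercept = np.polyfit(x, y, 1)
        residuals = y - (slope * x + intercept)
        r2 = 1.0 - residuals.var() / y.var()
        return slope, intercept, r2

    # Synthetic stand-in: organ axial midpoint (cm above sacrum) vs. height (cm)
    height = np.array([150, 158, 165, 172, 180, 188], dtype=float)
    midpoint = 0.18 * height + np.array([-1.0, 0.5, 0.3, -0.6, 1.1, -0.3])
    print(linear_r2(height, midpoint))
    ```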

  20. High Performance Computing - Power Application Programming Interface Specification Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Levenhagen, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Olivier, Stephen Lecler [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ward, H. Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  1. Starpc: a library for communication among tools on a parallel computer cluster. User's and developer's guide to Starpc

    International Nuclear Information System (INIS)

    Takemiya, Hiroshi; Yamagishi, Nobuhiro

    2000-02-01

    We report on an RPC (Remote Procedure Call)-based communication library, Starpc, for a parallel computer cluster. Starpc supports communication between Java Applets and C programs as well as between C programs. Starpc has the following three features. (1) It enables communication between Java Applets and C programs on an arbitrary computer without security violation, even though Java Applets are normally permitted, as a security restriction, to communicate only with programs on one specific computer (the Web server). (2) Diverse network communication protocols are available in Starpc, because it uses the Nexus communication library developed at Argonne National Laboratory. (3) It works on many kinds of computers, including eight parallel computers and four workstation servers. In this report, the usage of Starpc and the development of applications using Starpc are described. (author)

  2. Theories in Developing Oral Communication for Specific Learner Group

    Science.gov (United States)

    Hadi, Marham Jupri

    2016-01-01

    The current article presents some key theories most relevant to the development of oral communication skills in an Indonesian senior high school. Critical analysis of the learners' background is employed to figure out their strengths and weaknesses. A brief overview of the learning context and the learners' characteristics is used to identify which…

  3. Development of a specific radioimmunoassay for cortisol 17-butyrate

    International Nuclear Information System (INIS)

    Smith, G.N.; Lee, Y.F.; Bu'Lock, D.E.; August, P.; Anderson, D.C.

    1983-01-01

    We describe the development and validation of an assay for cortisol 17-butyrate in blood in which there is no significant cross reaction with endogenous corticosteroids at levels encountered normally in man. Preliminary data on blood levels of the drug in absorption studies are presented

  4. Career-Specific Parental Behaviors in Adolescents' Development

    Science.gov (United States)

    Dietrich, Julia; Kracke, Barbel

    2009-01-01

    Parents are major partners in helping adolescents prepare for a career choice. Although several studies have examined links between general aspects of the parent-adolescent relationship and adolescents' career development, little research has addressed the mechanisms involved. This study aimed to validate a three-dimensional instrument for the…

  5. Development of specific primers for genus Fusarium and F. solani ...

    African Journals Online (AJOL)

    Yomi

    2012-01-05

    Jan 5, 2012 ... reproductive parts of plants. They are ... plant species in most parts of the world. ... 20 µl 2.5X master mix (Eppendorf) and 1 µl of each forward and ... List of primers developed for rapid detection of Fusarium sp. and F. solani.

  6. Development of Computational Procedure for Assessment of Patient Dose in Multi-Detector Computed Tomography

    International Nuclear Information System (INIS)

    Park, Dong Wook

    2007-02-01

    Technological developments to improve the quality and speed with which images are obtained have fostered growth in the frequency and collective effective dose of CT examinations. In particular, the high-dose X-ray techniques of CT have increased concern about patient dose. However, the CTDI and DLP used in CT dosimetry leave something to be desired for evaluating patient dose; although evaluation of effective dose in CT practice is required for comparison with other radiographic examinations, these quantities are not sufficient for such an estimation because they were not defined for that purpose. Therefore, calculation of effective dose in CT procedures is needed, although modelling uncertainties will arise from insufficient information such as manufacturing tolerances. The purpose of this work is therefore the development of a computational procedure for the assessment of patient dose, based on experiments providing the essential information, in MDCT. In order to obtain exact absorbed doses, normalization factors must be created to relate simulated dose values to CTDI air measurements. The normalization factors were applied to the calculation of CTDI100 using axial scanning and of organ effective dose using helical scanning. The helical-scanning calculation was compared with the experiment of Groves et al. (2004); the result differs from the experiment by about a factor of 2, apparently because automatic exposure control (AEC) was not simulated. In several studies, when AEC was applied to a CT examination, a dose reduction of approximately 20-30% appeared. Therefore, a study of AEC simulation should be added in future refinements
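
    The CTDI100 quantity discussed here is the integral of the single-rotation axial dose profile over ±50 mm, divided by the nominal beam width N·T. A numerical sketch with a synthetic dose profile follows; the profile shape and beam geometry are assumptions for illustration.

    ```python
    import numpy as np

    def ctdi_100(z_mm, dose_profile_mGy, n_slices, slice_thickness_mm):
        """CTDI100: integral of the single-rotation dose profile D(z) over
        -50 mm..+50 mm, divided by the total nominal beam width N*T."""
        mask = (z_mm >= -50.0) & (z_mm <= 50.0)
        integral = np.trapz(dose_profile_mGy[mask], z_mm[mask])
        return integral / (n_slices * slice_thickness_mm)

    # Synthetic Gaussian-like dose profile for a 4 x 5 mm acquisition
    z = np.linspace(-70, 70, 1401)
    profile = 10.0 * np.exp(-0.5 * (z / 12.0) ** 2)
    print(f"CTDI100 ~ {ctdi_100(z, profile, n_slices=4, slice_thickness_mm=5.0):.2f} mGy")
    ```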

  7. Comparison of computed tomography based parametric and patient-specific finite element models of the healthy and metastatic spine using a mesh-morphing algorithm.

    Science.gov (United States)

    O'Reilly, Meaghan Anne; Whyne, Cari Marisa

    2008-08-01

    A comparative analysis of parametric and patient-specific finite element (FE) modeling of spinal motion segments. To develop patient-specific FE models of spinal motion segments using mesh-morphing methods applied to a parametric FE model. To compare strain and displacement patterns in parametric and morphed models for both healthy and metastatically involved vertebrae. Parametric FE models may be limited in their ability to fully represent patient-specific geometries and material property distributions. Generation of multiple patient-specific FE models has been limited because of computational expense. Morphing methods have been successfully used to generate multiple specimen-specific FE models of caudal rat vertebrae. FE models of a healthy and a metastatic T6-T8 spinal motion segment were analyzed with and without patient-specific material properties. Parametric and morphed models were compared using a landmark-based morphing algorithm. Morphing of the parametric FE model and including patient-specific material properties both had a strong impact on magnitudes and patterns of vertebral strain and displacement. Small but important geometric differences can be represented through morphing of parametric FE models. The mesh-morphing algorithm developed provides a rapid method for generating patient-specific FE models of spinal motion segments.
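
    The landmark-based mesh morphing described can be sketched with a thin-plate-spline displacement field, a common choice for such morphing; the abstract does not state the exact interpolant used, so that choice is an assumption here. Toy landmarks and mesh nodes stand in for vertebral geometry.

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # Homologous landmarks on the parametric (source) and patient (target) anatomy
    src_landmarks = np.random.default_rng(1).uniform(-1, 1, size=(20, 3))
    tgt_landmarks = src_landmarks * 1.1 + np.array([0.0, 0.05, 0.0])

    # A thin-plate-spline displacement field fitted landmark-to-landmark,
    # then evaluated at every node of the source FE mesh
    warp = RBFInterpolator(src_landmarks, tgt_landmarks - src_landmarks,
                           kernel="thin_plate_spline")
    mesh_nodes = np.random.default_rng(2).uniform(-1, 1, size=(1000, 3))
    morphed_nodes = mesh_nodes + warp(mesh_nodes)
    print(morphed_nodes.shape)   # same mesh connectivity, patient-specific shape
    ```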

  8. Present state of computer-aided diagnosis (CAD) development

    International Nuclear Information System (INIS)

    Fujita, Hiroshi

    2007-01-01

    Topics of computer-aided detection (CAD) are reviewed. Commercially available, Food and Drug Administration (FDA)-approved CAD systems exist for breast cancer (mammography), the chest (plain X-ray and CT images) and the colon (polyp detection). In Japan, only mammography CAD is approved. The efficacy of CAD is still debated, and reliable databases are important for assessing it; their construction is under way in various medical fields. Digitized images are now widespread, which conceivably improves the cost-effectiveness of diagnosis with CAD. Approval for health insurance coverage would be an incentive, as seen in the increased CAD sales by R2 Technology Co., and MHLW actually assists facilities in introducing mammography reading-aid systems by sharing half of the cost. There are two big CAD research projects supported by MECSST in which the author is involved. One is the development of diagnostic aids for multi-dimensional medical images, in which a multi-organ, multi-disease CAD system is considered. The other involves CAD for brain MRI, breast ultrasound and eyeground (fundus) images. The future in which patients and doctors fully enjoy the benefits of CAD is not so far away. (R.T.)

  9. Developments of multibody system dynamics: computer simulations and experiments

    International Nuclear Information System (INIS)

    Yoo, Wan-Suk; Kim, Kee-Nam; Kim, Hyun-Woo; Sohn, Jeong-Hyun

    2007-01-01

    It is an exceptional success for multibody dynamics researchers that the journal Multibody System Dynamics has been one of the most highly ranked journals over the last 10 years. In the inaugural issue, Professor Schiehlen wrote an interesting article explaining the roots and perspectives of multibody system dynamics. Professor Shabana also wrote an interesting article reviewing developments in flexible multibody dynamics. The application possibilities of multibody system dynamics have grown wider and deeper, with many application examples being introduced with multibody techniques in the past 10 years. In this paper, the development of multibody dynamics is briefly reviewed and several applications of multibody dynamics are described based on the author's research results. Simulation examples are compared to physical experiments, which show the reasonableness and accuracy of the multibody formulations applied to real problems. Computer simulations using the absolute nodal coordinate formulation (ANCF) were also compared to physical experiments, thereby demonstrating the validity of ANCF for large-displacement and large-deformation problems. Physical experiments for large-deformation problems include beam, plate, chain, and strip cases. Other research topics currently being carried out in the author's laboratory are also briefly explained

  10. Development of computed tomography instrument for college teaching

    International Nuclear Information System (INIS)

    Liu Fenglin; Lu Yanping; Wang Jue

    2006-01-01

    Computed tomography (CT), which uses penetrating radiation from many directions to reconstruct cross-sectional or 3D images of an object, has been widely applied in medical diagnosis and treatment and in industrial NDT and NDE. It is therefore important for college students to understand the fundamentals of CT. The authors describe the CD-50BG CT instrument developed for experimental teaching at colleges. With a 50 mm field-of-view and the translation-rotation scanning mode, the system makes use of a single plastic scintillator + photomultiplier detector and a ¹³⁷Cs radioactive source with 0.74 GBq activity, which is housed in a tungsten alloy shield. Image processing software has also been developed to process the acquired data, so that cross-sectional and 3D images can be reconstructed. High-quality images with 1 lp·mm⁻¹ spatial resolution and 1% contrast sensitivity are obtained. So far in China, more than ten institutions including Tsinghua University and Peking University have already applied the system to elementary teaching. (authors)
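
    The abstract does not state which reconstruction algorithm the instrument's software uses; since a first-generation translation-rotation scan ultimately produces an ordinary sinogram, a filtered back-projection sketch conveys the idea. The use of scikit-image and the Shepp-Logan phantom below are assumptions for illustration only.

        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon

        # Simulate the acquisition: one parallel-beam projection per angle.
        image = shepp_logan_phantom()                        # ground-truth slice
        angles = np.linspace(0., 180., 180, endpoint=False)
        sinogram = radon(image, theta=angles)                # stack of line integrals

        # Filtered back-projection recovers the cross-section from the sinogram.
        reconstruction = iradon(sinogram, theta=angles, filter_name='ramp')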

  11. Development of computer program for estimating decommissioning cost - 59037

    International Nuclear Information System (INIS)

    Kim, Hak-Soo; Park, Jong-Kil

    2012-01-01

    Programs for estimating decommissioning cost have been developed for many different purposes and applications. The estimation of decommissioning cost requires a large amount of data, such as unit cost factors, plant areas and their inventories, waste treatment, etc. This makes it difficult to use manual calculation or typical spreadsheet software such as Microsoft Excel. The cost estimation for eventual decommissioning of nuclear power plants is a prerequisite for safe, timely and cost-effective decommissioning. To estimate the decommissioning cost more accurately and systematically, KHNP, Korea Hydro and Nuclear Power Co. Ltd, developed a decommissioning cost estimating computer program called 'DeCAT-Pro', for Decommissioning Cost Assessment Tool - Professional (hereinafter called 'DeCAT'). This program allows users to easily assess the decommissioning cost with various decommissioning options. It also provides detailed reporting of decommissioning funding requirements as well as detailed project schedules, cash flow, staffing plans and levels, and waste volumes by waste classification and type. KHNP is planning to implement functions for estimating the plant inventory using 3-D technology and for classifying the conditions of radwaste disposal and transportation automatically. (authors)
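
    The unit-cost-factor approach that such programs automate reduces, at its core, to multiplying inventory quantities by unit cost factors and summing per decommissioning option. The activity names and figures below are purely illustrative and are not DeCAT data.

        # Hypothetical unit cost factors in $ per unit of work (illustrative only).
        unit_cost = {'cut_piping_m': 210.0, 'decon_surface_m2': 95.0,
                     'package_waste_drum': 480.0}

        # Plant inventory for one decommissioning option, in matching units.
        inventory = {'cut_piping_m': 12000, 'decon_surface_m2': 45000,
                     'package_waste_drum': 3200}

        total = sum(unit_cost[k] * inventory[k] for k in inventory)
        print(f'Estimated activity-based cost: ${total:,.0f}')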

  12. Project management plan double-shell tank system specification development

    International Nuclear Information System (INIS)

    Conrads, T.J.

    1998-01-01

    The Project Hanford Management Contract (PHMC) members have been tasked by the US Department of Energy (DOE) to support removal of wastes from the Hanford Site 200 Area tanks in two phases. The schedule for these phases allows focusing on requirements for the first phase of providing feed to the privatized vitrification plants. The Tank Waste Retrieval Division near-term goal is to focus on the activities to support Phase 1. These include developing an integrated (technical, schedule, and cost) baseline and, with regard to private contractors, establishing interface agreements, constructing infrastructure systems, retrieving and delivering waste feed, and accepting immobilized waste products for interim onsite storage. This document describes the process for developing an approach to designing a system for retrieving waste from double-shell tanks. It includes a schedule and cost account for the work breakdown structure task

  13. A robust computational solution for automated quantification of a specific binding ratio based on [123I]FP-CIT SPECT images

    International Nuclear Information System (INIS)

    Oliveira, F. P. M.; Tavares, J. M. R. S.; Borges, Faria D.; Campos, Costa D.

    2014-01-01

    The purpose of the current paper is to present a computational solution to accurately quantify the specific to non-specific uptake ratio in [¹²³I]FP-CIT single photon emission computed tomography (SPECT) images and simultaneously measure the spatial dimensions of the basal ganglia, also known as basal nuclei. A statistical analysis based on a reference dataset selected by the user is also automatically performed. The quantification of the specific to non-specific uptake ratio here is based on regions of interest defined after the registration of the image under study with a template image. The computational solution was tested on a dataset of 38 [¹²³I]FP-CIT SPECT images: 28 images were from patients with Parkinson's disease and the remainder from normal subjects, and the results of the automated quantification were compared to the ones obtained by three well-known semi-automated quantification methods. The results revealed a high correlation coefficient between the developed automated method and the three semi-automated methods used for comparison (r ≥ 0.975). The solution also showed good robustness against different positions of the patient, as an almost perfect agreement between the specific to non-specific uptake ratios was found (ICC = 1.000). The mean processing time was around 6 seconds per study using a common notebook PC. The solution developed can be useful for clinicians to evaluate [¹²³I]FP-CIT SPECT images due to its accuracy, robustness and speed. Also, the comparison between case studies and the follow-up of patients can be done more accurately and proficiently since the intra- and inter-observer variability of the semi-automated calculation does not exist in automated solutions. The dimensions of the basal ganglia and their automatic comparison with the values of the population selected as reference are also important for professionals in this area.
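
    After registration to the template, the ratio itself reduces to the mean counts inside the striatal regions of interest relative to a non-specific reference region. The numpy sketch below uses random data and hand-placed boolean masks as stand-ins for the registered image and the atlas ROIs; it is not the paper's pipeline.

        import numpy as np

        volume = np.random.rand(64, 64, 64)     # stand-in for a registered SPECT volume
        striatum = np.zeros(volume.shape, dtype=bool)
        striatum[28:36, 20:30, 24:40] = True    # atlas-defined specific-uptake ROI
        reference = np.zeros(volume.shape, dtype=bool)
        reference[10:20, 40:55, 10:25] = True   # non-specific (e.g. occipital) ROI

        specific = volume[striatum].mean()
        nonspecific = volume[reference].mean()
        sbr = (specific - nonspecific) / nonspecific   # specific binding ratio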

  14. The computer code SEURBNUK/EURDYN (release 1). Input and output specifications

    International Nuclear Information System (INIS)

    Smith, B.L.; Broadhouse, B.J.; Yerkess, A.

    1986-05-01

    SEURBNUK-2 is a two-dimensional, axisymmetric, Eulerian, finite difference containment code developed initially by AWRE Aldermaston, AEE Winfrith and JRC-Ispra, and more recently by AEEW, JRC and EIR Wuerenlingen. The numerical procedure adopted in SEURBNUK to solve the hydrodynamic equations is based on the semi-implicit ICE method, which itself is an extension of the MAC algorithm. SEURBNUK has a finite difference thin shell treatment for vessels and internal structures of arbitrary shape and includes the effects of the compressibility of the fluid. Fluid flow through porous media and porous structures can also be accommodated. SEURBNUK/EURDYN is an extension of SEURBNUK-2 in which the finite difference thin shell treatment is replaced by a finite element calculation for both thin and thick structures. This has been achieved by coupling the finite element code EURDYN with SEURBNUK-2, allowing the use of conical shell elements and axisymmetric triangular elements. Within the code, the equations of motion for the structures are solved quite separately from those for the fluid, and the timestep for the fluid can be an integer multiple of that for the structures. The interaction of the structures with the fluid is then considered as a modification to the coefficients in the pressure equations, the modifications naturally depending on the behaviour of the structures within the fluid cell. The code is limited to dealing with a single fluid, the coolant; the bubble and the cover gas are treated as cavities of uniform pressure calculated via appropriate pressure-volume-energy relationships. This manual describes the input data specifications needed for the execution of SEURBNUK/EURDYN calculations. After explaining the output facilities, information is included to help users avoid some common pitfalls. (author)
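
    The time integration described, in which the structural timestep divides the fluid timestep, is a classic sub-cycling pattern; a generic sketch follows, with the update methods as placeholders rather than SEURBNUK/EURDYN routines.

        def advance_coupled(fluid, structures, dt_fluid, n_sub):
            """One fluid step with n_sub structural sub-steps (generic sketch)."""
            dt_struct = dt_fluid / n_sub
            for _ in range(n_sub):
                # Structures respond to the pressure field held over the sub-step.
                structures.step(dt_struct, fluid.pressure)
            # Structural motion feeds back as modified coefficients in the
            # semi-implicit pressure equations of the fluid solve.
            fluid.step(dt_fluid, structures.interface_state())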

  15. On the impact of quantum computing technology on future developments in high-performance scientific computing

    OpenAIRE

    Möller, Matthias; Vuik, Cornelis

    2017-01-01

    Quantum computing technologies have become a hot topic in academia and industry receiving much attention and financial support from all sides. Building a quantum computer that can be used practically is in itself an outstanding challenge that has become the ‘new race to the moon’. Next to researchers and vendors of future computing technologies, national authorities are showing strong interest in maturing this technology due to its known potential to break many of today’s encryption technique...

  16. Development of age-specific Japanese physical phantoms for dose evaluation in infant CT examinations

    International Nuclear Information System (INIS)

    Yamauchi-Kawaura, C.; Fujii, K.; Imai, K.; Ikeda, M.; Akahane, K.; Obara, S.; Yamauchi, M.; Narai, K.; Katsu, T.

    2016-01-01

    Following the previous development of age-specific Japanese head phantoms, the authors designed Japanese torso phantoms for dose assessment in infant computed tomography (CT) examinations and completed a Japanese 3-y-old head-torso phantom. For the design of age-specific torso phantoms (0, 0.5, 1 and 3 y old), anatomical structures were measured from CT images of Japanese infant patients. From the CT morphometry, it was found that the rib cages of Japanese infants were smaller than those of Europeans and Americans. Radiophotoluminescence glass dosemeters were used for dose measurements in the 3-y-old head-torso phantom. To examine the validity of the developed phantom, organ and effective doses from the in-phantom dosimetry system were compared with simulated values from a web-based CT dose calculation system (WAZA-ARI). The differences in doses between the two systems were <20% for the doses of organs within scan regions and the effective doses in head, chest and abdomino-pelvic CT examinations. (authors)

  17. Development of a patient-specific anatomical foot model from structured light scan data.

    Science.gov (United States)

    Lochner, Samuel J; Huissoon, Jan P; Bedi, Sanjeev S

    2014-01-01

    The use of anatomically accurate finite element (FE) models of the human foot in research studies has increased rapidly in recent years. Uses for FE foot models include advancing knowledge of orthotic design, shoe design, ankle-foot orthoses, pathomechanics, locomotion, plantar pressure, tissue mechanics, plantar fasciitis, joint stress and surgical interventions. Similar applications but for clinical use on a per-patient basis would also be on the rise if it were not for the high costs associated with developing patient-specific anatomical foot models. High costs arise primarily from the expense and challenges of acquiring anatomical data via magnetic resonance imaging (MRI) or computed tomography (CT) and reconstructing the three-dimensional models. The proposed solution morphs detailed anatomy from skin surface geometry and anatomical landmarks of a generic foot model (developed from CT or MRI) to surface geometry and anatomical landmarks acquired from an inexpensive structured light scan of a foot. The method yields a patient-specific anatomical foot model at a fraction of the cost of standard methods. Average error for bone surfaces was 2.53 mm for the six experiments completed. Highest accuracy occurred in the mid-foot and lowest in the forefoot due to the small, irregular bones of the toes. The method must be validated in the intended application to determine if the resulting errors are acceptable.
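
    The accuracy figure quoted, an average distance between morphed and reference bone surfaces, can be computed with a nearest-neighbour query; below is a sketch using scipy's KD-tree with invented point clouds standing in for the two surfaces.

        import numpy as np
        from scipy.spatial import cKDTree

        truth = np.random.rand(5000, 3) * 100.0   # reference bone surface points (mm)
        morphed = truth + np.random.normal(scale=2.5, size=truth.shape)  # morphed surface

        # Mean distance from each morphed point to its nearest reference point.
        dists, _ = cKDTree(truth).query(morphed)
        print(f'average surface error: {dists.mean():.2f} mm')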

  18. Revolutionary development of computer education : A success story

    OpenAIRE

    Nandasara, S. T.; Samaranayake, V. K.; Mikami, Yoshiki

    2006-01-01

    The University of Colombo, Sri Lanka has been in the forefront of the “Computer Revolution” in Sri Lanka. It has introduced the teaching of computer programming and applications as early as in 1967, more than a decade before other educational institutions, thereby producing, over the years, a large number of pioneer computer scientists and IT graduates out of students entering the university from a variety of disciplines. They are presently employed as researchers, educators, data processing ...

  19. DEVELOP NEW TOTAL ORGANIC CARBON/SPECIFIC UV ...

    Science.gov (United States)

    The purpose of this project is to provide a total organic carbon (TOC)/specific ultraviolet absorbance (SUVA) method that will be used by the Office of Ground Water and Drinking Water (OGWDW) to support monitoring requirements of the Stage 2 Disinfectant/Disinfection By-products (D/DBP) Rule. The Stage 2 Rule requires that enhanced water treatment be used if the source water is high in aquatic organic matter prior to the application of a disinfectant. Disinfectants (chlorine, ozone, etc.) are used in the production of drinking water in order to reduce the risk of microbial disease. These disinfectants react with the organic material that is naturally present in the source water to form disinfection by-products (DBPs). Exposure to some of these by-products may pose a long-term health risk. The number and nature of DBPs make it impossible to fully characterize all of the by-products formed during the treatment of drinking water, and it is more cost-effective to reduce the formation of DBPs than to remove them from the water after they are formed. Two measurements (TOC and SUVA) are believed to be predictive of the amount of by-products that can be formed during the disinfection of drinking water and are considered to be surrogates for DBP precursors. SUVA is calculated as the ultraviolet absorbance at 254 nm (UV254) in cm⁻¹ divided by the dissolved organic carbon (DOC) concentration in mg/L (measured after filtration of the water through a 0.45 µm pore-diameter filter).
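
    As a worked example of the definition just quoted (the factor of 100, converting absorbance per centimetre to per metre, is the conventional SUVA scaling and is assumed here, since the record does not show it):

        def suva_254(uv254_per_cm, doc_mg_per_l):
            """SUVA in L/(mg*m): UV254 (1/cm) divided by DOC (mg/L), scaled to metres."""
            return uv254_per_cm / doc_mg_per_l * 100.0

        # e.g. UV254 = 0.125 1/cm and DOC = 4.2 mg/L give SUVA of about 2.98 L/(mg*m)
        print(suva_254(0.125, 4.2))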

  20. Development of specific dopamine D-1 agonists and antagonists

    International Nuclear Information System (INIS)

    Sakolchai, S.

    1987-01-01

    To develop potentially selective dopamine D-1 agonists and to investigate the structural requirements for D-1 activity, derivatives of dibenzocycloheptadiene were synthesized and pharmacologically evaluated. The target compounds are 5-aminomethyl-10,11-dihydro-1,2-dihydroxy-5H-dibenzo[a,d]cycloheptene hydrobromide 10 and 9,10-dihydroxy-1,2,3,7,8,12b-hexahydrobenzo[1,2]cyclohepta[3,4,5d,e]isoquinoline hydrobromide 11. In a dopamine-sensitive rat retinal adenylate cyclase assay, a model for D-1 activity, compound 10 is essentially inert for both agonist and antagonist activity. In contrast, compound 11 is approximately equipotent to dopamine in activation of the D-1 receptor. Based on radioligand binding data, the IC₅₀ of compound 11 for displacement of ³H-SCH 23390, a D-1 ligand, is about 7-fold less than that for displacement of ³H-spiperone, a D-2 ligand. These data indicate that compound 11 is a potent, selective dopamine D-1 agonist. This study provides a new structural class of dopamine D-1 acting agents: dihydroxy-benzocycloheptadiene analogs, which can serve as lead compounds for further drug development and as probes for investigating the nature of the dopamine D-1 receptor.

  1. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared them with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
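
    The parallelization pattern evaluated, many independent model fits farmed out to interchangeable workers, looks the same whether the workers are cloud instances or cluster nodes. Below is a single-machine sketch using the standard library; scikit-learn and the synthetic data are assumptions for illustration, not the study's ligand-based models.

        from concurrent.futures import ProcessPoolExecutor

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def fit_one(seed):
            """Train one model on synthetic data; return its training score."""
            rng = np.random.default_rng(seed)
            X, y = rng.normal(size=(500, 10)), rng.integers(0, 2, 500)
            model = RandomForestClassifier(n_estimators=50, random_state=seed).fit(X, y)
            return model.score(X, y)

        if __name__ == '__main__':
            with ProcessPoolExecutor() as pool:      # one task per worker/instance
                scores = list(pool.map(fit_one, range(8)))
            print(scores)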

  2. HAL/SM language specification. [programming languages and computer programming for space shuttles

    Science.gov (United States)

    Williams, G. P. W., Jr.; Ross, C.

    1975-01-01

    A programming language is presented for the flight software of the NASA Space Shuttle program. It is intended to satisfy virtually all of the flight software requirements of the space shuttle. To achieve this, it incorporates a wide range of features, including applications-oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks. It is a higher order language designed to allow programmers, analysts, and engineers to communicate with the computer in a form approximating natural mathematical expression. Parts of the English language are combined with standard notation to provide a tool that readily encourages programming without demanding computer hardware expertise. Block diagrams and flow charts are included. The semantics of the language is discussed.

  3. High Performance Computing - Power Application Programming Interface Specification Version 1.4

    Energy Technology Data Exchange (ETDEWEB)

    Laros III, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); DeBonis, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Levenhagen, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Olivier, Stephen Lecler [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  4. Development of innovative computer software to facilitate the setup and computation of water quality index.

    Science.gov (United States)

    Nabizadeh, Ramin; Valadi Amin, Maryam; Alimohammadi, Mahmood; Naddafi, Kazem; Mahvi, Amir Hossein; Yousefzadeh, Samira

    2013-04-26

    Developing a water quality index which is used to convert the water quality dataset into a single number is the most important task of most water quality monitoring programmes. As the water quality index setup is based on different local obstacles, it is not feasible to introduce a definite water quality index to reveal the water quality level. In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which will help users to compute the water quality index in cases where some parameters are missing from the datasets. A dataset containing 735 water samples of drinking water quality in different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases.
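
    One plausible reading of the 'dynamic weight factors' mentioned is a weighted index whose weights are renormalized over the parameters actually measured, so missing values do not bias the result. The weights and sub-index values below are invented and do not reproduce the IWQIS formula.

        def wqi(subindices, weights):
            """Weighted index over available parameters; weights are renormalized
            so that missing parameters (value None) do not bias the result."""
            present = {p: q for p, q in subindices.items() if q is not None}
            wsum = sum(weights[p] for p in present)
            return sum(weights[p] * q for p, q in present.items()) / wsum

        weights = {'pH': 0.20, 'turbidity': 0.30, 'nitrate': 0.25, 'coliforms': 0.25}
        sample = {'pH': 88.0, 'turbidity': 72.0, 'nitrate': None, 'coliforms': 95.0}
        print(wqi(sample, weights))   # computed from the three measured parameters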

  5. On the impact of quantum computing technology on future developments in high-performance scientific computing

    NARCIS (Netherlands)

    Möller, M.; Vuik, C.

    2017-01-01

    Quantum computing technologies have become a hot topic in academia and industry receiving much attention and financial support from all sides. Building a quantum computer that can be used practically is in itself an outstanding challenge that has become the ‘new race to the moon’. Next to

  6. Development of a voxel phantom specific for simulation of eye brachytherapy

    International Nuclear Information System (INIS)

    Santos, Marcilio S.; Lima, Fernando R.A.

    2013-01-01

    Ophthalmic brachytherapy involves inserting a plaque with seeds of radioactive material into the patient's eye for the treatment of tumors. The radiation dose to be received by the patient is prescribed by physicians, and the application time of the material is calculated from calibration curves supplied by the manufacturers of the plaques. To estimate the dose absorbed by the patient in a series of diagnostic tests, it is necessary to perform simulations using a computational exposure model. These models are composed primarily of an anthropomorphic phantom and a Monte Carlo code. Coupling a whole-body voxel phantom to a Monte Carlo code is a complex process, because simulations with a computational exposure model take time and require knowledge of the code used and various adjustments to be implemented. The problem becomes even more complex when only one region of the body is to be irradiated. In this work we developed a phantom of the region containing the eyeball, derived from MASH (Male Adult voxel). This model was coupled to the Monte Carlo code EGSnrc (Electron Gamma Shower) together with an algorithm simulating an I-125 source, considering only the effect of its higher energy range.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  8. Development of DUST: A computer code that calculates release rates from a LLW disposal unit

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1992-01-01

    Performance assessment of a Low-Level Waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the disposal unit source term). The major physical processes that influence the source term are water flow, container degradation, waste form leaching, and radionuclide transport. A computer code, DUST (Disposal Unit Source Term) has been developed which incorporates these processes in a unified manner. The DUST code improves upon existing codes as it has the capability to model multiple container failure times, multiple waste form release properties, and radionuclide specific transport properties. Verification studies performed on the code are discussed
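
    The source-term behaviour such a code models can be caricatured with a single first-order leach-rate equation per waste form: the inventory is depleted by radioactive decay and by leaching, and the release rate is the leached fraction. The rates below are invented; DUST itself additionally couples water flow, container failure times and transport.

        import numpy as np

        half_life_y = 30.1                 # e.g. Cs-137
        lam = np.log(2.0) / half_life_y    # radioactive decay constant (1/y)
        k_leach = 0.02                     # first-order leach rate (1/y), invented

        t = np.linspace(0.0, 300.0, 601)           # years after container failure
        inventory = np.exp(-(lam + k_leach) * t)   # normalized activity in the waste form
        release_rate = k_leach * inventory         # activity released per year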

  9. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  10. Development of COMPAS, computer aided process flowsheet design and analysis system of nuclear fuel reprocessing

    International Nuclear Information System (INIS)

    Homma, Shunji; Sakamoto, Susumu; Takanashi, Mitsuhiro; Nammo, Akihiko; Satoh, Yoshihiro; Soejima, Takayuki; Koga, Jiro; Matsumoto, Shiro

    1995-01-01

    A computer-aided process flowsheet design and analysis system, COMPAS, has been developed in order to carry out flowsheet calculations on the process flow diagram of nuclear fuel reprocessing. All equipment items, such as dissolvers, mixer-settlers, and so on, in the process flowsheet diagram are graphically visualized as icons on the bitmap display of a UNIX workstation. Drawing a flowsheet can be carried out easily with mouse operations. Not only published numerical simulation codes but also a user's original code can be used with COMPAS. Specifications of the equipment and the concentrations of components in the streams, displayed as tables, can be edited by the user. Results of calculations can also be displayed graphically. Two examples show that COMPAS is applicable to deciding operating conditions of the Purex process and to analyzing extraction behavior in a mixer-settler extractor. (author)

  11. Development and validation of GWHEAD, a three-dimensional groundwater head computer code

    International Nuclear Information System (INIS)

    Beckmeyer, R.R.; Root, R.W.; Routt, K.R.

    1980-03-01

    A computer code has been developed to solve the groundwater flow equation in three dimensions. The code uses finite-difference approximations solved by the strongly implicit solution procedure. Input parameters to the code include hydraulic conductivity, specific storage, porosity, accretion (recharge), and initial hydraulic head. These parameters may be input as spatially varying. The hydraulic conductivity may be input as isotropic or anisotropic. The boundaries either may permit flow across them or may be impermeable. The code has been used to model leaky confined groundwater conditions and spherical flow to a continuous point sink, both of which have exact analytical solutions. The results generated by the computer code compare well with those of the analytical solutions. The code was designed to be used to model groundwater flow beneath fuel reprocessing and waste storage areas at the Savannah River Plant.
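
    The core of such a code is a finite-difference solve for head. A steady-state, homogeneous, two-dimensional Jacobi iteration (far simpler than GWHEAD's three-dimensional strongly implicit procedure, and purely illustrative) conveys the structure:

        import numpy as np

        # Steady 2-D head equation with homogeneous conductivity: Laplace(h) = 0.
        h = np.zeros((50, 50))
        h[:, 0], h[:, -1] = 100.0, 90.0        # fixed-head (Dirichlet) boundaries

        for _ in range(5000):                  # Jacobi iterations
            h[1:-1, 1:-1] = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1] +
                                    h[1:-1, :-2] + h[1:-1, 2:])
            h[0, 1:-1], h[-1, 1:-1] = h[1, 1:-1], h[-2, 1:-1]   # no-flow top/bottom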

  12. Simple diazonium chemistry to develop specific gene sensing platforms.

    Science.gov (United States)

    Revenga-Parra, M; García-Mendiola, T; González-Costas, J; González-Romero, E; Marín, A García; Pau, J L; Pariente, F; Lorenzo, E

    2014-02-27

    A simple strategy for covalently immobilizing DNA sequences, based on the formation of stable diazonized conducting platforms, is described. The electrochemical reduction of 4-nitrobenzenediazonium salt onto screen-printed carbon electrodes (SPCE) in aqueous media gives rise to terminally grafted amino groups. The presence of primary aromatic amines allows the formation of diazonium cations capable of reacting with the amines present on the DNA capture probe. For comparison, a second strategy based on binding aminated DNA capture probes to the diazonized conducting platforms through a crosslinking agent was also employed. The resulting DNA sensing platforms were characterized by cyclic voltammetry, electrochemical impedance spectroscopy and spectroscopic ellipsometry. The hybridization event with the complementary sequence was detected using hexaammineruthenium(III) chloride as the electrochemical indicator. Finally, the platforms were applied to the analysis of a 145-bp sequence from the human gene MRP3, reaching a detection limit of 210 pg μL⁻¹. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Development of computational technique for labeling magnetic flux-surfaces

    International Nuclear Information System (INIS)

    Nunami, Masanori; Kanno, Ryutaro; Satake, Shinsuke; Hayashi, Takaya; Takamaru, Hisanori

    2006-03-01

    In recent Large Helical Device (LHD) experiments, radial profiles of ion temperature, electric field, etc. are measured in the m/n=1/1 magnetic island produced by island control coils, where m is the poloidal mode number and n the toroidal mode number. When the transport of the plasma in the radial profiles is numerically analyzed, an average over a magnetic flux-surface in the island is a very useful concept for understanding the transport. For the averaging, a proper labeling of the flux-surfaces is necessary. In general, it is not easy to label the flux-surfaces in a magnetic field with an island, compared with the case of a magnetic field configuration having nested flux-surfaces. In the present paper, we have developed a new computational technique to label the magnetic flux-surfaces. This technique is based on an optimization algorithm known as simulated annealing. The flux-surfaces are discerned by using two labels: one is the classification of the magnetic field structure, i.e., core, island, ergodic, and outside regions, and the other is the value of the toroidal magnetic flux. We have applied the technique to an LHD configuration with the m/n=1/1 island and successfully obtained the discrimination of the magnetic field structure. (author)
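
    Simulated annealing itself is compact enough to sketch generically: accept any downhill move, accept uphill moves with Boltzmann probability, and cool. The neighbour move and energy function here are toy stand-ins, not the paper's flux-surface labeling objective.

        import math
        import random

        def anneal(state, energy, neighbor, t0=1.0, cooling=0.995, steps=20000):
            """Generic simulated annealing minimizer."""
            e, t = energy(state), t0
            for _ in range(steps):
                cand = neighbor(state)
                de = energy(cand) - e
                if de < 0 or random.random() < math.exp(-de / t):
                    state, e = cand, e + de    # accept the candidate move
                t *= cooling                   # cooling schedule
            return state, e

        # Toy usage: recover x = 3 by annealing a 1-D quadratic.
        best, _ = anneal(0.0, lambda x: (x - 3.0) ** 2,
                         lambda x: x + random.uniform(-0.5, 0.5))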

  14. Development and validation of the computer program TNHXY

    International Nuclear Information System (INIS)

    Xolocostli M, V.; Valle G, E. del; Alonso V, G.

    2003-01-01

    This work describes the development and validation of the computer program TNHXY (Neutron Transport with Nodal Hybrid schemes in X-Y geometry), which solves the discrete-ordinates neutron transport equations using a discontinuous bi-linear (DBiL) nodal hybrid method. One of the immediate applications of TNHXY is in the analysis of nuclear fuel assemblies, in particular those of BWRs. Its validation was carried out by reproducing results for test or benchmark problems that some authors have solved using other numerical techniques. This ensures that the program will provide results of similar accuracy for other problems of the same type. To accomplish this, two benchmark problems were solved. The first problem consists of a BWR fuel assembly in a 7x7 array, without and with a control rod. The results obtained with TNHXY are consistent with those reported for the TWOTRAN code. The second benchmark problem is a Mixed Oxide (MOX) fuel assembly in a 10x10 array. This problem is known as the WPPR benchmark problem of the NEA Data Bank, and the results are compared with those obtained with commercial codes like HELIOS, MCNP-4B and CPM-3. (Author)

  15. Computer-Mediated Collaborative Projects: Processes for Enhancing Group Development

    Science.gov (United States)

    Dupin-Bryant, Pamela A.

    2008-01-01

    Groups are a fundamental part of the business world. Yet, as companies continue to expand internationally, a major challenge lies in promoting effective communication among employees who work in varying time zones. Global expansion often requires group collaboration through computer systems. Computer-mediated groups lead to different communicative…

  16. Developing Digital Immigrants' Computer Literacy: The Case of Unemployed Women

    Science.gov (United States)

    Ktoridou, Despo; Eteokleous-Grigoriou, Nikleia

    2011-01-01

    Purpose: The purpose of this study is to evaluate the effectiveness of a 40-hour computer course for beginners provided to a group of unemployed women learners with no/minimum computer literacy skills who can be characterized as digital immigrants. The aim of the study is to identify participants' perceptions and experiences regarding technology,…

  17. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…

  18. Recent developments and new directions in soft computing

    CERN Document Server

    Abbasov, Ali; Yager, Ronald; Shahbazova, Shahnaz; Reformat, Marek

    2014-01-01

    The book reports on the latest advances and challenges of soft computing. It  gathers original scientific contributions written by top scientists in the field and covering theories, methods and applications in a number of research areas related to soft-computing, such as decision-making, probabilistic reasoning, image processing, control, neural networks and data analysis.  

  19. The development of mobile computation and the related formal description

    International Nuclear Information System (INIS)

    Jin Yan; Yang Xiaozong

    2003-01-01

    The formal representation of mobile computation is described and investigated; such formal descriptions are very instructive for resolving issues of status transmission, domain administration and authentication. This paper presents the communicating process and the computational process from the viewpoint of formal calculus and, moreover, constructs a practical application based on mobile ambients. Finally, future work and directions are outlined. (authors)

  20. Developing a Research Agenda for Ubiquitous Computing in Schools

    Science.gov (United States)

    Zucker, Andrew

    2004-01-01

    Increasing numbers of states, districts, and schools provide every student with a computing device; for example, the middle schools in Maine maintain wireless Internet access and the students receive laptops. Research can provide policymakers with better evidence of the benefits and costs of 1:1 computing and establish which factors make 1:1…

  1. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  2. Development and construction of a specific chamber for phototoxicity test

    Energy Technology Data Exchange (ETDEWEB)

    Sufi, Bianca S.; Mathor, Monica B., E-mail: biancasufi@usp.br, E-mail: mathor@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Esteves-Pedro, Natalia M.; Kaneko, Telma Mary, E-mail: nataliamenves@yahoo.com.br, E-mail: tsakuda@usp.br [Universidade de Sao Paulo (USP), Sao Paulo, SP (Brazil). Fac. de Ciencias Farmaceuticas; Lopes, Patricia, E-mail: patricia.lopes@unifesp.br [Universidade Federal de Sao Paulo (UNIFESP), Diadema, SP (Brazil)

    2013-07-01

    Phototoxicity corresponds to the acute toxic response induced after skin exposure 'in vivo' and 'ex vivo' to certain chemicals and subsequent exposure to irradiation. The phototoxicity 'in vitro' assay is determined by the viability of BALB/c 3T3 fibroblasts exposed to chemicals in the presence and absence of light. Substances identified as phototoxic are susceptible to 'in vivo' phototoxicity (OECD 432, 2004). A chamber was developed and constructed according to the guidelines of OECD Toxicity Guide 432 and ®ECVAM DB-ALM: INVITTOX N. 78. The chamber was built on a stainless steel frame, with UVA lamps and a dark area for the negative control. The tests to qualify the chamber were performed with Sodium Lauryl Sulfate, recommended by the guides aforementioned, as negative control, and Bergamot oil (Givaudan-Roche) as positive control. Bergamot, Citrus bergamia, has as its major component Bergapten, responsible for its photosensitizing activity. Both samples were diluted in Phosphate Buffered Saline to concentrations between 0.005 and 0.1 mg/mL, calculated by the dilution factor 1.47. These tests were performed on BALB/c 3T3 fibroblast cultures and submitted to the phototoxicity assay with MTS dye, under spectrophotometric reading, which allows determination of the Photo Irritation Factor (PIF): a substance with a PIF<2 is predicted to be non-phototoxic, a PIF>2 and <5 indicates likely phototoxicity, and a PIF>5 indicates phototoxicity. Sodium Lauryl Sulfate presented a PIF=1, in accordance with the OECD. Bergamot oil was shown to be likely phototoxic, with a PIF=2.475. These results demonstrate that the chamber is qualified for performing phototoxicity tests with assurance and security. (author)

  3. Development and construction of a specific chamber for phototoxicity test

    International Nuclear Information System (INIS)

    Sufi, Bianca S.; Mathor, Monica B.; Esteves-Pedro, Natalia M.; Kaneko, Telma Mary

    2013-01-01

    Phototoxicity corresponds to the acute toxic response induced after skin exposure 'in vivo' and 'ex vivo' to certain chemicals and subsequent exposure to irradiation. The phototoxicity 'in vitro' assay is determined by the viability of BALB/c 3T3 fibroblasts exposed to chemicals in the presence and absence of light. Substances identified as phototoxic are susceptible to 'in vivo' phototoxicity (OECD 432, 2004). A chamber was developed and constructed according to the guidelines of OECD Toxicity Guide 432 and ®ECVAM DB-ALM: INVITTOX N. 78. The chamber was built on a stainless steel frame, with UVA lamps and a dark area for the negative control. The tests to qualify the chamber were performed with Sodium Lauryl Sulfate, recommended by the guides aforementioned, as negative control, and Bergamot oil (Givaudan-Roche) as positive control. Bergamot, Citrus bergamia, has as its major component Bergapten, responsible for its photosensitizing activity. Both samples were diluted in Phosphate Buffered Saline to concentrations between 0.005 and 0.1 mg/mL, calculated by the dilution factor 1.47. These tests were performed on BALB/c 3T3 fibroblast cultures and submitted to the phototoxicity assay with MTS dye, under spectrophotometric reading, which allows determination of the Photo Irritation Factor (PIF): a substance with a PIF<2 is predicted to be non-phototoxic, a PIF>2 and <5 indicates likely phototoxicity, and a PIF>5 indicates phototoxicity. Sodium Lauryl Sulfate presented a PIF=1, in accordance with the OECD. Bergamot oil was shown to be likely phototoxic, with a PIF=2.475. These results demonstrate that the chamber is qualified for performing phototoxicity tests with assurance and security. (author)

  4. Development of optimized segmentation map in dual energy computed tomography

    Science.gov (United States)

    Yamakawa, Keisuke; Ueki, Hironori

    2012-03-01

    Dual energy computed tomography (DECT) has been widely used in clinical practice and has been particularly effective for tissue diagnosis. In DECT, the difference between the two attenuation coefficients acquired at two X-ray energies enables tissue segmentation. One problem in conventional DECT is that the segmentation deteriorates in some cases, such as bone removal. This is due to two reasons. Firstly, the segmentation map is optimized without considering the X-ray condition (tube voltage and current). If the tube voltage is considered, it is possible to create an optimized map, but unfortunately the tube current cannot be considered. Secondly, the X-ray condition itself is not optimized. The condition can be set empirically, but this means that a truly optimized condition is not used. To solve these problems, we have developed methods for optimizing the map (Method-1) and the condition (Method-2). In Method-1, the map is optimized to minimize segmentation errors, and the distribution of the attenuation coefficient is modeled by considering the tube current. In Method-2, the optimized condition is chosen to minimize segmentation errors over tube voltage-current combinations while keeping the total exposure constant. We evaluated the effectiveness of Method-1 in a phantom experiment under a fixed condition, and of Method-2 in a phantom experiment under different combinations calculated from a constant total exposure. When Method-1 was combined with Method-2, the segmentation error was reduced from 37.8% to 13.5%. These results demonstrate that our developed methods can achieve highly accurate segmentation while keeping the total exposure constant.
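
    Segmentation in DECT amounts to classifying each voxel by its pair of attenuation values; carrying the noise of each class (which in practice depends on tube current) in a covariance matrix turns the map into a minimum-Mahalanobis-distance rule. Below is a toy two-class version with invented numbers, not the authors' model.

        import numpy as np

        # Class means in (mu_low_kV, mu_high_kV) space and a shared noise covariance
        # whose magnitude would, in practice, be derived from the tube current.
        means = {'bone': np.array([0.60, 0.40]), 'iodine': np.array([0.55, 0.30])}
        cov_inv = np.linalg.inv(np.diag([0.02, 0.015]) ** 2)

        def classify(mu):
            """Assign the class with the smallest Mahalanobis distance."""
            d = {name: (mu - m) @ cov_inv @ (mu - m) for name, m in means.items()}
            return min(d, key=d.get)

        print(classify(np.array([0.58, 0.38])))   # -> 'bone' for this toy voxel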

  5. A reliable and valid questionnaire was developed to measure computer vision syndrome at the workplace.

    Science.gov (United States)

    Seguí, María del Mar; Cabrero-García, Julio; Crespo, Ana; Verdú, José; Ronda, Elena

    2015-06-01

    To design and validate a questionnaire to measure visual symptoms related to exposure to computers in the workplace. Our computer vision syndrome questionnaire (CVS-Q) was based on a literature review and validated through discussion with experts and performance of a pretest, pilot test, and retest. Content validity was evaluated by occupational health, optometry, and ophthalmology experts. Rasch analysis was used in the psychometric evaluation of the questionnaire. Criterion validity was determined by calculating the sensitivity and specificity, the receiver operating characteristic curve, and the cutoff point. Test-retest repeatability was tested using the intraclass correlation coefficient (ICC) and concordance by Cohen's kappa (κ). The CVS-Q was developed with wide consensus among experts and was well accepted by the target group. It assesses the frequency and intensity of 16 symptoms using a single rating scale (symptom severity) that fits the Rasch rating scale model well. The questionnaire has sensitivity and specificity over 70% and achieved good test-retest repeatability both for the scores obtained [ICC = 0.802; 95% confidence interval (CI): 0.673, 0.884] and CVS classification (κ = 0.612; 95% CI: 0.384, 0.839). The CVS-Q has acceptable psychometric properties, making it a valid and reliable tool to control the visual health of computer workers, and can potentially be used in clinical trials and outcome research. Copyright © 2015 Elsevier Inc. All rights reserved.
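
    The agreement statistics reported are easy to make concrete: Cohen's kappa, for example, compares observed agreement with the agreement expected by chance. A generic implementation follows, with invented ratings rather than the CVS-Q study's data.

        from collections import Counter

        def cohens_kappa(rater_a, rater_b):
            """kappa = (p_o - p_e) / (1 - p_e) for two categorical ratings."""
            n = len(rater_a)
            p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            ca, cb = Counter(rater_a), Counter(rater_b)
            p_e = sum(ca[k] * cb[k] for k in ca) / n ** 2
            return (p_o - p_e) / (1 - p_e)

        # Test vs retest CVS classification (1 = CVS, 0 = no CVS), invented ratings.
        print(cohens_kappa([1, 0, 1, 1, 0, 0, 1, 0], [1, 0, 1, 0, 0, 0, 1, 1]))  # 0.5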

  6. Development of a Postacute Hospital Item Bank for the New Pediatric Evaluation of Disability Inventory-Computer Adaptive Test

    Science.gov (United States)

    Dumas, Helene M.

    2010-01-01

    The PEDI-CAT is a new computer adaptive test (CAT) version of the Pediatric Evaluation of Disability Inventory (PEDI). Additional PEDI-CAT items specific to postacute pediatric hospital care were recently developed using expert reviews and cognitive interviewing techniques. Expert reviews established face and construct validity, providing positive…

  7. Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans

    Science.gov (United States)

    2017-07-13

    Report by Autumn R Kulaga, Kathryn L Loftis, and Eric Murray, US Army Research Laboratory: Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans. Approved for public release; distribution is unlimited.

  8. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    OpenAIRE

    Soojin Park; Mansoo Hwang; Sangeun Lee; Young B. Park

    2015-01-01

    Cloud computing has emerged as more than just a piece of technology; it is rather a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are treated not as subjects to own but as subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo si...

  9. Seismic safety margins research program. Phase I. Project VII: systems analysis specifications of computational approach

    International Nuclear Information System (INIS)

    Collins, J.D.; Hudson, J.M.; Chrostowski, J.D.

    1979-02-01

    A computational methodology is presented for the prediction of core melt probabilities in a nuclear power plant due to earthquake events. The proposed model has four modules: seismic hazard, structural dynamic (including soil-structure interaction), component failure and core melt sequence. The proposed modules would operate in series and would not have to be operated at the same time. The basic statistical approach uses a Monte Carlo simulation to treat random and systematic error but alternate statistical approaches are permitted by the program design
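
    The Monte Carlo chain described can be miniaturized to a single failure mode: sample a ground-motion level from the hazard, amplify it into a component demand, and count the trials in which demand exceeds a randomized capacity. All numbers below are invented for illustration and do not reflect the program's modules.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100000

        # Seismic hazard: lognormal peak ground acceleration in g (invented parameters).
        pga = rng.lognormal(mean=np.log(0.15), sigma=0.6, size=n)

        # Structural response amplifies PGA into component demand, with variability.
        demand = pga * rng.lognormal(np.log(2.0), 0.3, size=n)

        # Component fragility: lognormal capacity; a trial fails when demand exceeds it.
        capacity = rng.lognormal(np.log(1.2), 0.4, size=n)
        print('P(core melt | model) ~', (demand > capacity).mean())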

  10. Programming Unconventional Computers: Dynamics, Development, Self-Reference

    Directory of Open Access Journals (Sweden)

    Susan Stepney

    2012-10-01

    Full Text Available Classical computing has well-established formalisms for specifying, refining, composing, proving, and otherwise reasoning about computations. These formalisms have matured over the past 70 years or so. Unconventional Computing includes the use of novel kinds of substrates–from black holes and quantum effects, through to chemicals, biomolecules, even slime moulds–to perform computations that do not conform to the classical model. Although many of these unconventional substrates can be coerced into performing classical computation, this is not how they “naturally” compute. Our ability to exploit unconventional computing is partly hampered by a lack of corresponding programming formalisms: we need models for building, composing, and reasoning about programs that execute in these substrates. What might, say, a slime mould programming language look like? Here I outline some of the issues and properties of these unconventional substrates that need to be addressed to find “natural” approaches to programming them. Important concepts include embodied real values, processes and dynamical systems, generative systems and their meta-dynamics, and embodied self-reference.

  11. Gender-specific calculation of the effective dose: The example of thoracic computer tomography

    International Nuclear Information System (INIS)

    Boetticher, H. von; Lachmund, J.; Hoffmann, W.

    2003-01-01

    Systematic gender-specific differences in anatomy and physiology are mostly neglected in standard methodologies for the determination of effective doses. This paper presents and discusses three different concepts for the derivation of gender-specific effective doses. Based on the most convincing approach - especially through the influence of the tissue weighting factor for the breast - the effective dose for a serial CT scan of the chest is higher for women (+11%) and lower for men (-11%) in comparison to the 'gender-neutral' average value. These differences amount to ±30% for coronary serial CT applications. (orig.)
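
    The arithmetic underneath is the effective-dose sum E = sum over tissues of w_T times H_T; a gender-specific value simply evaluates that sum with each sex's organ doses (and, under one of the concepts discussed, sex-specific weights). The organ doses below are invented and the weight set is truncated to three tissues, so the printed values are illustrative only.

        # Tissue weighting factors (ICRP-style values, truncated set for illustration).
        w = {'breast': 0.12, 'lung': 0.12, 'remainder': 0.12}

        # Equivalent organ doses from a chest CT in mSv (invented values).
        h_female = {'breast': 21.0, 'lung': 18.0, 'remainder': 6.0}
        h_male = {'breast': 2.0, 'lung': 17.0, 'remainder': 6.0}

        e_f = sum(w[t] * h_female[t] for t in w)   # gender-specific partial sums
        e_m = sum(w[t] * h_male[t] for t in w)
        print(f'E(female) = {e_f:.2f} mSv, E(male) = {e_m:.2f} mSv')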

  12. Developing a multimodal biometric authentication system using soft computing methods.

    Science.gov (United States)

    Malcangi, Mario

    2015-01-01

    Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometrics offers several advantages, mainly in embedded system applications. Hard and soft multi-biometrics, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize its applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision-making.
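
    Score-level fusion of two matchers can be sketched as a weighted sum followed by a threshold, a deliberately crude stand-in for the chapter's fuzzy-logic decision engine; all numbers are invented.

        def fuse(fingerprint_score, voice_score, w_fp=0.6, threshold=0.5):
            """Weighted-sum fusion of two normalized [0, 1] matcher scores."""
            fused = w_fp * fingerprint_score + (1.0 - w_fp) * voice_score
            return fused >= threshold      # accept the identity claim above threshold

        print(fuse(0.8, 0.4))   # a strong fingerprint match offsets a weak voiceprint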

  13. Design and development of a diversified real time computer for future FBRs

    International Nuclear Information System (INIS)

    Sujith, K.R.; Bhattacharyya, Anindya; Behera, R.P.; Murali, N.

    2014-01-01

    The current safety-related computer system of the Prototype Fast Breeder Reactor (PFBR) under construction in Kalpakkam consists of two redundant Versa Module Europa (VME) bus based real-time computer systems with a switch-over logic circuit (SOLC). Since both VME systems are identical, the dual redundant system is prone to common cause failure (CCF). The probability of CCF can be reduced by adopting diversity. Design diversity has long been used to protect redundant systems against common-mode failures. The conventional notion of diversity relies on 'independent' generation of 'different' implementations. This paper discusses the design and development of a diversified real-time computer which will replace one of the computer systems in the dual redundant architecture. Compact PCI (cPCI) bus systems are widely used in safety-critical applications such as avionics, railways and defence, and use diverse electrical signaling and logical specifications; cPCI was hence chosen for the development of the diversified system. As an initial development, a CPU card based on an ARM-9 processor, a 16-channel Relay Output (RO) card and a 30-channel Analog Input (AI) card were developed. All the cards mentioned support hot-swap and geographic addressing capability. In order to mitigate the component obsolescence problem, the 32-bit PCI target controller and associated glue logic for the slave I/O cards were indigenously developed using VHDL. U-Boot was selected as the boot loader and ARM Linux 2.6 as the preliminary operating system for the CPU card. Board-specific initialization code for the CPU card was written in ARM assembly language, and serial port initialization was written in C. The boot loader, together with the Linux 2.6 kernel and a jffs2 file system, was flashed into the CPU card. Test applications written in C were used to test the various peripherals of the CPU card. Device drivers for the AI and RO cards were developed as Linux kernel modules and an application library was also

  14. The contribution of high-performance computing and modelling for industrial development

    CSIR Research Space (South Africa)

    Sithole, Happy

    2017-10-01

    Full Text Available Presentation by Dr Happy Sithole and Dr Onno Ubbink. High-performance computing (HPC) combined with machine learning and artificial intelligence present opportunities to non...

  15. Development of computational science in JAEA. R and D of simulation

    International Nuclear Information System (INIS)

    Nakajima, Norihiro; Araya, Fumimasa; Hirayama, Toshio

    2006-01-01

    R and D on computational science in JAEA (Japan Atomic Energy Agency) is described, covering: the computing environment; the R and D system of CCSE (Center for Computational Science and e-Systems); joint computational science research in Japan and worldwide; development of computer technologies; examples of simulation research; the 3-dimensional image vibrational platform system; simulation research on FBR cycle technologies; large-scale thermal stress simulation for steam generator development; simulation research on fusion energy technologies; development of grid computing technology; simulation research on quantum beam technologies; and biological molecule simulation research. The organization of JAEA, the development of computational science in JAEA, the JAEA network, international collaboration in computational science, and the environment of the ITBL (Information-Technology Based Laboratory) project are illustrated. (S.Y.)

  16. Treated effluent disposal system process control computer software requirements and specification

    International Nuclear Information System (INIS)

    Graf, F.A. Jr.

    1994-01-01

    The software requirements for the monitor and control system that will be associated with the effluent collection pipeline system known as the 200 Area Treated Effluent Disposal System are covered. The control logic for the two pump stations and specific requirements for the graphic displays are detailed.

  17. Cloud Computing: Key to IT Development in West Africa

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-12-01

    Dec 1, 2013 ... The paper explores the basic concepts of Cloud Computing and how the emerging technology could be harnessed to .... recovery of information than traditional system. Storage ... quickly meet business demand was an important ...

  18. Development and Evaluation of a Computer-Aided Learning (CAL ...

    African Journals Online (AJOL)

    PROF. OLIVER OSUAGWA

    Department of Computer & Information Science, Enugu State University of Science & Technology, PO Box 4545, Enugu,. Nigeria . .... teacher to store questions on his own, using a database ..... alternative to lectures in human physiology [12].

  19. X-ray Computed Tomography Image Quality Indicator (IQI) Development

    Data.gov (United States)

    National Aeronautics and Space Administration — Phase one of the program is to identify suitable x-ray Computed Tomography (CT) Image Quality Indicator (IQI) design(s) that can be used to adequately capture CT...

  20. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science treats conference publications as a primary means of publication. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their papers with peers. Previous work on knowledge mapping focused on the map of all sciences or of a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and ...

  1. Computer Sciences Applied to Management at Open University of Catalonia: Development of Competences of Teamworks

    Science.gov (United States)

    Pisa, Carlos Cabañero; López, Enric Serradell

    Teamwork is considered one of the most important professional skills in today's business environment. More specifically, collaborative work between professionals and information technology managers from various functional areas is a strategic key to business competitiveness. Several university-level programs are focusing on developing these skills. This article presents the case of the course Computer Science Applied to Management (hereafter CSAM), which has been designed with the objective of developing the ability to work cooperatively in interdisciplinary teams. Its design and development addressed the key elements of effectiveness that appear in the literature, most notably the establishment of shared objectives and a feedback system, and the management of the team's harmony, level of autonomy, independence, diversity and level of supervision. The final result is a course in which, through a virtual working platform, interdisciplinary teams solve a problem raised by a case study.

  2. Utilisation of computational fluid dynamics techniques for design of molybdenum target specification

    International Nuclear Information System (INIS)

    Yeoh, G.H.; Wassink, D.

    2003-01-01

    A three-dimensional computational fluid dynamics (CFD) model to investigate the hydraulic behaviour within a model of the liner and irradiation rig, located in the central portion of the HIFAR fuel element, is described. Flow visualisation and LDV measurements were performed to better understand the fluid flow around the narrow spaces within the irradiation rig, annular target cans and liner. Based on an unstructured mesh consisting of triangular elements and tetrahedrons within the flow space generated for the geometrical structure, the CFD model was able to predict complex flow structures inside the liner containing the irradiation rig and target cans. The reliability of the model was validated against experiments. The predicted flow behaviour was comparable to the experimental observations. Predicted velocities were also found to be in good agreement with LDV measurements. (author)

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a fourfold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  4. Phonon-induced anomalous specific heat of a model nanocrystal by computer simulation

    International Nuclear Information System (INIS)

    Wang, J.; Wolf, D.; Phillpot, S.R.; Gleiter, H.

    1994-10-01

    The authors construct a simple model of a nanocrystalline material in which all the grains are the same size and shape, and in which all the grain boundaries are crystallographically identical. The authors show that the model nanocrystal has a low-temperature specific-heat anomaly similar to that seen in experiment, which arises from the presence of low-frequency phonons localized at the grain boundaries.

  5. Lamina specific loss of inhibition may lead to distinct neuropathic manifestations: a computational modeling approach

    Directory of Open Access Journals (Sweden)

    Erick Javier Argüello Prada

    Introduction: It has been reported that inhibitory control at the superficial dorsal horn (SDH) can act in a regionally distinct manner, which suggests that regionally specific subpopulations of SDH inhibitory neurons may each prevent a specific neuropathic condition. Methods: In an attempt to address this issue, we provide an alternative approach by integrating neuroanatomical information provided by different studies to construct a network model of the SDH. We use Neuroids to simulate each neuron included in that model by adapting available experimental evidence. Results: Simulations suggest that the maintenance of the proper level of pain sensitivity may be attributed to lamina II inhibitory neurons and, therefore, hyperalgesia may be elicited by suppression of the inhibitory tone at that lamina. In contrast, lamina III inhibitory neurons are more likely to be responsible for keeping the nociceptive pathway separate from the mechanoreceptive pathway, so loss of inhibitory control in that region may result in allodynia. The SDH network model is also able to replicate non-linearities associated with pain processing, such as Aβ-fiber-mediated analgesia and the frequency-dependent increase of the neural response. Discussion: By incorporating biophysical accuracy and newer experimental evidence, the SDH network model may become a valuable tool for assessing the contribution of specific SDH connectivity patterns to noxious transmission in both physiological and pathological conditions.

  6. Development of new test procedures for measuring fine and coarse aggregates specific gravity.

    Science.gov (United States)

    2009-09-01

    The objective of the research is to develop and evaluate new test methods at determining the specific gravity and absorption of both fine and coarse aggregates. Current methods at determining the specific gravity and absorption of fine and coarse agg...

  7. A Brief Analysis of Development Situations and Trend of Cloud Computing

    Science.gov (United States)

    Yang, Wenyan

    2017-12-01

    In recent years, the rapid development of Internet technology has radically changed people's work, learning and lifestyles. More and more activities are completed by means of computers and networks, and the amount of information and data generated grows day by day. People rely increasingly on computers, so conventional computing power fails to meet their demands for accuracy and speed. Cloud computing technology has experienced fast development and is widely applied in the computer industry as a result of its advantages of high precision, fast computation and ease of use; moreover, it has become a focus of information research at present. In this paper, the development situation and trend of cloud computing are analyzed and researched.

  8. Further development of the computer code ATHLET-CD

    International Nuclear Information System (INIS)

    Weber, Sebastian; Austregesilo, Henrique; Bals, Christine; Band, Sebastian; Hollands, Thorsten; Koellein, Carsten; Lovasz, Liviusz; Pandazis, Peter; Schubert, Johann-Dietrich; Sonnenkalb, Martin

    2016-10-01

    In the framework of the reactor safety research program sponsored by the German Federal Ministry for Economic Affairs and Energy (BMWi), the computer code system ATHLET/ATHLET-CD has been further developed as an analysis tool for the simulation of accidents in nuclear power plants with pressurized and boiling water reactors, as well as for the evaluation of accident management procedures. The main objective was to provide a mechanistic analysis tool for best-estimate calculations of transients, accidents, and severe accidents with core degradation in light water reactors. With the continued development, the capability of the code system has been largely improved, allowing best-estimate calculations of design-basis and beyond-design-basis accidents, and the simulation of advanced core degradation with enhanced model extent in a reasonable calculation time. ATHLET comprises, inter alia, a 6-equation model, models for the simulation of non-condensable gases and tracking of boron concentration, as well as additional component and process models for the complete system simulation. Among numerous model improvements, the code application has been extended to supercritical pressures. The mechanistic description of the dynamic development of flow regimes on the basis of a transport equation for the interface area has been further developed. This ATHLET version is completely integrated in ATHLET-CD. ATHLET-CD further comprises dedicated models for the simulation of fuel and control assembly degradation for both pressurized and boiling water reactors, debris beds with melting in the core region, as well as fission product and aerosol release and transport in the cooling system, inclusive of the decay of nuclide inventories and of chemical reactions in the gas phase. The continued development also concerned the modelling of absorber material release, of melting, melt relocation and freezing, and the interaction with the wall of the reactor pressure vessel. The following models were newly

  9. Development of the voxel computational phantoms of pediatric patients and their application to organ dose assessment

    Science.gov (United States)

    Lee, Choonik

    A series of realistic voxel computational phantoms of pediatric patients was developed and then used for radiation risk assessment in various exposure scenarios. High-resolution computed tomographic images of live patients were utilized for the development of five voxel phantoms of pediatric patients: 9-month male, 4-year female, 8-year female, 11-year male, and 14-year male. The phantoms were first developed as head and torso phantoms and then extended into whole-body phantoms by utilizing computed tomographic images of a healthy adult volunteer. The whole-body phantom series was modified to match the anthropometrics of the most recent reference data reported by the International Commission on Radiological Protection. The phantoms, named the University of Florida series B, are the first complete set of pediatric voxel phantoms having reference organ masses and total heights. As part of the dosimetry study, an investigation of skeletal tissue dosimetry methods was performed for a better understanding of the radiation dose to the active bone marrow and bone endosteum. All of the currently available methodologies were inter-compared and benchmarked with the paired-image radiation transport model. The dosimetric characteristics of the phantoms were investigated by using Monte Carlo simulation of broad parallel external photon beams in anterior-posterior, posterior-anterior, left-lateral, right-lateral, rotational, and isotropic geometries. Organ dose conversion coefficients were calculated for extensive photon energies and compared with those of the conventional stylized pediatric phantoms of Oak Ridge National Laboratory. Multi-slice helical computed tomography exams were simulated using a Monte Carlo simulation code for various exam protocols: head, chest, abdomen, pelvis, and chest-abdomen-pelvis studies. The results provided realistic estimates of the effective doses for frequently used protocols in pediatric radiology. The results were very

  10. Seismocardiography-Based Cardiac Computed Tomography Gating Using Patient-Specific Template Identification and Detection.

    Science.gov (United States)

    Yao, Jingting; Tridandapani, Srini; Wick, Carson A; Bhatti, Pamela T

    2017-01-01

    To more accurately trigger cardiac computed tomography angiography (CTA) than electrocardiography (ECG) alone, a sub-system is proposed as an intermediate step toward fusing ECG with seismocardiography (SCG). Accurate prediction of quiescent phases is crucial to prospectively gating CTA, which is susceptible to cardiac motion that can degrade the diagnostic quality of images. The key innovation of this sub-system is that it identifies the SCG waveforms corresponding to heart sounds and determines their phases within the cardiac cycle. Furthermore, this relationship is modeled as a linear function with respect to heart rate. In this paper, B-mode echocardiography is used as the gold standard for identifying the quiescent phases. We analyzed synchronous ECG, SCG, and echocardiography data acquired from seven healthy subjects (mean age: 31; age range: 22-48; males: 4) and 11 cardiac patients (mean age: 56; age range: 31-78; males: 6). On average, the proposed algorithm was able to successfully identify 79% of the SCG waveforms in systole and 68% in diastole. The simulated results show that SCG-based prediction produced less average phase error than ECG-based prediction. It was found that the accuracy of ECG-based gating is more susceptible to increases in heart rate variability, while SCG-based gating is susceptible to high cycle-to-cycle variability in morphology. This pilot work of prediction using SCG waveforms enriches the framework of a comprehensive system with multiple modalities that could potentially, in real time, improve the image quality of CTA.
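
    The record above describes modeling the location of the SCG-derived quiescent window as a linear function of heart rate. The sketch below shows that idea in minimal form; the training pairs are invented and the fitting choice (ordinary least squares via numpy) is an assumption for illustration, not the authors' exact procedure.

        # Minimal sketch: predict a quiescent-phase location (% of R-R
        # interval) as a linear function of heart rate. Data are invented.
        import numpy as np

        # (heart rate in bpm, observed quiescent phase in % of cardiac cycle)
        hr = np.array([55.0, 62.0, 70.0, 78.0, 85.0, 92.0])
        phase_pct = np.array([72.0, 70.5, 68.0, 66.2, 64.0, 62.5])

        # Least-squares linear fit: phase = slope * hr + intercept
        slope, intercept = np.polyfit(hr, phase_pct, deg=1)

        def predict_phase(heart_rate_bpm: float) -> float:
            """Predicted quiescent phase (% of R-R) for a given heart rate."""
            return slope * heart_rate_bpm + intercept

        print(f"Predicted quiescent phase at 75 bpm: {predict_phase(75.0):.1f} %")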

  11. Development of a 3-dimensional seismic isolation floor for computer systems

    International Nuclear Information System (INIS)

    Kurihara, M.; Shigeta, M.; Nino, T.; Matsuki, T.

    1991-01-01

    In this paper, we investigated the applicability of a seismic isolation floor as a method for protecting computer systems, such as those in nuclear power plants, from strong earthquakes. Assuming that the computer system is guaranteed for 250 cm/s² of input acceleration in the horizontal and vertical directions as the seismic performance, the basic design specification of the seismic isolation floor is as follows. Against S1-level earthquakes, the maximum acceleration response of the seismic isolation floor in the horizontal and vertical directions is kept below 250 cm/s² to maintain continuous computer operation. Against S2-level earthquakes, the isolation floor allows large horizontal movement and large displacement of the isolation devices to reduce the acceleration response, although it is not guaranteed to remain below 250 cm/s². By reducing the acceleration response, however, serious damage to the computer systems is reduced, so that they can be restarted after an earthquake. Usually, seismic isolation floor systems permit 2-dimensional (horizontal) isolation. However, in the case of earthquakes centered directly below the site, which have large vertical components, the vertical acceleration response of such a system is amplified by the lateral vibration of the frame of the isolation floor. Therefore, in this study a 3-dimensional seismic isolation floor, including vertical isolation, was developed. This paper describes 1) the experimental results on the response characteristics of the 3-dimensional seismic isolation floor built as a trial, using a 3-dimensional shaking table, and 2) a comparison of a 2-dimensional analytical model, for motion in one horizontal direction and the vertical direction, with the experimental results. (J.P.N.)

  12. Developments in medical image processing and computational vision

    CERN Document Server

    Jorge, Renato

    2015-01-01

    This book presents novel and advanced topics in Medical Image Processing and Computational Vision in order to solidify knowledge in the related fields and define their key stakeholders. It contains extended versions of selected papers presented in VipIMAGE 2013 – IV International ECCOMAS Thematic Conference on Computational Vision and Medical Image, which took place in Funchal, Madeira, Portugal, 14-16 October 2013.  The twenty-two chapters were written by invited experts of international recognition and address important issues in medical image processing and computational vision, including: 3D vision, 3D visualization, colour quantisation, continuum mechanics, data fusion, data mining, face recognition, GPU parallelisation, image acquisition and reconstruction, image and video analysis, image clustering, image registration, image restoring, image segmentation, machine learning, modelling and simulation, object detection, object recognition, object tracking, optical flow, pattern recognition, pose estimat...

  13. Development of the radiosynthesis of high-specific-activity {sup 123}I-NKJ64

    Energy Technology Data Exchange (ETDEWEB)

    Tavares, Adriana Alexandre S., E-mail: a.tavares.1@research.gla.ac.u [Institute of Neuroscience and Psychology, College of Medical, Veterinary and Life Sciences, University of Glasgow, G12 8QQ Glasgow (United Kingdom); Jobson, Nicola K. [WestCHEM, School of Chemistry, The Joseph Black Building, University of Glasgow, G12 8QQ Glasgow (United Kingdom); Dewar, Deborah [Institute of Neuroscience and Psychology, College of Medical, Veterinary and Life Sciences, University of Glasgow, G12 8QQ Glasgow (United Kingdom); Sutherland, Andrew [WestCHEM, School of Chemistry, The Joseph Black Building, University of Glasgow, G12 8QQ Glasgow (United Kingdom); Pimlott, Sally L. [West of Scotland Radionuclide Dispensary, University of Glasgow and North Glasgow University Hospital NHS Trust, G11 6NT Glasgow (United Kingdom)

    2011-05-15

    Introduction: 123I-NKJ64, a reboxetine analogue, is currently under development as a potential novel single photon emission computed tomography radiotracer for imaging the noradrenaline transporter in brain. This study describes the development of the radiosynthesis of 123I-NKJ64, highlighting the advantages and disadvantages, pitfalls and solutions encountered while developing the final radiolabelling methodology. Methods: The synthesis of 123I-NKJ64 was evaluated using an electrophilic iododestannylation method, where a Boc-protected trimethylstannyl precursor was radioiodinated using peracetic acid as an oxidant and deprotection was investigated using either trifluoroacetic acid (TFA) or 2 M hydrochloric acid (HCl). Results: Radioiodination of the Boc-protected trimethylstannyl precursor was achieved with an incorporation yield of 92±6%. Deprotection with 2 M HCl produced 123I-NKJ64 with the highest radiochemical yield of 98.05±1.63%, compared with 83.95±13.24% with TFA. However, the specific activity of the obtained 123I-NKJ64 was lower when measured after using 2 M HCl (0.15±0.23 Ci/μmol) as the deprotecting agent in comparison to TFA (1.76±0.60 Ci/μmol). Further investigation of the 2 M HCl methodology found a by-product, identified as the deprotected proto-destannylated precursor, which co-eluted with 123I-NKJ64 during the high-performance liquid chromatography (HPLC) purification. Conclusions: The radiosynthesis of 123I-NKJ64 was achieved with a good isolated radiochemical yield of 68% and a high specific activity of 1.8 Ci/μmol. TFA was found to be the most suitable deprotecting agent, since 2 M HCl generated a by-product that could not be fully separated from 123I-NKJ64 using the HPLC methodology investigated. This study highlights the importance of HPLC purification and accurate measurement of specific activity while developing new radiosynthesis methodologies.

  14. Development of the radiosynthesis of high-specific-activity 123I-NKJ64

    International Nuclear Information System (INIS)

    Tavares, Adriana Alexandre S.; Jobson, Nicola K.; Dewar, Deborah; Sutherland, Andrew; Pimlott, Sally L.

    2011-01-01

    Introduction: 123I-NKJ64, a reboxetine analogue, is currently under development as a potential novel single photon emission computed tomography radiotracer for imaging the noradrenaline transporter in brain. This study describes the development of the radiosynthesis of 123I-NKJ64, highlighting the advantages and disadvantages, pitfalls and solutions encountered while developing the final radiolabelling methodology. Methods: The synthesis of 123I-NKJ64 was evaluated using an electrophilic iododestannylation method, where a Boc-protected trimethylstannyl precursor was radioiodinated using peracetic acid as an oxidant and deprotection was investigated using either trifluoroacetic acid (TFA) or 2 M hydrochloric acid (HCl). Results: Radioiodination of the Boc-protected trimethylstannyl precursor was achieved with an incorporation yield of 92±6%. Deprotection with 2 M HCl produced 123I-NKJ64 with the highest radiochemical yield of 98.05±1.63%, compared with 83.95±13.24% with TFA. However, the specific activity of the obtained 123I-NKJ64 was lower when measured after using 2 M HCl (0.15±0.23 Ci/μmol) as the deprotecting agent in comparison to TFA (1.76±0.60 Ci/μmol). Further investigation of the 2 M HCl methodology found a by-product, identified as the deprotected proto-destannylated precursor, which co-eluted with 123I-NKJ64 during the high-performance liquid chromatography (HPLC) purification. Conclusions: The radiosynthesis of 123I-NKJ64 was achieved with a good isolated radiochemical yield of 68% and a high specific activity of 1.8 Ci/μmol. TFA was found to be the most suitable deprotecting agent, since 2 M HCl generated a by-product that could not be fully separated from 123I-NKJ64 using the HPLC methodology investigated. This study highlights the importance of HPLC purification and accurate measurement of specific activity while developing new radiosynthesis methodologies.

  15. Development of risk-based computer models for deriving criteria on residual radioactivity and recycling

    International Nuclear Information System (INIS)

    Chen, Shih-Yew

    1995-01-01

    Argonne National Laboratory (ANL) is developing multimedia environmental pathway and health risk computer models to assess radiological risks to human health and to derive cleanup guidelines for environmental restoration, decommissioning, and recycling activities. These models are based on the existing RESRAD code, although each has a separate design and serves different objectives. Two such codes are RESRAD-BUILD and RESRAD-PROBABILISTIC. The RESRAD code was originally developed to implement the U.S. Department of Energy's (DOE's) residual radioactive materials guidelines for contaminated soils. RESRAD has been successfully used by DOE and its contractors to assess health risks and develop cleanup criteria for several sites selected for cleanup or restoration programs. RESRAD-BUILD analyzes human health risks from radioactive releases during decommissioning or rehabilitation of contaminated buildings. Risks to workers are assessed for dismantling activities; risks to the public are assessed for occupancy. RESRAD-BUILD is based on a room compartmental model analyzing the effects on room air quality of contaminant emission and resuspension (as well as radon emanation), the external radiation pathway, and other exposure pathways. RESRAD-PROBABILISTIC, currently under development, is intended to perform uncertainty analysis for RESRAD by using the Monte Carlo approach based on the Latin-Hypercube sampling scheme. The codes being developed at ANL are tailored to meet a specific objective of human health risk assessment and require specific parameter definition and data gathering. The combined capabilities of these codes satisfy various risk assessment requirements in environmental restoration and remediation activities. (author)
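
    The record mentions uncertainty analysis via a Monte Carlo approach on a Latin-Hypercube sampling scheme. As a generic illustration of that sampling idea (not the RESRAD-PROBABILISTIC implementation), the sketch below draws a Latin-Hypercube sample for two input parameters and propagates it through a toy dose model; the parameter ranges and the model itself are invented.

        # Minimal Latin-Hypercube sampling sketch for uncertainty
        # propagation. The input ranges and the toy "dose" model are
        # invented; this is not the RESRAD-PROBABILISTIC implementation.
        import numpy as np

        rng = np.random.default_rng(42)

        def latin_hypercube(n_samples: int, n_dims: int) -> np.ndarray:
            """One stratified sample per equal-probability bin, per dimension."""
            bins = (np.arange(n_samples) + rng.random((n_dims, n_samples))) / n_samples
            for d in range(n_dims):
                rng.shuffle(bins[d])      # decouple strata across dimensions
            return bins.T                 # shape (n_samples, n_dims) in [0, 1)

        n = 1000
        u = latin_hypercube(n, 2)

        # Scale unit samples to physical ranges (invented values):
        conc = 1.0 + 4.0 * u[:, 0]        # contaminant concentration [Bq/g]
        intake = 0.1 + 0.4 * u[:, 1]      # soil intake rate [g/day]

        dose = 0.25 * conc * intake       # toy dose model [arbitrary units]
        print(f"mean dose = {dose.mean():.3f}, "
              f"95th percentile = {np.percentile(dose, 95):.3f}")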

  16. Development of risk-based computer models for deriving criteria on residual radioactivity and recycling

    International Nuclear Information System (INIS)

    Chen, S.Y.

    1994-01-01

    Argonne National Laboratory (ANL) is developing multimedia environmental pathway and health risk computer models to assess radiological risks to human health and to derive cleanup guidelines for environmental restoration, decommissioning, and recycling activities. These models are based on the existing RESRAD code, although each has a separate design and serves different objectives. Two such codes are RESRAD-BUILD and RESRAD-PROBABILISTIC. The RESRAD code was originally developed to implement the US Department of Energy's (DOE's) residual radioactive materials guidelines for contaminated soils. RESRAD has been successfully used by DOE and its contractors to assess health risks and develop cleanup criteria for several sites selected for cleanup or restoration programs. RESRAD-BUILD analyzes human health risks from radioactive releases during decommissioning or rehabilitation of contaminated buildings. Risks to workers are assessed for dismantling activities; risks to the public are assessed for occupancy. RESRAD-BUILD is based on a room compartmental model analyzing the effects on room air quality of contaminant emission and resuspension (as well as radon emanation), the external radiation pathway, and other exposure pathways. RESRAD-PROBABILISTIC, currently under development, is intended to perform uncertainty analysis for RESRAD by using the Monte Carlo approach based on the Latin-Hypercube sampling scheme. The codes being developed at ANL are tailored to meet a specific objective of human health risk assessment and require specific parameter definition and data gathering. The combined capabilities of these codes satisfy various risk assessment requirements in environmental restoration and remediation activities

  17. An Interactive Computer-Based Circulation System: Design and Development

    Directory of Open Access Journals (Sweden)

    James S. Aagaard

    1972-03-01

    An on-line computer-based circulation control system has been installed at the Northwestern University library. Features of the system include self-service book charge, remote terminal inquiry and update, and automatic production of notices for call-ins and books available. Fine notices are also prepared daily, and overdue notices weekly. Important considerations in the design of the system were to minimize the costs of operation and to eventually include technical services functions. The system operates on a relatively small computer in a multiprogrammed mode.

  18. Biochemical and Computational Analysis of the Substrate Specificities of Cfr and RlmN Methyltransferases

    DEFF Research Database (Denmark)

    Ntokou, Eleni; Hansen, Lykke Haastrup; Kongsted, Jacob

    2015-01-01

    Cfr and RlmN methyltransferases both modify adenine 2503 in 23S rRNA (Escherichia coli numbering). RlmN methylates position C2 of adenine while Cfr methylates position C8, and to a lesser extent C2, conferring antibiotic resistance to peptidyl transferase inhibitors. Cfr and RlmN show high sequence ... X-ray structure of RlmN. We used a trinucleotide as target sequence and assessed its positioning at the active site for methylation. The calculations are in accordance with different poses of the trinucleotide in the two enzymes, indicating major evolutionary changes to shift the C2/C8 specificities. To explore interchangeability between Cfr and RlmN we constructed various combinations of their genes. The function of the mixed genes was investigated by RNA primer extension analysis to reveal methylation at 23S rRNA position A2503 and by MIC analysis to reveal antibiotic resistance. The catalytic site is expected ...

  19. Computer modeling and laboratory experiments of a specific borehole to surface electrical monitoring technique (BSEMT)

    NARCIS (Netherlands)

    Meekes, J.A.C.; Zhang, X.; Abdul Fattah, R.

    2011-01-01

    Geophysical monitoring of the dynamical behavior of subsurface reservoirs (oil, gas, CO2) remains an important issue in geophysical research. A new idea for reservoir monitoring based on electrical resistivity tomography was developed at TNO. The essential element of the so-called BSEMT (Borehole to

  20. INFLUENCE OF DEVELOPMENT OF COMPUTER TECHNOLOGIES ON TEACHING

    Directory of Open Access Journals (Sweden)

    Olgica Bešić

    2012-09-01

    Our times are characterized by strong changes in technology that have become reality in many areas of society. Compared to production, transport, services, etc., education, as a rule, opens slowly to new technologies. However, children at home and outside school live in a technologically rich environment, and they expect education to change in accordance with the imperatives of education for the twenty-first century. In this sense, systems for automated data processing, multimedia systems, distance learning, virtual schools and other technologies are being introduced into education. They lead to an increase in students' activity, to quality evaluation of their knowledge and, finally, to their progress, all in accordance with individual abilities and knowledge. Mathematics and computers often appear together in the teaching process. In the teaching of mathematics, computers and software packages play a significant role, and the programming requirements are not dominant: the emphasis is on mathematical content and the method of presentation. Computers are especially used in solving various mathematical tasks and in self-directed learning of mathematics. Still, many problems requiring solutions appear in the process: how to organise lectures, practice, textbooks, collections of mathematical problems and written exams, and how to assign and check homework. The answers to these questions are not simple, and they will probably be sought continuously, with an increasing use of computers in the teaching process. In this paper I have tried to answer some of the questions above.

  1. Evaluating the Effectiveness of Computer Applications in Developing English Learning

    Science.gov (United States)

    Whitaker, James Todd

    2016-01-01

    I examined the effectiveness of self-directed learning and English learning with computer applications on college students in Bangkok, Thailand, in a control-group experimental-group pretest-posttest design. The hypothesis was tested using a t test: two-sample assuming unequal variances to establish the significance of mean scores between the two…

  2. Development of android application for computation of air pollutant ...

    African Journals Online (AJOL)

    Over the past few decades, humans have experienced a revolution in the computer sciences, not only in terms of capability but also in terms of use. Advances in smartphone technology have produced rapid yet remarkable inventions in many sectors such as construction, agriculture, education, health and many more. This paper ...

  3. Mastering Cognitive Development Theory in Computer Science Education

    Science.gov (United States)

    Gluga, Richard; Kay, Judy; Lister, Raymond; Kleitman, Simon; Kleitman, Sabina

    2013-01-01

    To design an effective computer science curriculum, educators require a systematic method of classifying the difficulty level of learning activities and assessment tasks. This is important for curriculum design and implementation and for communication between educators. Different educators must be able to use the method consistently, so that…

  4. A computer-based teaching programme (CBTP) developed for ...

    African Journals Online (AJOL)

    The nursing profession, like other professions, is focused on preparing students for practice, and particular attention must be paid to the ability of student nurses to extend their knowledge and to solve nursing care problems effectively. A computer-based teaching programme (CBTP) for clinical practice to achieve these ...

  5. Deconstructing Hub Drag. Part 2. Computational Development and Anaysis

    Science.gov (United States)

    2013-09-30

    leveraged a Vertical Lift Consortium (VLC)-funded hub drag scaling research effort. To confirm this objective, correlations are performed with the ... Technology™ Demonstrator aircraft using an unstructured computational solver. These simpler faired elliptical geometries can prove to be challenging ... possible. However, additional funding was obtained from the Vertical Lift Consortium (VLC) to perform this study. This analysis is documented in

  6. The development of a computer technique for the investigation of reactor lattice parameters

    International Nuclear Information System (INIS)

    Joubert, W.R.

    1982-01-01

    An integrated computer technique was developed whereby all the computer programmes needed to calculate reactor lattice parameters from basic neutron data could be combined in one system. The theory of the computer programmes is explained in detail. Results are given and compared with experimental values as well as with those calculated with a standard system.

  7. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  8. A computer literacy scale for newly enrolled nursing college students: development and validation.

    Science.gov (United States)

    Lin, Tung-Cheng

    2011-12-01

    The increasing application and use of information systems and mobile technologies in the healthcare industry requires increasing nurse competency in computer use. Computer literacy is defined as basic computer skills, whereas computer competency is defined as the computer skills necessary to accomplish job tasks. Inadequate attention has been paid to the validity of computer literacy and computer competency scales. This study developed a computer literacy scale with good reliability and validity and investigated the current computer literacy of newly enrolled students, in order to develop computer courses appropriate to students' skill levels and needs. The study referenced Hinkin's process to develop the scale. Participants were newly enrolled first-year undergraduate students, with nursing or nursing-related backgrounds, currently attending a course entitled Information Literacy and Internet Applications. Researchers examined reliability and validity using confirmatory factor analysis. The final version of the developed computer literacy scale included six constructs (software, hardware, multimedia, networks, information ethics, and information security) and 22 measurement items. Confirmatory factor analysis showed that the scale possessed good content validity, reliability, convergent validity, and discriminant validity. This study also found that participants earned the highest scores in the network domain and the lowest scores in the hardware domain. With the increasing use of information technology applications, courses related to hardware topics should be expanded to improve nurse problem-solving abilities. This study recommends that emphasis on word processing and network-related topics be reduced in favor of an increased emphasis on databases, statistical software, hospital information systems, and information ethics.
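
    As a small illustration of the scale-reliability analysis this record describes, the sketch below computes Cronbach's alpha over invented item scores. Note the swap: the study itself used confirmatory factor analysis; alpha is a simpler, related internal-consistency check shown here only for illustration.

        # Minimal sketch: Cronbach's alpha for a set of scale items.
        # The item-response matrix is invented; the original study used
        # confirmatory factor analysis, a more elaborate technique.
        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """items: shape (n_respondents, n_items)."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
            return (k / (k - 1)) * (1.0 - item_vars / total_var)

        rng = np.random.default_rng(7)
        ability = rng.normal(size=(200, 1))                 # latent skill
        scores = ability + 0.8 * rng.normal(size=(200, 6))  # 6 correlated items

        print(f"alpha = {cronbach_alpha(scores):.2f}")      # ~0.9 for this data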

  9. Developing Oral and Written Communication Skills in Undergraduate Computer Science and Information Systems Curriculum

    Science.gov (United States)

    Kortsarts, Yana; Fischbach, Adam; Rufinus, Jeff; Utell, Janine M.; Yoon, Suk-Chung

    2010-01-01

    Developing and applying oral and written communication skills in the undergraduate computer science and computer information systems curriculum--one of the ABET accreditation requirements - is a very challenging and, at the same time, a rewarding task that provides various opportunities to enrich the undergraduate computer science and computer…

  10. Development of Adjustable 3D computational phantoms for breast radiotherapy

    International Nuclear Information System (INIS)

    Emam, Zohal Alnour Ahmed

    2016-06-01

    Radiotherapy has become an essential part of breast cancer treatment and has received great attention during recent decades owing to its role in managing breast cancer successfully and in reducing recurrence and breast cancer mortality. Monte Carlo simulation has been used heavily for this purpose. To use Monte Carlo methods, a suitable data set must be found; this process is not straightforward, and an effort is needed to obtain it. In this work we aimed to develop a methodology for obtaining 3D adjustable computational phantoms with different breast sizes to address this problem. First, the MakeHuman software was used to generate outer-surface models with the desired anthropomorphic features. Three breast cup sizes were developed: small (A), medium (C) and large (D), according to the European dress standardization system. The Blender software was then used to join the skeleton and internal-organ outer surfaces to the body models in correct anatomical positions. The result was a poly-mesh anthropomorphic phantom with three breast sizes that is easy to position and modify. The prepared models were voxelised into 3D matrices (256*256*256) using the Binvox software, and the voxelised models were prepared in a format suitable for GATE (mhd/raw) as 70 axial slices with voxel dimensions of 1.394*1.394*5 mm³ (width, depth and length, respectively). The GATE Monte Carlo code was used to simulate irradiation of a virtual tumour bed site in the left breast with a direct-field electron beam; each breast size was treated with five energies (6, 9, 12, 15 and 18 MeV), a 5*5 cm² field size and 100 cm source-surface distance (SSD). The results were studied to evaluate the effect of breast size variation on dose distribution. According to the criteria of tumour bed coverage by 100%-90% of the normalised maximum dose and minimum dose to the heart and lung, which are the organs at risk, the results show that the 6 MeV energy undercovers the tumour bed in the small, medium

  11. Shlaer-Mellor object-oriented analysis and recursive design, an effective modern software development method for development of computing systems for a large physics detector

    International Nuclear Information System (INIS)

    Kozlowski, T.; Carey, T.A.; Maguire, C.F.

    1995-01-01

    After evaluation of several modern object-oriented methods for development of the computing systems for the PHENIX detector at RHIC, we selected the Shlaer-Mellor Object-Oriented Analysis and Recursive Design method as the most appropriate for the needs and development environment of a large nuclear or high energy physics detector. This paper discusses our specific needs and environment, our method selection criteria, and major features and components of the Shlaer-Mellor method

  12. Program Criteria Specifications Document. Computer Program TWDA for Design and Analysis of Inverted-T Retaining Walls and Floodwalls.

    Science.gov (United States)

    1981-02-01

    or analysis modules, each performing one specific step in the design or analysis process. These modules will be callable, in any logical sequence ... attempt to place and cut off a bar, but will show the required steel area and bond requirements per foot at suitable intervals across the base ... bond strength) shall be as required in ACI 318-71 Chapter 12, except that computed shear V shall be multiplied by 2.0 and substituted for Vu. ...

  13. Development of system of computer codes for severe accident analysis and its applications

    Energy Technology Data Exchange (ETDEWEB)

    Jang, H S; Jeon, M H; Cho, N J. and others [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1992-01-15

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. This system of codes is necessary to conduct Individual Plant Examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, extract plant-specific vulnerabilities to severe accidents and, at the same time, develop ideas for enhancing overall accident resistance. Severe accidents can be mitigated by proper accident management strategies. Some operator actions intended for mitigation can lead to more disastrous results, and thus uncertain severe accident phenomena must be well recognized. Further research is needed for the development of severe accident management strategies utilizing existing plant resources as well as new design concepts.

  14. Development of system of computer codes for severe accident analysis and its applications

    International Nuclear Information System (INIS)

    Jang, H. S.; Jeon, M. H.; Cho, N. J. and others

    1992-01-01

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. This system of codes is necessary to conduct Individual Plant Examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, extract plant-specific vulnerabilities to severe accidents and, at the same time, develop ideas for enhancing overall accident resistance. Severe accidents can be mitigated by proper accident management strategies. Some operator actions intended for mitigation can lead to more disastrous results, and thus uncertain severe accident phenomena must be well recognized. Further research is needed for the development of severe accident management strategies utilizing existing plant resources as well as new design concepts.

  15. Software development for specific geometry and safe design of isotropic material multicell beams

    International Nuclear Information System (INIS)

    Tariq, M.M.; Ahmed, M.A.

    2011-01-01

    The main idea of this paper is the comparison of analytical results with finite element results for the analysis of isotropic-material multicell beams subjected to free torsion. Progress in the fundamentals and applications of advanced materials and their processing technologies involves costly experiments and prototype testing for reliability, whereas software development for the design analysis of structures with advanced materials is low-cost yet challenging research. Multicell beams have important industrial applications in the aerospace and automotive sectors. This paper explains the development of software to test different materials in the design of a multicell beam. The objective is to compute the torsional loading of multicell beams of isotropic materials for safe design in both symmetrical and asymmetrical geometries. The software has been developed in Microsoft Visual Basic. It calculates the distribution of Saint-Venant shear flows, shear stresses, factors of safety, volume, mass, weight, twist, polar moment of inertia and aspect ratio for free torsion in a multicell beam. The software works with four algorithms: a specific geometry algorithm, a material selection algorithm, a factor of safety algorithm and a global algorithm. The user can specify new materials analytically or choose a pre-defined material from a list that includes plain carbon steels, low alloy steels, stainless steels, cast irons, aluminum alloys, copper alloys, magnesium alloys, titanium alloys, precious metals and refractory metals. Although this software is restricted to multicell beams comprising three cells, future versions could address more complicated shapes and cases of multicell beams. The software also describes the nomenclature and mathematical formulas applied, to help the user understand the theoretical background. The user can specify the geometry of a multicell beam with three rectangular cells; the software computes shear flows, shear stresses, safety factors
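
    For readers unfamiliar with the Saint-Venant shear-flow computation this record refers to, the sketch below solves the classic thin-walled multicell torsion problem (Bredt-Batho formulation) for a hypothetical three-cell section. The geometry numbers are invented, and the original program was written in Visual Basic, so this Python version only illustrates the underlying linear system.

        # Minimal sketch: shear flows in a thin-walled 3-cell beam under
        # free torsion (Bredt-Batho). Cell areas, wall integrals of ds/t,
        # shear modulus and torque are invented example values.
        import numpy as np

        A = np.array([4.0e-3, 6.0e-3, 4.0e-3])   # enclosed cell areas [m^2]
        G = 26.0e9                                # shear modulus [Pa]
        T = 2.0e3                                 # applied torque [N*m]

        # delta[i][j]: integral of ds/t over the wall shared by cells i, j
        # (diagonal entries: full perimeter integral around cell i)
        delta = np.array([
            [800.0, 200.0,   0.0],
            [200.0, 900.0, 250.0],
            [  0.0, 250.0, 850.0],
        ])

        # Unknowns x = [q1, q2, q3, theta']: shear flows and twist rate
        n = len(A)
        M = np.zeros((n + 1, n + 1))
        b = np.zeros(n + 1)
        for i in range(n):
            M[i, i] = delta[i, i]
            for j in range(n):
                if j != i:
                    M[i, j] = -delta[i, j]   # shared-wall coupling
            M[i, n] = -2.0 * A[i] * G        # compatibility: equal twist rate
        M[n, :n] = 2.0 * A                   # torque equilibrium: sum 2*A_i*q_i = T
        b[n] = T

        q1, q2, q3, twist_rate = np.linalg.solve(M, b)
        print(f"shear flows [N/m]: {q1:.1f}, {q2:.1f}, {q3:.1f}")
        print(f"twist rate: {twist_rate:.3e} rad/m")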

  16. The development and application of a coincidence measurement apparatus with micro-computer system

    International Nuclear Information System (INIS)

    Du Hongshan; Zhou Youpu; Gao Junlin; Qin Deming; Cao Yunzheng; Zhao Shiping

    1987-01-01

    A coincidence measurement apparatus with a micro-computer system is developed. Automatic data acquisition and processing are achieved. Results of its application to radioactivity measurements are satisfactory.

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  18. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  19. Rapid mental computation system as a tool for algorithmic thinking of elementary school students development

    OpenAIRE

    Ziatdinov, Rushan; Musa, Sajid

    2013-01-01

    In this paper, we describe the possibilities of using a rapid mental computation system in elementary education. The system consists of a number of readily memorized operations that allow one to perform arithmetic computations very quickly. These operations are actually simple algorithms which can develop or improve the algorithmic thinking of pupils. Using a rapid mental computation system helps form the basis for the study of computer science in secondary school.
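
    As a concrete illustration of the kind of readily memorized operations such systems contain, the sketch below implements two well-known mental-arithmetic shortcuts. The choice of these two particular tricks is ours for illustration, not necessarily part of the specific system described in the record.

        # Two classic rapid mental computation algorithms, written out so
        # the underlying steps are explicit. Illustrative picks, not
        # necessarily the operations of the system described above.

        def square_ending_in_5(n: int) -> int:
            """Square n = 10a + 5 mentally: write a*(a+1), then append 25."""
            assert n % 10 == 5
            a = n // 10
            return a * (a + 1) * 100 + 25      # e.g. 35**2: 3*4 = 12 -> 1225

        def multiply_by_11(n: int) -> int:
            """Multiply by 11 by summing neighbouring digits (with carries)."""
            digits = [int(d) for d in str(n)]
            sums = ([digits[0]]
                    + [digits[i] + digits[i + 1] for i in range(len(digits) - 1)]
                    + [digits[-1]])
            out, carry = [], 0
            for s in reversed(sums):           # resolve carries right to left
                s += carry
                out.append(s % 10)
                carry = s // 10
            if carry:
                out.append(carry)
            return int("".join(str(d) for d in reversed(out)))

        assert square_ending_in_5(35) == 35 * 35
        assert multiply_by_11(4723) == 4723 * 11
        print(square_ending_in_5(85), multiply_by_11(87))   # 7225 957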

  20. Complex Osteotomies of Tibial Plateau Malunions Using Computer-Assisted Planning and Patient-Specific Surgical Guides.

    Science.gov (United States)

    Fürnstahl, Philipp; Vlachopoulos, Lazaros; Schweizer, Andreas; Fucentese, Sandro F; Koch, Peter P

    2015-08-01

    The accurate reduction of tibial plateau malunions can be challenging without guidance. In this work, we report on a novel technique that combines 3-dimensional computer-assisted planning with patient-specific surgical guides for improving reliability and accuracy of complex intraarticular corrective osteotomies. Preoperative planning based on 3-dimensional bone models was performed to simulate fragment mobilization and reduction in 3 cases. Surgical implementation of the preoperative plan using patient-specific cutting and reduction guides was evaluated; benefits and limitations of the approach were identified and discussed. The preliminary results are encouraging and show that complex, intraarticular corrective osteotomies can be accurately performed with this technique. For selective patients with complex malunions around the tibia plateau, this method might be an attractive option, with the potential to facilitate achieving the most accurate correction possible.

  1. Development of x-ray computed tomographic scanner for iron and steel

    International Nuclear Information System (INIS)

    Taguchi, Isamu; Nakamura, Shigeo.

    1985-01-01

    X-ray computed tomography is extensively used in medicine, but has rarely been applied to non-medical purposes. Steel specimens pose particularly difficult problems-very poor transmission of X-rays and the need for high resolving capability. There has thus been no effective tomographic method of examining steel specimens. Due to the growing need for non-destructive, non-contact methods for observing and analyzing the internal conditions of steel microscopically, however, we have developed an X-ray Computed Tomographic Scanner for Steel (CTS) system, specifically for examination of steel specimens. Its major specifications and functions are as follows. Type: the second-generation CT, 8-channels, Scanning method: 6 0 revolution, 30-times traversing, Slice width: 0.5 mm, Resolving capability: 0.25 x 0.25 mm, X-ray source: 420 kV, 3 mA, X-ray detector: BGO scintillator, Standard specimen shape: 50 mm dia., 100 mm high, Measuring time: 10.5 min. Porosity of a stainless steel (SUS 304) bloom was examined three-dimensionally by the CTS system. Corrosion procedure of a steel slab was also examined. (author)

  2. Affordances of the 'branch and bound' paradigm for developing computational thinking

    NARCIS (Netherlands)

    van der Meulen, Joris; Timmer, Mark

    As technological advances in engineering and computer science happen more and more quickly, we must shift focus from teaching specific techniques or programming languages to teaching something more transcending: computational thinking (Wing, 2006). Wing later explained this concept as "the thought
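
    To make the paradigm in this record concrete, the sketch below is a minimal branch-and-bound solver for the 0/1 knapsack problem, a standard textbook illustration of the technique; the instance data are invented and the example is ours, not one taken from the paper.

        # Minimal branch-and-bound for 0/1 knapsack: explore include/exclude
        # branches, prune when an optimistic (fractional) bound cannot beat
        # the best complete solution found so far. Instance data invented.
        values  = [60, 100, 120, 40]
        weights = [10, 20, 30, 15]
        capacity = 50

        # Sort items by value density so the fractional bound is tight.
        order = sorted(range(len(values)), key=lambda i: -values[i] / weights[i])
        v = [values[i] for i in order]
        w = [weights[i] for i in order]

        best = 0

        def bound(i: int, cap: int) -> float:
            """Optimistic value: greedily fill remaining capacity, allowing
            a fractional piece of the first item that does not fit."""
            total = 0.0
            while i < len(v) and w[i] <= cap:
                total += v[i]; cap -= w[i]; i += 1
            if i < len(v):
                total += v[i] * cap / w[i]
            return total

        def branch(i: int, cap: int, value: int) -> None:
            global best
            if value > best:
                best = value
            if i == len(v) or value + bound(i, cap) <= best:
                return                          # prune: cannot improve best
            if w[i] <= cap:                     # branch 1: take item i
                branch(i + 1, cap - w[i], value + v[i])
            branch(i + 1, cap, value)           # branch 2: skip item i

        branch(0, capacity, 0)
        print("best value:", best)              # 220 for this instance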

  3. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
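
    As a toy illustration of the probabilistic-reliability idea behind codes like NESSUS, the snippet below estimates a probability of failure by plain Monte Carlo with an invented limit-state function. Note this is a generic sketch, not NESSUS's advanced mean value or adaptive importance sampling algorithms.

        # Generic Monte Carlo estimate of a failure probability P[g(X) < 0].
        # Limit-state function and distributions are invented; NESSUS uses
        # more efficient methods (advanced mean value, importance sampling).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000

        # Random inputs: material strength R and load effect S (invented)
        R = rng.normal(loc=300.0, scale=30.0, size=n)    # strength [MPa]
        S = rng.lognormal(mean=5.0, sigma=0.25, size=n)  # stress   [MPa]

        g = R - S                        # limit state: failure when g < 0
        pf = np.mean(g < 0.0)
        se = np.sqrt(pf * (1.0 - pf) / n)
        print(f"P_f ~= {pf:.2e} (std. err. {se:.1e})")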

  4. Linear systems solvers - recent developments and implications for lattice computations

    International Nuclear Information System (INIS)

    Frommer, A.

    1996-01-01

    We review the numerical-analysis understanding of Krylov subspace methods for solving (non-Hermitian) systems of equations and discuss its implications for lattice gauge theory computations, using the example of the Wilson fermion matrix. Our thesis is that mature methods like QMR, BiCGStab or restarted GMRES are close to optimal for the Wilson fermion matrix. Consequently, preconditioning appears to be the crucial issue for further improvements. (orig.)
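
    For illustration of the methods named here, the sketch below applies BiCGStab to a small random non-symmetric sparse system via SciPy. The matrix is a generic diagonally dominant stand-in, assumed for demonstration; it is not a Wilson fermion matrix, and no preconditioning is applied.

        # Solve a non-symmetric sparse system with BiCGStab, one of the
        # Krylov methods discussed above. The matrix is a generic
        # diagonally dominant random stand-in, not a Wilson fermion matrix.
        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import bicgstab

        rng = np.random.default_rng(1)
        n = 2000

        # Random sparse non-symmetric matrix, made diagonally dominant so
        # the iteration converges even without preconditioning.
        A = sp.random(n, n, density=5.0 / n, random_state=1, format="csr")
        row_sums = np.asarray(abs(A).sum(axis=1)).ravel()
        A = A + sp.diags(row_sums + 1.0)

        b = rng.normal(size=n)
        x, info = bicgstab(A, b)

        print("converged" if info == 0 else f"info = {info}",
              "| residual norm:", np.linalg.norm(A @ x - b))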

  5. Development of a Computational Assay for the Estrogen Receptor

    Science.gov (United States)

    2006-07-01

    "Research in Undergraduate Computational Chemistry across New York and New England" at the Executive Summit, Delivering Technology Leadership for Life Sciences: Research Advances for Drug Discovery and Bioterrorism. A Life Sciences Executive Summit co-sponsored by SGI, the Delaware Biotechnology ... Bharath, J.; Jain, S.; Pham, H. B.; Boonyasiriwat, C.; Nguyen, N.; Andersen, E.; Kim, Y.; Choc, S.; Choi, J.; Cheatham, T. E.; Facelli, J. C. J. Chemical

  6. Computer Instruction in Handwriting, Spelling, and Composing for Students with Specific Learning Disabilities in Grades 4 to 9.

    Science.gov (United States)

    Berninger, Virginia W; Nagy, William; Tanimoto, Steve; Thompson, Rob; Abbott, Robert D

    2015-02-01

    Effectiveness of iPad computerized writing instruction was evaluated for 4th to 9th graders (n = 35) with diagnosed specific learning disabilities (SLDs) affecting writing: dysgraphia (impaired handwriting), dyslexia (impaired spelling), and oral and written language learning disability (OWL LD) (impaired syntax composing). Each of the 18 two-hour lessons had multiple learning activities aimed at improving subword- (handwriting), word- (spelling), and syntax- (sentence composing) level language skills by engaging all four language systems (listening, speaking, reading, and writing) to create a functional writing system. To evaluate treatment effectiveness, normed measures of handwriting, spelling, and composing were used with the exception of one non-normed alphabet writing task. Results showed that the sample as a whole improved significantly from pretest to posttest in three handwriting measures, four spelling measures, and both written and oral syntax construction measures. All but oral syntax was evaluated with pen and paper tasks, showing that the computer writing instruction transferred to better writing with pen and paper. Performance on learning activities during instruction correlated with writing outcomes; and individual students tended to improve in the impaired skill associated with their diagnosis. Thus, although computers are often used in upper elementary school and middle school in the United States (US) for accommodations (alternatives to pen and paper) for students with persisting SLDs affecting writing, this study shows computers can also be used for Tier 3 instruction to improve the writing skills of students in grades 4 to 9 with history of persisting writing disabilities.

  7. Computer Instruction in Handwriting, Spelling, and Composing for Students with Specific Learning Disabilities in Grades 4 to 9

    Science.gov (United States)

    Berninger, Virginia W.; Nagy, William; Tanimoto, Steve; Thompson, Rob; Abbott, Robert D.

    2014-01-01

    Effectiveness of iPad computerized writing instruction was evaluated for 4th to 9th graders (n=35) with diagnosed specific learning disabilities (SLDs) affecting writing: dysgraphia (impaired handwriting), dyslexia (impaired spelling), and oral and written language learning disability (OWL LD) (impaired syntax composing). Each of the 18 two-hour lessons had multiple learning activities aimed at improving subword- (handwriting), word- (spelling), and syntax- (sentence composing) level language skills by engaging all four language systems (listening, speaking, reading, and writing) to create a functional writing system. To evaluate treatment effectiveness, normed measures of handwriting, spelling, and composing were used with the exception of one non-normed alphabet writing task. Results showed that the sample as a whole improved significantly from pretest to posttest in three handwriting measures, four spelling measures, and both written and oral syntax construction measures. All but oral syntax was evaluated with pen and paper tasks, showing that the computer writing instruction transferred to better writing with pen and paper. Performance on learning activities during instruction correlated with writing outcomes; and individual students tended to improve in the impaired skill associated with their diagnosis. Thus, although computers are often used in upper elementary school and middle school in the United States (US) for accommodations (alternatives to pen and paper) for students with persisting SLDs affecting writing, this study shows computers can also be used for Tier 3 instruction to improve the writing skills of students in grades 4 to 9 with history of persisting writing disabilities. PMID:25378768

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  9. Flow stagnation volume and abdominal aortic aneurysm growth: Insights from patient-specific computational flow dynamics of Lagrangian-coherent structures.

    Science.gov (United States)

    Joly, Florian; Soulez, Gilles; Garcia, Damien; Lessard, Simon; Kauffmann, Claude

    2018-01-01

    Abdominal aortic aneurysms (AAA) are localized, commonly-occurring dilations of the aorta. When equilibrium between blood pressure (loading) and wall mechanical resistance is lost, rupture ensues, and patient death follows, if not treated immediately. Experimental and numerical analyses of flow patterns in arteries show direct correlations between wall shear stress and wall mechano-adaptation with the development of zones prone to thrombus formation. For further insights into AAA flow topology/growth interaction, a patient-specific computational flow dynamics (CFD) workflow is proposed to compute finite-time Lyapunov exponents and extract Lagrangian-coherent structures (LCS). This computational model was first compared with 4-D phase-contrast magnetic resonance imaging (MRI) in 5 patients. To better understand the impact of flow topology and transport on AAA growth, hyperbolic, repelling LCS were computed in 1 patient during 8-year follow-up, including 9 volumetric morphologic AAA measures by computed tomography-angiography (CTA). LCS defined barriers to Lagrangian jet cores entering AAA. Domains enclosed between LCS and the aortic wall were considered to be stagnation zones. Their evolution was studied during AAA growth. Good correlation - 2-D cross-correlation coefficients of 0.65, 0.86 and 0.082 (min, max, SD) - was obtained between numerical simulations and 4-D MRI acquisitions in 6 specific cross-sections from 4 patients. In follow-up study, LCS divided AAA lumens into 3 dynamically-isolated zones: 2 stagnation volumes lying in dilated portions of the AAA, and circulating volume connecting the inlet to the outlet. The volume of each zone was tracked over time. Although circulating volume remained unchanged during 8-year follow-up, the AAA lumen and main stagnation zones grew significantly (8 cm 3 /year and 6 cm 3 /year, respectively). This study reveals that transient transport topology can be quantified in patient-specific AAA during disease progression
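
    To make the method above concrete, the following minimal sketch (Python/NumPy; all names are ours, not the authors') computes a finite-time Lyapunov exponent field from a precomputed flow map on a regular 2-D grid. Ridges of this field mark the hyperbolic, repelling LCS used to delimit the stagnation zones:

      import numpy as np

      def ftle(fmap_x, fmap_y, dx, dy, T):
          # fmap_x, fmap_y: final particle positions after advecting seeds
          # placed on a regular grid (spacing dx along axis 0, dy along
          # axis 1) for an integration time T.
          dFxd0, dFxd1 = np.gradient(fmap_x, dx, dy)
          dFyd0, dFyd1 = np.gradient(fmap_y, dx, dy)
          sigma = np.zeros_like(fmap_x)
          for i in np.ndindex(fmap_x.shape):
              F = np.array([[dFxd0[i], dFxd1[i]],
                            [dFyd0[i], dFyd1[i]]])   # flow-map gradient
              C = F.T @ F                             # Cauchy-Green tensor
              sigma[i] = np.log(np.linalg.eigvalsh(C)[-1]) / (2.0 * abs(T))
          return sigma

    Repelling LCS follow from forward-time integration and attracting LCS from backward-time integration; the patient-specific CFD velocity fields supply the flow map itself.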

  10. Patient specific anatomy: the new area of anatomy based on computer science illustrated on liver.

    Science.gov (United States)

    Soler, Luc; Mutter, Didier; Pessaux, Patrick; Marescaux, Jacques

    2015-01-01

    Over the past century, medical imaging has brought a new revolution: the internal anatomy of a patient can be seen without any invasive technique. This revolution has highlighted the two main limits of current anatomy: the anatomical description is physician dependent, and the average anatomy is more and more frequently insufficient to describe anatomical variations. These drawbacks can sometimes be so important that they create mistakes, but they can be overcome through the use of 3D patient-specific surgical anatomy. In this article, we propose to illustrate such an improvement of standard anatomy on the liver. We first propose a general scheme allowing easy comparison of the four main liver anatomical descriptions by Takasaki, Goldsmith and Woodburne, Bismuth, and Couinaud. From this general scheme we propose four rules to apply in order to correct these initial anatomical definitions. Application of these rules corrects the usual vascular topological mistakes of standard anatomy. We finally validate this correction on a database of 20 clinical cases compared to the 111 clinical cases of a Couinaud article. Out of the 20 images of the database, we note a revealing difference in 14 cases (70%) on at least one important branch of the portal network. Only six cases (30%) do not present a revealing difference between the two labellings. We also show that the right portal fissure location in our 20 cases, defined between segments V and VI of our anatomical definition, is well correlated with the real position described by Couinaud on 111 cases, knowing that the theoretical position was found in only 46 cases out of 111, i.e., 41.44% of cases with the non-corrected Couinaud definition. We have proposed a new anatomical segmentation of the liver based on four main rules to apply in order to correct topological errors of the four main standard segmentations. Our validation clearly illustrates that this new definition corrects the large amount of mistakes created by the current

  11. Computational Biology Tools for Identifying Specific Ligand Binding Residues for Novel Agrochemical and Drug Design.

    Science.gov (United States)

    Neshich, Izabella Agostinho Pena; Nishimura, Leticia; de Moraes, Fabio Rogerio; Salim, Jose Augusto; Villalta-Romero, Fabian; Borro, Luiz; Yano, Inacio Henrique; Mazoni, Ivan; Tasic, Ljubica; Jardine, Jose Gilberto; Neshich, Goran

    2015-01-01

    The term "agrochemicals" is used in its generic form to represent a spectrum of pesticides, such as insecticides, fungicides or bactericides. They contain active components designed for optimized pest management and control, therefore allowing for economically sound and labor efficient agricultural production. A "drug" on the other side is a term that is used for compounds designed for controlling human diseases. Although drugs are subjected to much more severe testing and regulation procedures before reaching the market, they might contain exactly the same active ingredient as certain agrochemicals, what is the case described in present work, showing how a small chemical compound might be used to control pathogenicity of Gram negative bacteria Xylella fastidiosa which devastates citrus plantations, as well as for control of, for example, meningitis in humans. It is also clear that so far the production of new agrochemicals is not benefiting as much from the in silico new chemical compound identification/discovery as pharmaceutical production. Rational drug design crucially depends on detailed knowledge of structural information about the receptor (target protein) and the ligand (drug/agrochemical). The interaction between the two molecules is the subject of analysis that aims to understand relationship between structure and function, mainly deciphering some fundamental elements of the nanoenvironment where the interaction occurs. In this work we will emphasize the role of understanding nanoenvironmental factors that guide recognition and interaction of target protein and its function modifier, an agrochemical or a drug. The repertoire of nanoenvironment descriptors is used for two selected and specific cases we have approached in order to offer a technological solution for some very important problems that needs special attention in agriculture: elimination of pathogenicity of a bacterium which is attacking citrus plants and formulation of a new fungicide. Finally

  12. Development of Fuzzy Logic and Soft Computing Methodologies

    Science.gov (United States)

    Zadeh, L. A.; Yager, R.

    1999-01-01

    Our earlier research on computing with words (CW) has led to a new direction in fuzzy logic which points to a major enlargement of the role of natural languages in information processing, decision analysis and control. This direction is based on the methodology of computing with words and embodies a new theory which is referred to as the computational theory of perceptions (CTP). An important feature of this theory is that it can be added to any existing theory - especially to probability theory, decision analysis, and control - and enhance the ability of the theory to deal with real-world problems in which the decision-relevant information is a mixture of measurements and perceptions. The new direction is centered on an old concept - the concept of a perception - a concept which plays a central role in human cognition. The ability to reason with perceptions - perceptions of time, distance, force, direction, shape, intent, likelihood, truth and other attributes of physical and mental objects - underlies the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Everyday examples of such tasks are parking a car, driving in city traffic, cooking a meal, playing golf and summarizing a story. Perceptions are intrinsically imprecise. Imprecision of perceptions reflects the finite ability of sensory organs and, ultimately, the brain to resolve detail and store information. More concretely, perceptions are both fuzzy and granular, or, for short, f-granular. Perceptions are f-granular in the sense that: (a) the boundaries of perceived classes are not sharply defined; and (b) the elements of classes are grouped into granules, with a granule being a clump of elements drawn together by indistinguishability, similarity, proximity or functionality. F-granularity of perceptions may be viewed as a human way of achieving data compression. In large measure, scientific progress has been, and continues to be

  13. Methodologies for Development of Patient Specific Bone Models from Human Body CT Scans

    Science.gov (United States)

    Chougule, Vikas Narayan; Mulay, Arati Vinayak; Ahuja, Bharatkumar Bhagatraj

    2016-06-01

    This work deals with the development of an algorithm for physical replication of patient-specific human bone and construction of corresponding implant/insert RP models, using a reverse engineering approach from non-invasive medical images, for surgical purposes. In the medical field, volumetric data, i.e. voxel- and triangular-facet-based models, are primarily used for bio-modelling and visualization, which requires huge memory space. On the other hand, recent advances in Computer Aided Design (CAD) technology provide additional facilities/functions for design, prototyping and manufacturing of any object having freeform surfaces, based on boundary representation techniques. This work presents a process for physical replication of 3D rapid prototyping (RP) models of human bone using various CAD modeling techniques developed from 3D point cloud data obtained from non-invasive CT/MRI scans in DICOM 3.0 format. The point cloud data are used for construction of a 3D CAD model by fitting B-spline curves through the points and then fitting a surface between these curve networks using swept blend techniques. The same result can also be achieved by generating a triangular mesh directly from the 3D point cloud data, without developing any surface model in commercial CAD software. The STL file generated from the 3D point cloud data is used as the basic input for the RP process. The Delaunay tetrahedralization approach is used to process the 3D point cloud data to obtain the STL file. CT scan data of a metacarpus (human bone) is used as the case study for the generation of the 3D RP model. A 3D physical model of the human bone is generated on a rapid prototyping machine and its virtual reality model is presented for visualization. The CAD models generated by the different techniques are compared for accuracy and reliability. The results of this research work are assessed for clinical reliability in replication of human bone in the medical field.
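
    As a rough illustration of the Delaunay-to-STL step described above, the following simplified sketch (Python/SciPy; the function name is ours) tetrahedralizes a point cloud and writes the boundary triangles to an ASCII STL file. A real bone pipeline would need alpha shapes or another surface-reconstruction step to capture concave anatomy, since the raw Delaunay boundary is convex:

      import numpy as np
      from collections import Counter
      from scipy.spatial import Delaunay

      def point_cloud_to_stl(points, path):
          points = np.asarray(points, float)
          tets = Delaunay(points).simplices        # tetrahedralization
          faces = Counter()
          for s in tets:                           # 4 triangular faces per tet
              for f in ([0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]):
                  faces[tuple(sorted(s[f]))] += 1
          boundary = [f for f, n in faces.items() if n == 1]  # faces used once
          with open(path, "w") as out:
              out.write("solid bone\n")
              for a, b, c in boundary:
                  p, q, r = points[a], points[b], points[c]
                  n = np.cross(q - p, r - p)
                  n = n / (np.linalg.norm(n) or 1.0)
                  out.write(f"  facet normal {n[0]} {n[1]} {n[2]}\n")
                  out.write("    outer loop\n")
                  for v in (p, q, r):
                      out.write(f"      vertex {v[0]} {v[1]} {v[2]}\n")
                  out.write("    endloop\n  endfacet\n")
              out.write("endsolid bone\n")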

  14. Patient-specific reconstruction plates are the missing link in computer-assisted mandibular reconstruction: A showcase for technical description.

    Science.gov (United States)

    Cornelius, Carl-Peter; Smolka, Wenko; Giessler, Goetz A; Wilde, Frank; Probst, Florian A

    2015-06-01

    Preoperative planning of mandibular reconstruction has moved from mechanical simulation by dental model casts or stereolithographic models into an almost completely virtual environment. CAD/CAM applications allow a high level of accuracy by providing a custom template-assisted contouring approach for bone flaps. However, the clinical accuracy of CAD reconstruction is limited by the use of prebent reconstruction plates, an analogue step in an otherwise digital workstream. In this paper the integration of computerized numerically controlled (CNC) milled, patient-specific mandibular plates (PSMP) within the virtual workflow of computer-assisted mandibular free fibula flap reconstruction is illustrated in a clinical case. Intraoperatively, the bone segments as well as the plate arms showed a very good fit. Postoperative CT imaging demonstrated close approximation of the PSMP and fibular segments, and good alignment of the native mandible and fibular segments, both to each other and intersegmentally. Over a follow-up period of 12 months, there was an uneventful course of healing with good bony consolidation. The virtual design and automated fabrication of patient-specific mandibular reconstruction plates provide the missing link in the virtual workflow of computer-assisted mandibular free fibula flap reconstruction. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  15. Evaluation of the impact of organ-specific dose reduction on image quality in pediatric chest computed tomography

    International Nuclear Information System (INIS)

    Boos, Johannes; Kroepil, Patric; Klee, Dirk; Heusch, Philipp; Schimmoeller, Lars; Schaper, Joerg; Antoch, Gerald; Lanzman, Rotem S.

    2014-01-01

    Organ-specific dose reduction significantly reduces the radiation exposure of radiosensitive organs. The purpose of this study was to assess the impact of a novel organ-specific dose reduction algorithm on image quality of pediatric chest CT. We included 28 children (mean age 10.9 ± 4.8 years, range 3-18 years) who had contrast-enhanced chest CT on a 128-row scanner. CT was performed at 100 kV using automated tube current modulation and a novel organ-specific dose-reduction algorithm (XCare™; Siemens, Forchheim, Germany). Seven children had a previous chest CT performed on a 64-row scanner at 100 kV without organ-specific dose reduction. Subjective image quality was assessed using a five-point scale (1 - not diagnostic; 5 - excellent). Contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) were assessed in the descending aorta. Overall mean subjective image quality was 4.1 ± 0.6. In the subgroup of the seven children examined both with and without organ-specific dose reduction, subjective image quality was comparable (score 4.4 ± 0.5 with organ-specific dose reduction vs. 4.4 ± 0.7 without it; P > 0.05). There was no significant difference in mean signal-to-noise ratio and contrast-to-noise ratio with organ-specific dose reduction (38.3 ± 10.1 and 28.5 ± 8.7, respectively) and without the reduction (35.5 ± 8.5 and 26.5 ± 7.8, respectively) (P > 0.05). The volume computed tomography dose index (CTDIvol) and size-specific dose estimates did not differ significantly between acquisitions with the organ-specific dose reduction (1.7 ± 0.8 mGy) and without the reduction (1.7 ± 0.8 mGy) (P > 0.05). Organ-specific dose reduction does not have an impact on image quality of pediatric chest CT and can therefore be used in clinical practice to reduce the radiation dose of radiosensitive organs such as the breast and thyroid gland. (orig.)
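
    For reference, the two image-quality metrics reported above reduce to a few lines once regions of interest have been drawn (a sketch under common conventions; the paper's exact ROI definitions may differ):

      import numpy as np

      def snr_cnr(roi_hu, background_hu):
          # roi_hu: HU values inside the ROI (here, the descending aorta);
          # background_hu: HU values in a reference region. Uses the common
          # SNR = mean/SD and CNR = (mean_roi - mean_bg)/SD_bg definitions.
          roi = np.asarray(roi_hu, float)
          bg = np.asarray(background_hu, float)
          snr = roi.mean() / roi.std()
          cnr = (roi.mean() - bg.mean()) / bg.std()
          return snr, cnr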

  16. Development of a patient-specific two-compartment anthropomorphic breast phantom

    International Nuclear Information System (INIS)

    Prionas, Nicolas D; Burkett, George W; McKenney, Sarah E; Chen, Lin; Boone, John M; Stern, Robin L

    2012-01-01

    The purpose of this paper is to develop a technique for the construction of a two-compartment anthropomorphic breast phantom specific to an individual patient's pendant breast anatomy. Three-dimensional breast images were acquired on a prototype dedicated breast computed tomography (bCT) scanner as part of an ongoing IRB-approved clinical trial of bCT. The images from the breast of a patient were segmented into adipose and glandular tissue regions and divided into 1.59 mm thick breast sections to correspond to the thickness of polyethylene stock. A computer-controlled water-jet cutting machine was used to cut the outer breast edge and the internal regions corresponding to glandular tissue from the polyethylene. The stack of polyethylene breast segments was encased in a thermoplastic ‘skin’ and filled with water. Water-filled spaces modeled glandular tissue structures and the surrounding polyethylene modeled the adipose tissue compartment. Utility of the phantom was demonstrated by inserting 200 µm microcalcifications as well as by measuring point dose deposition during bCT scanning. Affine registration of the original patient images with bCT images of the phantom showed similar tissue distribution. Linear profiles through the registered images demonstrated a mean coefficient of determination (r²) between grayscale profiles of 0.881. The exponent of the power law describing the anatomical noise power spectrum was identical in the coronal images of the patient's breast and the phantom. Microcalcifications were visualized in the phantom at bCT scanning. The real-time air kerma rate was measured during bCT scanning and fluctuated with breast anatomy. On average, point dose deposition was 7.1% greater than the mean glandular dose. A technique to generate a two-compartment anthropomorphic breast phantom from bCT images has been demonstrated. The phantom is the first, to our knowledge, to accurately model the uncompressed pendant breast and the glandular tissue
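
    Two of the comparison metrics above are easy to reproduce (an illustrative Python sketch; variable names and the exact NPS estimation procedure are our assumptions):

      import numpy as np

      def profile_r2(patient_profile, phantom_profile):
          # Coefficient of determination between grayscale line profiles
          # sampled at matching locations in the registered volumes
          # (the paper reports a mean r² of 0.881).
          r = np.corrcoef(patient_profile, phantom_profile)[0, 1]
          return r ** 2

      def nps_exponent(freqs, nps):
          # Exponent beta of a power-law anatomical noise power spectrum,
          # NPS(f) ~ f**(-beta), from a log-log least-squares fit.
          return -np.polyfit(np.log(freqs), np.log(nps), 1)[0]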

  17. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    Science.gov (United States)

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications such as m-learning systems. This study presents an innovative method to use web technology and software engineering's best practices to provide m-learning functionalities hosted in a MCC-learning system as service. Components hosted by MCC are used to empower developers to create…

  18. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-01-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow of preparing an application's input, running a simulation, and visualizing the simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time.

  19. Exploring the Effects of Web-Mediated Computational Thinking on Developing Students' Computing Skills in a Ubiquitous Learning Environment

    Science.gov (United States)

    Tsai, Chia-Wen; Shen, Pei-Di; Tsai, Meng-Chuan; Chen, Wen-Yu

    2017-01-01

    Much application software education in Taiwan can hardly be regarded as practical. The researchers in this study provided a flexible means of ubiquitous learning (u-learning) with a mobile app for students to access the learning material. In addition, the authors also adopted computational thinking (CT) to help students develop practical computing…

  20. Patient-specific surgical planning and hemodynamic computational fluid dynamics optimization through free-form haptic anatomy editing tool (SURGEM).

    Science.gov (United States)

    Pekkan, Kerem; Whited, Brian; Kanter, Kirk; Sharma, Shiva; de Zelicourt, Diane; Sundareswaran, Kartik; Frakes, David; Rossignac, Jarek; Yoganathan, Ajit P

    2008-11-01

    The first version of an anatomy editing/surgical planning tool (SURGEM) targeting anatomical complexity and patient-specific computational fluid dynamics (CFD) analysis is presented. Novel three-dimensional (3D) shape editing concepts and human-shape interaction technologies have been integrated to facilitate interactive surgical morphology alterations, grid generation and CFD analysis. In order to implement "manual hemodynamic optimization" at the surgery planning phase for patients with congenital heart defects, these tools are applied to design and evaluate possible modifications of patient-specific anatomies. In this context, anatomies involve complex geometric topologies and tortuous 3D blood flow pathways with multiple inlets and outlets. These tools make it possible to freely deform the lumen surface and to bend and position baffles through real-time, direct manipulation of the 3D models with both hands, thus eliminating the tedious and time-consuming phase of entering the desired geometry using traditional computer-aided design (CAD) systems. The 3D models of the modified anatomies are seamlessly exported and meshed for patient-specific CFD analysis. Free-formed anatomical modifications are quantified using an in-house skeletonization-based cross-sectional geometry analysis tool. Hemodynamic performance of the systematically modified anatomies is compared with the original anatomy using CFD. CFD results showed the relative importance of the various surgically created features such as pouch size, vena cava to pulmonary artery (PA) flare and PA stenosis. An interactive surgical-patch size estimator is also introduced. The combined design/analysis cycle time is used for comparing and optimizing surgical plans and improvements are tabulated. The reduced cost of the patient-specific shape design and analysis process made it possible to envision large clinical studies to assess the validity of predictive patient-specific CFD simulations. In this paper, model

  1. Concept of development of integrated computer-based control system for 'Ukryttia' object

    International Nuclear Information System (INIS)

    Buyal'skij, V.M.; Maslov, V.P.

    2003-01-01

    The structural concept for the development of the Chernobyl NPP 'Ukryttia' Object's integrated computer-based control system is presented, based on the general concept of the integrated Computer-based Control System (CCS) design process for organizational and technical management subjects. The concept is aimed at applying state-of-the-art architectural design techniques and allows the use of modern computer-aided facilities for the development of the functional model, the information (logical and physical) models, and the system object model under design.

  2. Development of a guidance guide for dosimetry in computed tomography

    International Nuclear Information System (INIS)

    Fontes, Ladyjane Pereira

    2016-01-01

    Due to frequent questions from users of pencil-type ionization chambers calibrated at the Instrument Calibration Laboratory of the Institute of Energy and Nuclear Research (LCI-IPEN) on how to properly apply the factors indicated in their calibration certificates, a guidance guide for dosimetry in computed tomography was prepared. The guide includes guidance on prior determination of the half-value layer (HVL), since the effective beam energy must be known in order to apply the beam-quality correction factor (kQ). Evaluating the HVL in CT scanners is a difficult task because of the system geometry, so a survey was conducted of existing methodologies for determining the HVL in clinical computed tomography beams, taking into account technical, practical and economic factors. In this work it was decided to test a Tandem system consisting of absorbing covers made in the IPEN workshop, chosen on the basis of preliminary studies for its low cost and good response. The Tandem system consists of five cylindrical aluminum absorbing covers of 1 mm, 3 mm, 5 mm, 7 mm and 10 mm, and three cylindrical acrylic (PMMA) absorbing covers of 15 mm, 25 mm and 35 mm, coupled to a commercial pencil-type ionization chamber widely used in quality control tests and dosimetry in clinical computed tomography beams. Through the Tandem curves it was possible to assess HVL values and, from the calibration curve of the pencil-type ionization chamber, to find the appropriate kQ for the beam. The guide provides information on how to build the calibration curve as a function of HVL in order to find kQ, and on how to construct the Tandem curve in order to find values close to the HVL. (author)
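
    The HVL estimation that the guide is built around can be sketched as follows (illustrative Python only; the thicknesses and readings in the usage line are hypothetical, and the first reading is taken with no added absorber):

      import numpy as np

      def half_value_layer(thickness_mm, readings):
          # Interpolates log(transmission) versus absorber thickness and
          # returns the thickness at which the chamber reading falls to 50%
          # of the open-beam value -- valid for a quasi-monoenergetic beam.
          t = np.asarray(thickness_mm, float)
          rel = np.asarray(readings, float) / readings[0]
          return float(np.interp(np.log(0.5), np.log(rel)[::-1], t[::-1]))

      # e.g. half_value_layer([0, 1, 3, 5, 7, 10], [100, 81, 55, 38, 27, 16])
      # gives an HVL of roughly 3.5 mm Al for these made-up readings.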

  3. COMPUTATIONAL SIMULATION OF FIRE DEVELOPMENT INSIDE A TRADE CENTRE

    Directory of Open Access Journals (Sweden)

    Constantin LUPU

    2015-07-01

    Real-scale fire experiments involve considerable costs compared to computational mathematical modelling. This paper presents the results of such a virtual simulation of a fire in a hypothetical wholesale warehouse comprising a large number of trade stands. The analysis starts from the ignition source located inside one trade stand and follows the fire's expansion over three groups of compartments, highlighting the heat transfer both in small spaces and over large distances. To confirm the accuracy of the simulation, the obtained values are compared to ones from the specialized literature.

  4. Development validation and use of computer codes for inelastic analysis

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    A finite element scheme is a system which provides routines to carry out the operations that are common to all finite element programs. The list of items that can be provided as standard by a finite element scheme is surprisingly large, and the list provided by the UNCLE finite element scheme is unusually comprehensive. This presentation covers the following: construction of the program, setting up a finite element mesh, generation of coordinates, and incorporation of boundary and load conditions. Program validation was done by creep calculations performed using the CAUSE code. Program use is illustrated by calculating a typical inelastic analysis problem, which includes a computer model of the PFR intermediate heat exchanger.

  5. Development of computer systems for planning and management of reactor decommissioning

    International Nuclear Information System (INIS)

    Yanagihara, Satoshi; Sukegawa, Takenori; Shiraishi, Kunio

    2001-01-01

    Computer systems for the planning and management of reactor decommissioning were developed for effective implementation of a decommissioning project. The systems are intended to be applied to the construction of work breakdown structures and the estimation of manpower needs, worker doses, etc., based on unit productivity and work difficulty factors, which were developed by analyzing the actual data on the JPDR dismantling activities. In addition, information necessary for project planning can be effectively integrated in graphical form on a computer screen by transferring the data produced by subprograms, such as radioactive inventory and dose rate calculation routines, among the systems. Expert systems were adopted for modeling a new decommissioning project, using production rules to reconstruct work breakdown structures and work specifications. As a result, the systems are characterized by effective modeling of a decommissioning project, estimation of project management data based on feedback of past experience, and information integration through the graphical user interface. The systems were also validated by comparing the calculated results with the actual manpower needs of the JPDR dismantling activities; it is expected that the systems will be applicable to planning and evaluation of other decommissioning projects. (author)

  6. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de, E-mail: vagner.macedo@usp.br, E-mail: patricia@ipen.br, E-mail: delvonei@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and this access will be allowed only to professionals previously registered. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with MySQL database software and PHP programming language is being used. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  7. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de

    2015-01-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and this access will be allowed only to professionals previously registered. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with MySQL database software and PHP programming language is being used. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  8. Biofeedback effectiveness to reduce upper limb muscle activity during computer work is muscle specific and time pressure dependent

    DEFF Research Database (Denmark)

    Vedsted, Pernille; Søgaard, Karen; Blangsted, Anne Katrine

    2011-01-01

    Continuous electromyographic (EMG) activity level is considered a risk factor in developing muscle disorders. EMG biofeedback is known to be useful in reducing EMG activity in working muscles during computer work. The purpose was to test the following hypotheses: (1) unilateral biofeedback from trapezius (TRA) can reduce bilateral TRA activity but not extensor digitorum communis (EDC) activity; (2) biofeedback from EDC can reduce activity in EDC but not in TRA; (3) biofeedback is more effective in the no-time-constraint than in the time-constraint working condition. Eleven healthy women performed computer work during two different working conditions (time constraint/no time constraint) while receiving biofeedback. Biofeedback was given from right TRA or EDC through two modes (visual/auditory) by the use of EMG or mechanomyography as biofeedback source. During control sessions (no biofeedback), EMG
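
    The feedback signal in such studies is typically a moving RMS envelope of the raw EMG compared against a target level; the sketch below (Python; the window length and threshold convention are our assumptions, not the study's exact processing) shows the idea:

      import numpy as np

      def rms_envelope(emg, fs, win_ms=125):
          # Moving RMS of a raw EMG trace; fs is the sampling rate in Hz.
          n = max(1, int(fs * win_ms / 1000))
          power = np.convolve(np.asarray(emg, float) ** 2,
                              np.ones(n) / n, mode="same")
          return np.sqrt(power)

      def feedback_on(envelope, threshold):
          # True wherever activity exceeds the target level, e.g. a fixed
          # percentage of a reference contraction -- the moments at which
          # visual or auditory feedback would be triggered.
          return envelope > threshold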

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. (Figure 3: Number of events per month, data.) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  10. Application of a B&W-developed computer-aided pictorial process planning system to CQMS for manufacturing process control

    International Nuclear Information System (INIS)

    Johanson, D.C.; VandeBogart, J.E.

    1992-01-01

    Babcock & Wilcox (B&W) will utilize its internally developed Computer Aided Pictorial Process Planning or CAPPP (pronounced "cap cubed") system to create a paperless manufacturing environment for the Collider Quadrupole Magnets (CQM). The CAPPP system consists of networked personal computer hardware and software used to: (1) generate and maintain the documents necessary for product fabrication, (2) communicate the information contained in these documents to the production floor, and (3) obtain quality assurance and manufacturing feedback information from the production floor. The purpose of this paper is to describe the various components of the CAPPP system and explain their applicability to product fabrication, specifically quality assurance functions.

  11. The Development of Educational and/or Training Computer Games for Students with Disabilities

    Science.gov (United States)

    Kwon, Jungmin

    2012-01-01

    Computer and video games have much in common with the strategies used in special education. Free resources for game development are becoming more widely available, so lay computer users, such as teachers and other practitioners, now have the capacity to develop games using a low budget and a little self-teaching. This article provides a guideline…

  12. Enabling Customization through Web Development: An Iterative Study of the Dell Computer Corporation Website

    Science.gov (United States)

    Liu, Chang; Mackie, Brian G.

    2008-01-01

    Throughout the last decade, companies have increased their investment in electronic commerce (EC) by developing and implementing Web-based applications on the Internet. This paper describes a class project to develop a customized computer website which is similar to Dell Computer Corporation's (Dell) website. The objective of this project is to…

  13. A new approach in development of data flow control and investigation system for computer networks

    International Nuclear Information System (INIS)

    Frolov, I.; Vaguine, A.; Silin, A.

    1992-01-01

    This paper describes a new approach to the development of a data flow control and investigation system for computer networks. This approach was developed and applied at the Moscow Radiotechnical Institute for control and investigation of the Institute's computer network. It allowed us to solve the network's current problems successfully. Our approach is described below, along with the most interesting results of our work. (author)

  14. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    Science.gov (United States)

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

    Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and used it effectively for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by processing, in real time, the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The possibility to assess brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.

  15. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1992-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, poor load balancing will degrade efficiency on either vector or data-parallel architectures when the data are organized according to spatial location. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. This document discusses why developers should consider algorithms, such as a neural net representation, that do not exhibit load-balancing problems.

  16. Part 2 of the summary for the electronics, DAQ, and computing working group: Technological developments

    International Nuclear Information System (INIS)

    Slaughter, A.J.

    1993-01-01

    The attraction of hadron machines as B factories is the copious production of B particles. However, the interesting physics lies in specific rare final states. The challenge is selecting and recording the interesting ones. Part 1 of the summary for this working group, 'Comparison of Trigger and Data Acquisition Parameters for Future B Physics Experiments', summarizes and compares the different proposals. In parallel with this activity, the working group also looked at a number of the technological developments being proposed to meet the trigger and DAQ requirements. The presentations covered a wide variety of topics, which are grouped into three categories: (1) front-end electronics, (2) level 0 fast triggers, and (3) trigger and vertex processors. The group did not discuss on-line farms or offline data storage and computing due to lack of time

  17. Stimulus specificity of a steady-state visual-evoked potential-based brain-computer interface

    Science.gov (United States)

    Ng, Kian B.; Bradley, Andrew P.; Cunnington, Ross

    2012-06-01

    The mechanisms of neural excitation and inhibition in response to a visual stimulus are well studied. It has been established that changing stimulus specificity, such as luminance contrast or spatial frequency, can alter neuronal activity and thus modulate the visual-evoked response. In this paper, we study the effect that stimulus specificity has on the classification performance of a steady-state visual-evoked potential-based brain-computer interface (SSVEP-BCI). For example, we investigate how closely two visual stimuli can be placed before they compete for neural representation in the cortex and thus influence BCI classification accuracy. We characterize stimulus specificity using the four stimulus parameters commonly encountered in SSVEP-BCI design: temporal frequency, spatial size, number of simultaneously displayed stimuli and their spatial proximity. By varying these quantities and measuring the SSVEP-BCI classification accuracy, we are able to determine the parameters that provide optimal performance. Our results show that superior SSVEP-BCI accuracy is attained when stimuli are placed more than 5° apart spatially, with a size that subtends at least 2° of visual angle, and with a tagging frequency between the high-alpha and beta bands. These findings may assist in deciding the stimulus parameters for optimal SSVEP-BCI design.
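
    A minimal frequency-detection rule for such an SSVEP-BCI can be sketched in a few lines (Python; real systems often use canonical correlation analysis instead, so this power-spectrum baseline is illustrative only, and all names are ours):

      import numpy as np

      def classify_ssvep(eeg, fs, tag_freqs):
          # eeg: one epoch from an occipital channel; fs: sampling rate (Hz);
          # tag_freqs: the stimulus tagging frequencies. Returns the index of
          # the frequency with the largest power at the fundamental plus the
          # second harmonic (harmonics must stay below the Nyquist rate).
          x = np.asarray(eeg, float)
          spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2
          freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
          scores = [sum(spectrum[np.argmin(np.abs(freqs - h))]
                        for h in (f, 2 * f)) for f in tag_freqs]
          return int(np.argmax(scores))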

  18. High Performance Computing (HPC) Challenge (HPCC) Benchmark Suite Development

    National Research Council Canada - National Science Library

    Dongarra, J. J

    2005-01-01

    .... The applications of performance modeling are numerous, including evaluation of algorithms, optimization of code implementation, parallel library development, and comparison of system architectures...

  19. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.

  20. Development of computer program ENAUDIBL for computation of the sensation levels of multiple, complex, intrusive sounds in the presence of residual environmental masking noise

    Energy Technology Data Exchange (ETDEWEB)

    Liebich, R. E.; Chang, Y.-S.; Chun, K. C.

    2000-03-31

    The relative audibility of multiple sounds occurs in separate, independent channels (frequency bands) termed critical bands or equivalent rectangular (filter-response) bandwidths (ERBs) of frequency. The true nature of human hearing is a function of a complex combination of subjective factors, both auditory and nonauditory. Assessment of the probability of individual annoyance, community-complaint reaction levels, speech intelligibility, and the most cost-effective mitigation actions requires sensation-level data; these data are one of the most important auditory factors. However, sensation levels cannot be calculated by using single-number, A-weighted sound level values. This paper describes specific steps to compute sensation levels. A unique, newly developed procedure is used, which simplifies and improves the accuracy of such computations by the use of maximum sensation levels that occur, for each intrusive-sound spectrum, within each ERB. The newly developed program ENAUDIBL makes use of ERB sensation-level values generated with some computational subroutines developed for the formerly documented program SPECTRAN.
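
    The band structure underlying these computations is commonly modelled with the Glasberg and Moore (1990) ERB fit; whether ENAUDIBL uses this exact fit is an assumption on our part, but the sketch below (Python) illustrates the per-band sensation-level idea:

      def erb_bandwidth(fc_hz):
          # Equivalent rectangular bandwidth (Hz) at centre frequency fc,
          # per Glasberg & Moore (1990): ERB = 24.7 * (4.37 * fc/1000 + 1).
          return 24.7 * (4.37 * fc_hz / 1000.0 + 1.0)

      def sensation_level(band_level_db, masked_threshold_db):
          # Sensation level of an intrusive-sound band: its level above the
          # masked threshold set by residual environmental noise in that ERB.
          return band_level_db - masked_threshold_db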