WorldWideScience

Sample records for analysis cai computer

  1. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    Energy Technology Data Exchange (ETDEWEB)

    Handler, B.H. (Oak Ridge K-25 Site, TN (USA)); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. (Oak Ridge Associated Universities, Inc., TN (USA)); Hunnum, W.H. (North Carolina Univ., Chapel Hill, NC (USA)); Smith, D.L. (Memphis State Univ., TN (USA))

    1990-07-01

    Data Systems Engineering Organization (DSEO) personnel developed a prototype computer-aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model comprises related papers encompassing research on computer-aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevance to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  2. Computers for Your Classroom: CAI and CMI.

    Science.gov (United States)

    Thomas, David B.; Bozeman, William C.

    1981-01-01

    The availability of compact, low-cost computer systems provides a means of assisting classroom teachers in the performance of their duties. Computer-assisted instruction (CAI) and computer-managed instruction (CMI) are two applications of computer technology with which school administrators should become familiar. CAI is a teaching medium in which…

  3. Perancangan Perangkat Lunak Media Pembelajaran Menggunakan Computer Assisted Instruction (CAI) untuk Pembelajaran Ilmu Tajwid Berbasis Web (Design of Learning Media Software Using Computer Assisted Instruction (CAI) for Web-Based Tajwid Learning)

    Directory of Open Access Journals (Sweden)

    Fenny Purwani

    2016-03-01

    Full Text Available Strategic use of Computer Assisted Instruction (CAI) as a learning medium is needed to overcome problems that arise in the learning process. Learning that is packaged well has a positive impact in advancing human potential. CAI as a computer-based learning medium was built to complement and support learning methods that so far have relied only on lectures, information-sharing discussions and demonstrations. The purpose of this study was to design and build interactive, Web-based CAI learning media. The result is a CAI design based on a tutorial model, complete with practice questions on the material provided. This CAI design was then used as a computer-based medium for learning the science of Tajwid.

  4. Generative Computer Assisted Instruction: An Application of Artificial Intelligence to CAI.

    Science.gov (United States)

    Koffman, Elliot B.

    Frame-oriented computer-assisted instruction (CAI) systems dominate the field, but these mechanized programmed texts utilize the computational power of the computer to a minimal degree and are difficult to modify. Newer, generative CAI systems which are supplied with a knowledge of subject matter can generate their own problems and solutions, can…

  5. Rancangan Perangkat Lunak Computer Assisted Instruction (CAI) Untuk Ilmu Tajwid Berbasis Web (Design of Computer Assisted Instruction (CAI) Software for Web-Based Tajwid Learning)

    Directory of Open Access Journals (Sweden)

    Fenny Purwani

    2015-08-01

    Full Text Available The development of information technology and science undoubtedly calls for teaching-learning concepts and mechanisms that are based on information technology. This development requires qualified human resources and learning material that can change flexibly in step with advances in technology and science, combining education based on religion with education based on technology (IMTAK and IPTEK). Internet technology can be used as a teaching tool, known as Computer Assisted Instruction (CAI). CAI software can serve as a medium or tool for learning Tajwid and can help people learn Tajwid more easily.

  6. Hypertext and three-dimensional computer graphics in an all digital PC-based CAI workstation.

    Science.gov (United States)

    Schwarz, D. L.; Wind, G. G.

    1991-01-01

    In the past several years there has been an enormous increase in the number of computer-assisted instructional (CAI) applications. Many medical educators and physicians have recognized the power and utility of hypertext. Some developers have incorporated simple diagrams, scanned monochrome graphics or still frame photographs from a laser disc or CD-ROM into their hypertext applications. These technologies have greatly increased the role of the microcomputer in education and training. There still remain numerous applications for these tools which are yet to be explored. One of these exciting areas involves the use of three-dimensional computer graphics. An all digital platform increases application portability. PMID:1807767

  7. A Design of Computer Aided Instructions (CAI) for Undirected Graphs in the Discrete Math Tutorial (DMT). Part 1.

    Science.gov (United States)

    1990-06-01

    The objective of this thesis research is to create a tutorial for teaching aspects of undirected graphs in discrete math. It is one of the submodules...of the Discrete Math Tutorial (DMT), which is a Computer Aided Instructional (CAI) tool for teaching discrete math to the Naval Academy and the

  8. A Design of Computer Aided Instructions (CAI) for Undirected Graphs in the Discrete Math Tutorial (DMT). Part 2

    Science.gov (United States)

    1990-06-01

    The objective of this thesis research is to create a tutorial for teaching aspects of undirected graphs in discrete math. It is one of the submodules...of the Discrete Math Tutorial (DMT), which is a Computer Aided Instructional (CAI) tool for teaching discrete math to the Naval Academy and the

  9. Exploring Chondrule and CAI Rims Using Micro- and Nano-Scale Petrological and Compositional Analysis

    Science.gov (United States)

    Cartwright, J. A.; Perez-Huerta, A.; Leitner, J.; Vollmer, C.

    2017-12-01

    As the major components within chondrites, chondrules (mm-sized droplets of quenched silicate melt) and calcium-aluminum-rich inclusions (CAI, refractory) represent the most abundant and the earliest materials that solidified from the solar nebula. However, the exact formation mechanisms of these clasts, and whether these processes are related, remain unconstrained, despite extensive petrological and compositional study. By taking advantage of recent advances in nano-scale tomographical techniques, we have undertaken a combined micro- and nano-scale study of CAI and chondrule rim morphologies, to investigate their formation mechanisms. The target lithologies for this research are Wark-Lovering rims (WLR), and fine-grained rims (FGR) around CAIs and chondrules respectively, present within many chondrites. The FGRs, which are up to 100 µm thick, are of particular interest as recent studies have identified presolar grains within them. These grains predate the formation of our Solar System, suggesting FGR formation under nebular conditions. By contrast, WLRs are 10-20 µm thick, made of different compositional layers, and likely formed by flash-heating shortly after CAI formation, thus recording nebular conditions. A detailed multi-scale study of these respective rims will enable us to better understand their formation histories and determine the potential for commonality between these two phases, despite reports of an observed formation age difference of up to 2-3 Myr. We are using a combination of complementary techniques on our selected target areas: 1) Micro-scale characterization using standard microscopic and compositional techniques (SEM-EBSD, EMPA); 2) Nano-scale characterization of structures using transmission electron microscopy (TEM) and elemental, isotopic and tomographic analysis with NanoSIMS and atom probe tomography (APT). Preliminary nano-scale APT analysis of FGR morphologies within the Allende carbonaceous chondrite has successfully discerned

  10. CAI: Overcoming Attitude Barriers.

    Science.gov (United States)

    Netusil, Anton J.; Kockler, Lois H.

    During each of two school quarters, approximately 60 college students enrolled in a mathematics course were randomly assigned to an experimental group or a control group. The control group received instruction by the lecture method only; the experimental group received the same instruction, except that six computer-assisted instruction (CAI) units…

  11. Developing the Coach Analysis and Intervention System (CAIS): establishing validity and reliability of a computerised systematic observation instrument.

    Science.gov (United States)

    Cushion, Christopher; Harvey, Stephen; Muir, Bob; Nelson, Lee

    2012-01-01

    We outline the evolution of a computerised systematic observation tool and describe the process for establishing the validity and reliability of this new instrument. The Coach Analysis and Interventions System (CAIS) has 23 primary behaviours related to physical behaviour, feedback/reinforcement, instruction, verbal/non-verbal, questioning and management. The instrument also analyses secondary coach behaviour related to performance states, recipient, timing, content and questioning/silence. The CAIS is a multi-dimensional and multi-level mechanism able to provide detailed and contextualised data about specific coaching behaviours occurring in complex and nuanced coaching interventions and environments that can be applied to both practice sessions and competition.

  12. The Vibrio cholerae quorum-sensing autoinducer CAI-1: analysis of the biosynthetic enzyme CqsA

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, R.; Bolitho, M; Higgins, D; Lu, W; Ng, W; Jeffrey, P; Rabinowitz, J; Semmelhack, M; Hughson, F; Bassler, B

    2009-01-01

    Vibrio cholerae, the bacterium that causes the disease cholera, controls virulence factor production and biofilm development in response to two extracellular quorum-sensing molecules, called autoinducers. The strongest autoinducer, called CAI-1 (for cholera autoinducer-1), was previously identified as (S)-3-hydroxytridecan-4-one. Biosynthesis of CAI-1 requires the enzyme CqsA. Here, we determine the CqsA reaction mechanism, identify the CqsA substrates as (S)-2-aminobutyrate and decanoyl coenzyme A, and demonstrate that the product of the reaction is 3-aminotridecan-4-one, dubbed amino-CAI-1. CqsA produces amino-CAI-1 by a pyridoxal phosphate-dependent acyl-CoA transferase reaction. Amino-CAI-1 is converted to CAI-1 in a subsequent step via a CqsA-independent mechanism. Consistent with this, we find cells release ≥100 times more CAI-1 than amino-CAI-1. Nonetheless, V. cholerae responds to amino-CAI-1 as well as CAI-1, whereas other CAI-1 variants do not elicit a quorum-sensing response. Thus, both CAI-1 and amino-CAI-1 have potential as lead molecules in the development of an anticholera treatment.

  13. CAI多媒體教學軟體之開發模式 Using an Instructional Design Model for Developing a Multimedia CAI Courseware

    Directory of Open Access Journals (Sweden)

    Hsin-Yih Shyu

    1995-09-01

    Full Text Available This article outlines a systematic instructional design model for developing multimedia computer-aided instruction (CAI) courseware. The model illustrates roles and tasks as two dimensions necessary in CAI production teamwork. Four major components (Analysis, Design, Development, and Revise/Evaluation), comprising 25 steps in total, are provided. Eight roles, each with its required competencies, were identified. The model will be useful as a framework for educators, instructional designers and CAI industry developers producing multimedia CAI courseware.

  14. Numerical simulation and validation of SI-CAI hybrid combustion in a CAI/HCCI gasoline engine

    Science.gov (United States)

    Wang, Xinyan; Xie, Hui; Xie, Liyan; Zhang, Lianfang; Li, Le; Chen, Tao; Zhao, Hua

    2013-02-01

    SI-CAI hybrid combustion, also known as spark-assisted compression ignition (SACI), is a promising concept to extend the operating range of CAI (Controlled Auto-Ignition) and achieve a smooth transition between spark ignition (SI) and CAI in the gasoline engine. In this study, a SI-CAI hybrid combustion model (HCM) has been constructed on the basis of the 3-Zones Extended Coherent Flame Model (ECFM3Z). An ignition model is included to initiate the ECFM3Z calculation and induce the flame propagation. In order to precisely depict the subsequent auto-ignition process of the unburned fuel and air mixture independently after the initiation of flame propagation, the tabulated chemistry concept is adopted to describe the auto-ignition chemistry. The methodology for extracting tabulated parameters from the chemical kinetics calculations is developed so that both cool flame reactions and main auto-ignition combustion can be well captured under a wide range of thermodynamic conditions. The SI-CAI hybrid combustion model (HCM) is then applied in the three-dimensional computational fluid dynamics (3-D CFD) engine simulation. The simulation results are compared with the experimental data obtained from a single cylinder VVA engine. The detailed analysis of the simulations demonstrates that the SI-CAI hybrid combustion process is characterised by early flame propagation and subsequent multi-site auto-ignition around the main flame front, which is consistent with the optical results reported by other researchers. In addition, the systematic study of the in-cylinder conditions reveals the influence mechanism of the early flame propagation on the subsequent auto-ignition.
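
    In practice, the tabulated-chemistry step described above amounts to looking up precomputed auto-ignition data (such as ignition delay) at the local thermodynamic state of the unburned mixture. The sketch below is only an illustration of that idea, not the authors' model: the table values are hypothetical placeholders, and the lookup is a simple bilinear interpolation in temperature and pressure.

```python
import numpy as np

# Hypothetical ignition-delay table (s), indexed by temperature (K) and pressure (bar).
# In a tabulated-chemistry approach the table comes from detailed chemical-kinetics
# calculations; the values here are placeholders for illustration only.
T_grid = np.array([700.0, 800.0, 900.0, 1000.0, 1100.0])
p_grid = np.array([10.0, 20.0, 40.0])
tau_table = np.array([
    [5.0e-2, 3.0e-2, 1.8e-2],
    [1.2e-2, 7.0e-3, 4.0e-3],
    [3.0e-3, 1.8e-3, 1.0e-3],
    [8.0e-4, 5.0e-4, 3.0e-4],
    [2.5e-4, 1.5e-4, 9.0e-5],
])

def ignition_delay(T, p):
    """Bilinear interpolation of log(tau) over the (T, p) grid."""
    i = int(np.clip(np.searchsorted(T_grid, T) - 1, 0, len(T_grid) - 2))
    j = int(np.clip(np.searchsorted(p_grid, p) - 1, 0, len(p_grid) - 2))
    tT = (T - T_grid[i]) / (T_grid[i + 1] - T_grid[i])
    tp = (p - p_grid[j]) / (p_grid[j + 1] - p_grid[j])
    log_tau = ((1 - tT) * (1 - tp) * np.log(tau_table[i, j])
               + tT * (1 - tp) * np.log(tau_table[i + 1, j])
               + (1 - tT) * tp * np.log(tau_table[i, j + 1])
               + tT * tp * np.log(tau_table[i + 1, j + 1]))
    return float(np.exp(log_tau))

print(ignition_delay(950.0, 25.0))  # interpolated ignition delay in seconds
```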

  15. The Instructional Use of CAI in the Education of the Mentally Retarded.

    Science.gov (United States)

    Winters, John J., Jr.; And Others

    Computer assisted instruction (CAI) studies with the mentally retarded in the United States and Canada reveal that the retarded benefit from CAI in academic and social skills. Their learning is enhanced to the same extent as that of the nonretarded. CAI can be cost-effective, especially with the reduced costs of mini and micro-computers; however,…

  16. 電腦輔助教學與個別教學結合: 電腦輔助教學課堂應用初探 Computer-Assisted Instruction Under the Management of Individualized Instruction: A Classroom Management Approach of CAI

    Directory of Open Access Journals (Sweden)

    Sunny S. J. Lin

    1988-03-01

    Full Text Available This article first reviews the development of Computer-Assisted Instruction (CAI) in Taiwan. The study describes the training of teachers from different levels of schools to design CAI coursewares, and the planning of a CAI courseware bank holding 2,000 supplemental coursewares. A CAI classroom application system should be carefully established to prevent the easy misuse of a CAI courseware as an entire instructional plan. The study also claims that steering CAI in our elementary and secondary education could rely on mastery learning as the instructional plan; in this case, CAI must limit its role to formative testing and remedial material only. In higher education, Keller's Personalized System of Instruction could be an effective classroom management system, so CAI would offer study guides and formative tests only. Using these two instructional systems may enhance students' achievement and speed up the learning rate at the same time. Combining individualized instruction and CAI will be one of the most workable approaches in the current classroom. The author sets up an experiment to verify their effectiveness and efficiency in the near future.

  17. The Relevance of AI Research to CAI.

    Science.gov (United States)

    Kearsley, Greg P.

    This article provides a tutorial introduction to Artificial Intelligence (AI) research for those involved in Computer Assisted Instruction (CAI). The general theme is that much of the current work in AI, particularly in the areas of natural language understanding systems, rule induction, programming languages, and socratic systems, has important…

  18. Comparison of computer-assisted instruction (CAI) versus traditional textbook methods for training in abdominal examination (Japanese experience).

    Science.gov (United States)

    Qayumi, A K; Kurihara, Y; Imai, M; Pachev, G; Seo, H; Hoshino, Y; Cheifetz, R; Matsuura, K; Momoi, M; Saleem, M; Lara-Guerra, H; Miki, Y; Kariya, Y

    2004-10-01

    This study aimed to compare the effects of computer-assisted, text-based and computer-and-text learning conditions on the performances of 3 groups of medical students in the pre-clinical years of their programme, taking into account their academic achievement to date. A fourth group of students served as a control (no-study) group. Participants were recruited from the pre-clinical years of the training programmes in 2 medical schools in Japan, Jichi Medical School near Tokyo and Kochi Medical School near Osaka. Participants were randomly assigned to 4 learning conditions and tested before and after the study on their knowledge of and skill in performing an abdominal examination, in a multiple-choice test and an objective structured clinical examination (OSCE), respectively. Information about performance in the programme was collected from school records and students were classified as average, good or excellent. Student and faculty evaluations of their experience in the study were explored by means of a short evaluation survey. Compared to the control group, all 3 study groups exhibited significant gains in performance on knowledge and performance measures. For the knowledge measure, the gains of the computer-assisted and computer-assisted plus text-based learning groups were significantly greater than the gains of the text-based learning group. The performances of the 3 groups did not differ on the OSCE measure. Analyses of gains by performance level revealed that high achieving students' learning was independent of study method. Lower achieving students performed better after using computer-based learning methods. The results suggest that computer-assisted learning methods will be of greater help to students who do not find the traditional methods effective. Explorations of the factors behind this are a matter for future research.

  19. Natural gas diffusion model and diffusion computation in well Cai25 Bashan Group oil and gas reservoir

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Natural gas diffusion through the cap rock is mainly by means of dissolving in water, so its concentration can be replaced by solubility, which varies with temperature, pressure and salinity in the strata. Under certain geological conditions the maximal solubility is definite, so the diffusion computation can be handled approximately by the steady-state equation. Furthermore, on the basis of the restoration of the paleo-burial history, the diffusion is calculated with the dynamic method, and the result is very close to the real diffusion value over the geological history.
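
    The steady-state treatment described above reduces to Fick's first law, with the maximal solubility taken as the fixed boundary concentration at the base of the cap rock. The following snippet is only a hedged illustration of that calculation; all parameter values are made up and are not data from well Cai25.

```python
# Steady-state diffusive loss through a cap rock, treating dissolved-gas
# solubility as the boundary concentration (Fick's first law).
# All numbers are illustrative placeholders, not values from well Cai25.

D = 1.0e-10          # effective diffusion coefficient in the cap rock, m^2/s
c_max = 1.5          # maximal solubility at reservoir conditions, kg/m^3
c_top = 0.0          # concentration at the top of the cap rock, kg/m^3
thickness = 200.0    # cap-rock thickness, m
area = 2.0e7         # diffusion cross-sectional area, m^2
duration = 50e6 * 365.25 * 24 * 3600  # geological time span, s (~50 Myr)

flux = D * (c_max - c_top) / thickness       # kg/(m^2 s), steady state
total_loss = flux * area * duration          # kg diffused over the time span
print(f"flux = {flux:.3e} kg/(m^2 s), cumulative loss = {total_loss:.3e} kg")
```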

  20. An Object-Oriented Architecture for a Web-Based CAI System.

    Science.gov (United States)

    Nakabayashi, Kiyoshi; Hoshide, Takahide; Seshimo, Hitoshi; Fukuhara, Yoshimi

    This paper describes the design and implementation of an object-oriented World Wide Web-based CAI (Computer-Assisted Instruction) system. The goal of the design is to provide a flexible CAI/ITS (Intelligent Tutoring System) framework with full extendibility and reusability, as well as to exploit Web-based software technologies such as JAVA, ASP (a…

  1. Coordinated Oxygen Isotopic and Petrologic Studies of CAIS Record Varying Composition of Protosolar

    Science.gov (United States)

    Simon, Justin I.; Matzel, J. E. P.; Simon, S. B.; Weber, P. K.; Grossman, L.; Ross, D. K.; Hutcheon, I. D.

    2012-01-01

    Ca-, Al-rich inclusions (CAIs) record the O-isotope composition of Solar nebular gas from which they grew [1]. High spatial resolution O-isotope measurements afforded by ion microprobe analysis across the rims and margin of CAIs reveal systematic variations in (Delta)O-17 and suggest formation from a diversity of nebular environments [2-4]. This heterogeneity has been explained by isotopic mixing between the O-16-rich Solar reservoir [6] and a second O-16-poor reservoir (probably nebular gas) with a "planetary-like" isotopic composition [e.g., 1, 6-7], but the mechanism and location(s) where these events occur within the protoplanetary disk remain uncertain. The orientation of large and systematic variations in (Delta)O-17 reported by [3] for a compact Type A CAI from the Efremovka reduced CV3 chondrite differs dramatically from reports by [4] of a similar CAI, A37 from the Allende oxidized CV3 chondrite. Both studies conclude that CAIs were exposed to distinct, nebular O-isotope reservoirs, implying the transfer of CAIs among different settings within the protoplanetary disk [4]. To test this hypothesis further and the extent of intra-CAI O-isotopic variation, a pristine compact Type A CAI, Ef-1 from Efremovka, and a Type B2 CAI, TS4 from Allende were studied. Our new results are equally intriguing because, collectively, O-isotopic zoning patterns in the CAIs indicate a progressive and cyclic record. The results imply that CAIs were commonly exposed to multiple environments of distinct gas during their formation. Numerical models help constrain conditions and duration of these events.

  2. INAA of CAIs from the Maralinga CK4 chondrite: Effects of parent body thermal metamorphism

    Science.gov (United States)

    Lindstrom, D. J.; Keller, L. P.; Martinez, R. R.

    1993-01-01

    Maralinga is an anomalous CK4 carbonaceous chondrite which contains numerous Ca-, Al-rich inclusions (CAI's) unlike the other members of the CK group. These CAI's are characterized by abundant green hercynitic spinel intergrown with plagioclase and high-Ca clinopyroxene, and a total lack of melilite. Instrumental Neutron Activation Analysis (INAA) was used to further characterize the meteorite, with special focus on the CAI's. High sensitivity INAA was done on eight sample disks about 100-150 microns in diameter obtained from a normal 30 micron thin section with a diamond microcoring device. The CAI's are enriched by 60-70X bulk meteorite values in Zn, suggesting that the substantial exchange of Fe for Mg that made the spinel in the CAI's hercynitic also allowed efficient scavenging of Zn from the rest of the meteorite during parent body thermal metamorphism. Less mobile elements appear to have maintained their initial heterogeneity.

  3. A risk management approach to CAIS development

    Science.gov (United States)

    Hart, Hal; Kerner, Judy; Alden, Tony; Belz, Frank; Tadman, Frank

    1986-01-01

    The proposed DoD standard Common APSE Interface Set (CAIS) was developed as a framework set of interfaces that will support the transportability and interoperability of tools in the support environments of the future. While the current CAIS version is a promising start toward fulfilling those goals and current prototypes provide adequate testbeds for investigations in support of completing specifications for a full CAIS, there are many reasons why the proposed CAIS might fail to become a usable product and the foundation of next-generation (1990s) project support environments such as NASA's Space Station software support environment. The most critical threats to the viability and acceptance of the CAIS include performance issues (especially in piggybacked implementations), transportability, and security requirements. To make the situation worse, the solution to some of these threats appears to be in conflict with the solutions to others.

  4. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  5. CAI System with Multi-Media Text Through Web Browser for NC Lathe Programming

    Science.gov (United States)

    Mizugaki, Yoshio; Kikkawa, Koichi; Mizui, Masahiko; Kamijo, Keisuke

    A new Computer Aided Instruction (CAI) system for NC lathe programming has been developed with the use of multimedia texts including movies, animations, pictures, sound and text through a Web browser. Although many CAI systems developed previously for NC programming consist of text-based instructions, it is difficult for beginners to learn NC programming with them. In the developed CAI system, multimedia texts are adopted to help users' understanding, and the system is available through a Web browser anytime and anywhere. The error log is also automatically recorded for future reference. According to the NC program coded by a user, the movement of the NC lathe is animated and shown on the monitor screen in front of the user. If its movement causes a collision between a cutting tool and the lathe, a sound and a caution remark are generated. If the user makes mistakes several times at a certain stage in learning NC, the corresponding suggestion is shown in the form of movies, animations, and so forth. By using the multimedia texts, users' attention is kept concentrated during a training course. In this paper, the configuration of the CAI system is explained and the actual procedures for users to learn NC programming are also explained. Some beginners tested this CAI system and their results are illustrated and discussed from the viewpoint of the efficiency and usefulness of this CAI system. A brief conclusion is also mentioned.

  6. CAIs in Semarkona (LL3.0)

    Science.gov (United States)

    Mishra, R. K.; Simon, J. I.; Ross, D. K.; Marhas, K. K.

    2016-01-01

    Calcium-, aluminum-rich inclusions (CAIs) are the first-forming solids of the Solar System. Their observed abundance, mean size, and mineralogy vary quite significantly between different groups of chondrites. These differences may reflect the dynamics and distinct cosmochemical conditions present in the region(s) of the protoplanetary disk from which each type likely accreted. Only about 11 such objects have been found in L and LL type ordinary chondrites while another 57 have been found in H type ordinary chondrites, compared to thousands in carbonaceous chondrites. At issue is whether the rare CAIs contained in ordinary chondrites truly reflect a distinct population from the inclusions commonly found in other chondrite types. Semarkona (LL3.00) (fall, 691 g) is the most pristine chondrite available in our meteorite collection. Here we report the petrography and mineralogy of 3 CAIs from Semarkona.

  7. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today...... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....

  8. The First Expert CAI System

    Science.gov (United States)

    Feurzeig, Wallace

    1984-01-01

    The first expert instructional system, the Socratic System, was developed in 1964. One of the earliest applications of this system was in the area of differential diagnosis in clinical medicine. The power of the underlying instructional paradigm was demonstrated and the potential of the approach for valuably supplementing medical instruction was recognized. Twenty years later, despite further educationally significant advances in expert systems technology and enormous reductions in the cost of computers, expert instructional methods have found very little application in medical schools.

  9. Computer aided safety analysis

    International Nuclear Information System (INIS)

    1988-05-01

    The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs

  10. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  11. E-CAI: a novel server to estimate an expected value of Codon Adaptation Index (eCAI)

    Directory of Open Access Journals (Sweden)

    Garcia-Vallvé Santiago

    2008-01-01

    Full Text Available Background: The Codon Adaptation Index (CAI) is a measure of the synonymous codon usage bias for a DNA or RNA sequence. It quantifies the similarity between the synonymous codon usage of a gene and the synonymous codon frequency of a reference set. Extreme values in the nucleotide or in the amino acid composition have a large impact on differential preference for synonymous codons. It is therefore essential to define the limits for the expected value of CAI on the basis of sequence composition in order to properly interpret the CAI and provide statistical support to CAI analyses. Though several freely available programs calculate the CAI for a given DNA sequence, none of them corrects for compositional biases or provides confidence intervals for CAI values. Results: The E-CAI server, available at http://genomes.urv.es/CAIcal/E-CAI, is a web application that calculates an expected value of CAI for a set of query sequences by generating random sequences with G+C and amino acid content similar to those of the input. An executable file, a tutorial, a Frequently Asked Questions (FAQ) section and several examples are also available. To exemplify the use of the E-CAI server, we have analysed the codon adaptation of human mitochondrial genes that encode a subunit of the mitochondrial respiratory chain (excluding those genes that lack a prokaryotic orthologue) and are encoded in the nuclear genome. It is assumed that these genes were transferred from the proto-mitochondrial to the nuclear genome and that their codon usage was then ameliorated. Conclusion: The E-CAI server provides a direct threshold value for discerning whether the differences in CAI are statistically significant or whether they are merely artifacts that arise from internal biases in the G+C composition and/or amino acid composition of the query sequences.
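
    For context, the CAI referred to above is the geometric mean of the relative adaptiveness values (w) of the codons in a query sequence, where w for each codon is its frequency in the reference set divided by the frequency of the most-used synonymous codon. The sketch below is a minimal illustration of that definition; it is not the E-CAI server code, and the genetic-code subset and reference counts are hypothetical placeholders.

```python
import math
from collections import Counter

# Minimal CAI calculation: relative adaptiveness w of each codon is its frequency
# divided by the frequency of the most-used synonymous codon in a reference set;
# CAI is the geometric mean of w over the query sequence.
# The synonym table and reference counts below are hypothetical placeholders.

SYNONYMS = {  # a tiny subset of the genetic code, for illustration only
    "F": ["TTT", "TTC"],
    "L": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "K": ["AAA", "AAG"],
}
reference_counts = Counter({"TTT": 80, "TTC": 220, "TTA": 10, "TTG": 40,
                            "CTT": 30, "CTC": 60, "CTA": 15, "CTG": 300,
                            "AAA": 150, "AAG": 250})

def relative_adaptiveness(counts):
    w = {}
    for codons in SYNONYMS.values():
        max_count = max(counts.get(c, 0) for c in codons)
        for c in codons:
            w[c] = counts.get(c, 0) / max_count if max_count else 0.0
    return w

def cai(sequence, w):
    codons = [sequence[i:i + 3] for i in range(0, len(sequence) - 2, 3)]
    logs = [math.log(w[c]) for c in codons if c in w and w[c] > 0]
    return math.exp(sum(logs) / len(logs)) if logs else float("nan")

w = relative_adaptiveness(reference_counts)
print(cai("TTCCTGAAG", w))  # CAI of a short hypothetical query sequence
```

    In the same spirit, the expected value that the server estimates could be approximated by repeating this calculation on many random sequences generated with G+C and amino acid content similar to the query, and using the resulting distribution as a threshold for significance.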

  12. Development of an intelligent CAI system for a distributed processing environment

    International Nuclear Information System (INIS)

    Fujii, M.; Sasaki, K.; Ohi, T.; Itoh, T.

    1993-01-01

    In order to operate a nuclear power plant optimally in both normal and abnormal situations, the operators are trained using an operator training simulator in addition to classroom instruction. Individual instruction using a CAI (Computer-Assisted Instruction) system has become popular as a method of learning plant information, such as plant dynamics, operational procedures, plant systems, plant facilities, etc. An outline is given of a proposed network-based intelligent CAI system (ICAI) incorporating multimedia PWR plant dynamics simulation, teaching aids and educational record management, using the following environment: existing standard workstations and graphic workstations with a live video processing function, the TCP/IP protocol of Unix over Ethernet, and the X window system. (Z.S.) 3 figs., 2 refs

  13. A Model Driven Question-Answering System for a CAI Environment. Final Report (July 1970 to May 1972).

    Science.gov (United States)

    Brown, John S.; And Others

    A question answering system which permits a computer-assisted instruction (CAI) student greater initiative in the variety of questions he can ask is described. A method is presented to represent the dynamic processes of a subject matter area by augmented finite state automata, which permits efficient inferencing about dynamic processes and…

  14. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims to identify methods of syntax analysis that can be used for computer programming languages, setting aside the computer hardware that influences the choice of programming language and of analysis and compilation methods. In a first part, the author proposes attempts at formalising Chomsky grammar languages. In a second part, he studies analytical grammars, and then a compiler or analytic grammar for the Fortran language

  15. Research on the Use of Computer-Assisted Instruction.

    Science.gov (United States)

    Craft, C. O.

    1982-01-01

    Reviews recent research studies related to computer assisted instruction (CAI). The studies concerned program effectiveness, teaching of psychomotor skills, tool availability, and factors affecting the adoption of CAI. (CT)

  16. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks, and switch architectures and buffering strategies. The book provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for ana...
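
    As a small, generic illustration of the queuing analysis such a textbook covers (not an excerpt from the book), the standard M/M/1 formulas relate arrival and service rates to utilization, mean occupancy, and mean delay; the rates used below are arbitrary example values.

```python
# Classic M/M/1 queue metrics: utilization, mean number in system, mean delay.
# Example rates are arbitrary; the formulas are the standard textbook results.

def mm1_metrics(arrival_rate, service_rate):
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = arrival_rate / service_rate                  # utilization
    mean_in_system = rho / (1.0 - rho)                 # L = rho / (1 - rho)
    mean_delay = 1.0 / (service_rate - arrival_rate)   # W = 1 / (mu - lambda)
    return rho, mean_in_system, mean_delay

rho, L, W = mm1_metrics(arrival_rate=80.0, service_rate=100.0)  # packets/s
print(f"utilization={rho:.2f}, mean packets in system={L:.2f}, mean delay={W * 1000:.1f} ms")
```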

  17. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news

  18. Computer based training for nuclear operations personnel: From concept to reality

    International Nuclear Information System (INIS)

    Widen, W.C.; Klemm, R.W.

    1986-01-01

    Computer Based Training (CBT) can be subdivided into two categories: Computer Aided Instruction (CAI), or the actual presentation of learning material; and Computer Managed Instruction (CMI), the tracking, recording, and documenting of instruction and student progress. Both CAI and CMI can be attractive to the student and to the training department. A brief overview of CAI and CMI benefits is given in this paper

  19. CAI and training system for the emergency operation procedure in the advanced thermal reactor, FUGEN

    International Nuclear Information System (INIS)

    Kozaki, T.; Imanaga, K.; Nakamura, S.; Maeda, K.; Sakurai, N.; Miyamoto, M.

    2003-01-01

    In the Advanced Thermal Reactor (ATR) of the JNC, 'FUGEN', a symptom-based Emergency Operating Procedure (EOP) was introduced in order to operate Fugen more safely, and it became necessary for the plant operators to master the EOP. However, it took a lot of time for the instructor to teach the EOP to operators and to train them. Thus, we have developed a Computer Aided Instruction (CAI) and Training System for the EOP, with which the operators can learn the EOP and can be trained. This system has two major functions, i.e., CAI and training. In the CAI function, there are three learning courses, namely, the EOP procedure, simulation with guidance and Q and A, and free simulation. In the training function, all of the control instruments necessary for EOP training (indicators, switches, annunciators and so forth) and the physics models are simulated so that the trainees can be trained on all of the EOPs. In addition, 50 kinds of malfunction models are installed in order to run appropriate accident scenarios for the EOP. The training of the EOP covers the range from AOO (Anticipated Operational Occurrence) to over-DBAs (Design Based Accidents). This system is built on three personal computers that are connected by a computer network. One of the computers is expected to be used by the instructor and the other two by the trainees. The EOP is composed of eight guidelines, such as 'Reactor Control' and 'Depression and Cooling', and operation screens corresponding to the guidelines are provided. According to a trial, we have estimated that the efficiency of learning and training would be improved by about 30% for the trainee and about 75% for the instructor in actual learning and training. (author)

  20. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample given, followed by a discussion of the present status and future development plans

  1. Sexual life and sexual wellness in individuals with complete androgen insensitivity syndrome (CAIS) and Mayer-Rokitansky-Küster-Hauser Syndrome (MRKHS).

    Science.gov (United States)

    Fliegner, Maike; Krupp, Kerstin; Brunner, Franziska; Rall, Katharina; Brucker, Sara Y; Briken, Peer; Richter-Appelt, Hertha

    2014-03-01

    Introduction: Sexual wellness depends on a person's physical and psychological constitution. Complete Androgen Insensitivity Syndrome (CAIS) and Mayer-Rokitansky-Küster-Hauser Syndrome (MRKHS) can compromise sexual well-being. Aims: To compare sexual well-being in CAIS and MRKHS using multiple measures; to assess sexual problems and perceived distress; to gain insight into participants' feelings of inadequacy in social and sexual situations, level of self-esteem and depression; to determine how these psychological factors relate to sexual (dys)function; and to uncover what participants see as the source of their sexual problems. Methods: Data were collected using a paper-and-pencil questionnaire. Eleven individuals with CAIS and 49 with MRKHS with/without neovagina treatment were included. Rates of sexual dysfunctions, overall sexual function, feelings of inadequacy in social and sexual situations, self-esteem and depression scores were calculated. Categorizations were used to identify critical cases. Correlations between psychological variables and sexual function were computed. Sexually active subjects were compared with sexually inactive participants. A qualitative content analysis was carried out to explore causes of sexual problems. Main Outcome Measures: An extended list of sexual problems based on the Diagnostic and Statistical Manual of Mental Disorders, 4th ed., text revision, by the American Psychiatric Association, and related distress; the Female Sexual Function Index (FSFI); the German Questionnaire on Feelings of Inadequacy in Social and Sexual Situations (FUSS social scale, FUSS sexual scale); the Rosenberg Self-Esteem Scale (RSE); the Brief Symptom Inventory (BSI) depression subscale; and an open question on alleged causes of sexual problems. Results: The results point to a far-reaching lack of sexual confidence and sexual satisfaction in CAIS. In MRKHS, apprehension in sexual situations is a source of distress, but sexual problems seem to be more focused on issues of vaginal functioning. MRKHS women report being satisfied with their

  2. Adaptation of an aerosol retrieval algorithm using multi-wavelength and multi-pixel information of satellites (MWPM) to GOSAT/TANSO-CAI

    Science.gov (United States)

    Hashimoto, M.; Takenaka, H.; Higurashi, A.; Nakajima, T.

    2017-12-01

    Aerosol in the atmosphere is an important constituent for determining the earth's radiation budget, so accurate aerosol retrievals from satellites are useful. We have developed a satellite remote sensing algorithm to retrieve aerosol optical properties using multi-wavelength and multi-pixel information from satellite imagers (MWPM). The method simultaneously derives aerosol optical properties, such as aerosol optical thickness (AOT), single scattering albedo (SSA) and aerosol size information, by using spatial differences of wavelengths (multi-wavelength) and surface reflectances (multi-pixel). The method is useful for aerosol retrieval over spatially heterogeneous surfaces such as urban regions. In this algorithm, the inversion method is a combination of an optimal estimation method and a smoothing constraint on the state vector. Furthermore, the method couples the inversion directly to the radiative transfer calculation (RTM), numerically solved at each iteration step of the non-linear inverse problem, without using a look-up table (LUT) with several constraints. However, this takes too much computation time. To accelerate the calculation, we replaced the RTM with an accelerated RTM solver learned by a neural network-based method, EXAM (Takenaka et al., 2011), using the Rstar code. The calculation time was then shortened to about one thousandth. We applied MWPM combined with EXAM to GOSAT/TANSO-CAI (Cloud and Aerosol Imager). CAI is a supplementary sensor to TANSO-FTS, dedicated to measuring cloud and aerosol properties. CAI has four bands, 380, 674, 870 and 1600 nm, and observes at 500 m resolution for bands 1, 2 and 3, and 1.5 km for band 4. Retrieved parameters are aerosol optical properties, such as the aerosol optical thickness (AOT) of fine and coarse mode particles at a wavelength of 500 nm, the volume soot fraction in fine mode particles, and the ground surface albedo at each observed wavelength, obtained by combining a minimum reflectance method and Fukuda et al. (2013). We will show
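
    The inversion described above, an optimal estimation combined with a smoothing constraint on the state vector and iterated against a forward model, can be sketched generically as a regularized Gauss-Newton update. The code below is only such a sketch under stated assumptions: the forward model, Jacobian, noise level and second-difference smoothness operator are toy placeholders, not the MWPM/GOSAT implementation.

```python
import numpy as np

# Generic regularized Gauss-Newton retrieval: minimize the measurement misfit
# plus a smoothness penalty gamma * ||L x||^2 on the state vector, where L is a
# second-difference operator. Forward model, Jacobian and data are toy examples.

def second_difference(n):
    L = np.zeros((n - 2, n))
    for i in range(n - 2):
        L[i, i:i + 3] = [1.0, -2.0, 1.0]
    return L

def retrieve(y, forward, jacobian, x0, se_inv, gamma, n_iter=10):
    x = x0.copy()
    L = second_difference(len(x))
    for _ in range(n_iter):
        r = y - forward(x)                        # measurement residual
        J = jacobian(x)
        A = J.T @ se_inv @ J + gamma * L.T @ L    # regularized normal matrix
        b = J.T @ se_inv @ r - gamma * L.T @ (L @ x)
        x = x + np.linalg.solve(A, b)             # Gauss-Newton update
    return x

# Toy linear problem: 8 state elements observed through a random kernel.
rng = np.random.default_rng(0)
n = 8
K = rng.random((12, n))
x_true = np.sin(np.linspace(0, np.pi, n))
y = K @ x_true + 0.01 * rng.standard_normal(12)

x_hat = retrieve(y, forward=lambda x: K @ x, jacobian=lambda x: K,
                 x0=np.zeros(n), se_inv=np.eye(12) / 0.01**2, gamma=1.0)
print(np.round(x_hat, 3))
```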

  3. Systems analysis and the computer

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, A S

    1983-08-01

    The words 'systems analysis' are used in at least two senses. Whilst the general nature of the topic is well understood in the OR community, the nature of the term as used by computer scientists is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst the application of techniques developed in the artificial intelligence field has considerable relevance to constructing automata able to adapt to change in the environment, study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.

  4. Particulated articular cartilage: CAIS and DeNovo NT.

    Science.gov (United States)

    Farr, Jack; Cole, Brian J; Sherman, Seth; Karas, Vasili

    2012-03-01

    Cartilage Autograft Implantation System (CAIS; DePuy/Mitek, Raynham, MA) and DeNovo Natural Tissue (NT; ISTO, St. Louis, MO) are novel treatment options for focal articular cartilage defects in the knee. These methods involve the implantation of particulated articular cartilage from either autograft or juvenile allograft donor, respectively. In the laboratory and in animal models, both CAIS and DeNovo NT have demonstrated the ability of the transplanted cartilage cells to "escape" from the extracellular matrix, migrate, multiply, and form a new hyaline-like cartilage tissue matrix that integrates with the surrounding host tissue. In clinical practice, the technique for both CAIS and DeNovo NT is straightforward, requiring only a single surgery to affect cartilage repair. Clinical experience is limited, with short-term studies demonstrating both procedures to be safe, feasible, and effective, with improvements in subjective patient scores, and with magnetic resonance imaging evidence of good defect fill. While these treatment options appear promising, prospective randomized controlled studies are necessary to refine the indications and contraindications for both CAIS and DeNovo NT.

  5. Computer aided safety analysis 1989

    International Nuclear Information System (INIS)

    1990-04-01

    The meeting was conducted in a workshop style, to encourage involvement of all participants during the discussions. Forty-five (45) experts from 19 countries, plus 22 experts from the GDR participated in the meeting. A list of participants can be found at the end of this volume. Forty-two (42) papers were presented and discussed during the meeting. Additionally an open discussion was held on the possible directions of the IAEA programme on Computer Aided Safety Analysis. A summary of the conclusions of these discussions is presented in the publication. The remainder of this proceedings volume comprises the transcript of selected technical papers (22) presented in the meeting. It is the intention of the IAEA that the publication of these proceedings will extend the benefits of the discussions held during the meeting to a larger audience throughout the world. The Technical Committee/Workshop on Computer Aided Safety Analysis was organized by the IAEA in cooperation with the National Board for Safety and Radiological Protection (SAAS) of the German Democratic Republic in Berlin. The purpose of the meeting was to provide an opportunity for discussions on experiences in the use of computer codes used for safety analysis of nuclear power plants. In particular it was intended to provide a forum for exchange of information among experts using computer codes for safety analysis under the Technical Cooperation Programme on Safety of WWER Type Reactors (RER/9/004) and other experts throughout the world. A separate abstract was prepared for each of the 22 selected papers. Refs, figs tabs and pictures

  6. STAF: A Powerful and Sophisticated CAI System.

    Science.gov (United States)

    Loach, Ken

    1982-01-01

    Describes the STAF (Science Teacher's Authoring Facility) computer-assisted instruction system developed at Leeds University (England), focusing on STAF language and major program features. Although programs for the system emphasize physical chemistry and organic spectroscopy, the system and language are general purpose and can be used in any…

  7. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, averages, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process, as compared to manipulating computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, functionality rarely available in similar programs. Given its characteristic of quick prototyping and simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
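
    As an illustration of the core quantity such a program computes before kriging, the sketch below estimates an experimental omnidirectional semivariogram from scattered observations. It is a generic re-implementation in Python with synthetic data, not the original Borland Delphi code.

```python
import numpy as np

# Minimal experimental (omnidirectional) semivariogram:
# gamma(h) = mean of 0.5 * (z_i - z_j)^2 over pairs whose separation falls in each lag bin.

def experimental_semivariogram(coords, values, lag_width, n_lags):
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    sums = np.zeros(n_lags)
    counts = np.zeros(n_lags, dtype=int)
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            h = np.linalg.norm(coords[i] - coords[j])   # pair separation distance
            k = int(h // lag_width)                     # lag bin index
            if k < n_lags:
                sums[k] += 0.5 * (values[i] - values[j]) ** 2
                counts[k] += 1
    lags = (np.arange(n_lags) + 0.5) * lag_width
    gamma = np.divide(sums, counts, out=np.full(n_lags, np.nan), where=counts > 0)
    return lags, gamma

# Tiny synthetic example: soil-property values on a 5 x 5 grid (units arbitrary).
xy = [(x, y) for x in range(5) for y in range(5)]
z = [np.sin(0.5 * x) + 0.1 * y for x, y in xy]
lags, gamma = experimental_semivariogram(xy, z, lag_width=1.0, n_lags=5)
print(np.round(gamma, 3))
```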

  8. Computational analysis of cerebral cortex

    Energy Technology Data Exchange (ETDEWEB)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)

    2010-08-15

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  9. Computational analysis of cerebral cortex

    International Nuclear Information System (INIS)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni

    2010-01-01

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  10. Computer aided analysis of disturbances

    International Nuclear Information System (INIS)

    Baldeweg, F.; Lindner, A.

    1986-01-01

    Computer-aided analysis of disturbances and the prevention of failures (diagnosis and therapy control) in technological plants are among the most important tasks of process control. Research in this field is very intensive due to increasing requirements on the security and economy of process control and due to a remarkable increase in the efficiency of digital electronics. This publication is concerned with the analysis of disturbances in complex technological plants, especially in so-called high-risk processes. The presentation emphasizes the theoretical concept of diagnosis and therapy control, modelling of the disturbance behaviour of the technological process, and man-machine communication integrating artificial intelligence methods, e.g., the expert system approach. An application is given for nuclear power plants. (author)

  11. Effectiveness of Using Computer-Assisted Supplementary Instruction for Teaching the Mole Concept

    Science.gov (United States)

    Yalçinalp, Serpil; Geban, Ömer; Özkan, Ilker

    This study examined the effect of computer-assisted instruction (CAI), used as a problem-solving supplement to classroom instruction, on students' understanding of chemical formulas and the mole concept, their attitudes toward chemistry subjects, and CAI. The objective was to assess the effectiveness of CAI over recitation hours when both teaching methods were used as a supplement to traditional chemistry instruction. We randomly selected two classes in a secondary school. Each teaching strategy was randomly assigned to one class. The experimental group received supplementary instruction delivered via CAI, while the control group received similar instruction through recitation hours. The data were analyzed using two-way analysis of variance and the t-test. It was found that the students who used the CAI accompanied by lectures scored significantly higher than those who attended recitation hours, in terms of school subject achievement in chemistry and attitudes toward chemistry subjects. In addition, there was a significant improvement in the attitudes of students in the experimental group toward the use of computers in a chemistry course. There was no significant difference between the performances of females and males in each treatment group.
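
    As a generic illustration of the kind of group comparison reported above (not the study's actual analysis or data), a two-sample t-test between a CAI-supplemented group and a recitation group might look like the following; the scores are invented.

```python
from scipy import stats

# Illustrative comparison of achievement scores between a CAI-supplemented group
# and a recitation group. The numbers are made up, not data from the study.
cai_group = [72, 85, 78, 90, 66, 81, 77, 88, 74, 83]
recitation_group = [65, 70, 72, 60, 75, 68, 71, 64, 69, 73]

t_stat, p_value = stats.ttest_ind(cai_group, recitation_group, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```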

  12. An Evaluation of the Cognitive and Affective Performance of an Integrated Set of CAI Materials in the Principles of Macroeconomics. Studies in Economic Education, No. 4.

    Science.gov (United States)

    Daellenbach, Lawrence A.; And Others

    The purpose of this study was to determine the effect of computer assisted instruction (CAI) on the cognitive and affective development of college students enrolled in a principles of macroeconomics course. The hypotheses of the experiment were stated as follows: In relation to the traditional principles course, the experimental treatment will…

  13. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
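
    To make the solution-vector idea concrete, here is a minimal sketch (not PCTAP itself) of building a dependency-ordered component list and stepping through it; the Component class, the toy relaxation update, and all numerical values are illustrative assumptions.

```python
# Minimal sketch of a dependency-ordered component update loop in the spirit of
# the solution-vector approach described above; not PCTAP's actual data model.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Component:
    name: str
    upstream: Optional["Component"] = None   # component feeding this one's inlet
    outlet_temp: float = 293.0               # K, assumed initial condition


def build_solution_vector(components: List[Component]) -> List[Component]:
    """Order components so each appears after the component feeding its inlet."""
    ordered: List[Component] = []
    remaining = list(components)
    while remaining:
        for c in list(remaining):
            if c.upstream is None or c.upstream in ordered:
                ordered.append(c)
                remaining.remove(c)
    return ordered


def step(ordered: List[Component], dt: float) -> None:
    """One time step: update each component's outlet state in dependency order."""
    for c in ordered:
        inlet = c.upstream.outlet_temp if c.upstream else 300.0  # assumed source
        # toy first-order relaxation of the outlet toward the inlet temperature
        c.outlet_temp += (inlet - c.outlet_temp) * min(1.0, dt / 10.0)


tube = Component("tube")
cold_plate = Component("cold_plate", upstream=tube)
hx = Component("heat_exchanger", upstream=cold_plate)
vector = build_solution_vector([hx, cold_plate, tube])
for _ in range(100):
    step(vector, dt=1.0)
print([(c.name, round(c.outlet_temp, 1)) for c in vector])
```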

  14. Piping stress analysis with personal computers

    International Nuclear Information System (INIS)

    Revesz, Z.

    1987-01-01

    The growing market for personal computers is providing an increasing number of professionals with unprecedented and surprisingly inexpensive computing capacity which, when used with powerful software, can immensely enhance an engineer's capabilities. This paper focuses on the possibilities opened up in piping stress analysis by the widespread distribution of personal computers, on the necessary changes in the software, and on the limitations of using personal computers for engineering design and analysis. Reliability and quality assurance aspects of using personal computers for nuclear applications are also mentioned. The paper concludes with personal views of the author and experiences gained during the development of interactive graphic piping software for personal computers. (orig./GL)

  15. Oxygen isotope variations at the margin of a CAI records circulation within the solar nebula.

    Science.gov (United States)

    Simon, Justin I; Hutcheon, Ian D; Simon, Steven B; Matzel, Jennifer E P; Ramon, Erick C; Weber, Peter K; Grossman, Lawrence; DePaolo, Donald J

    2011-03-04

    Micrometer-scale analyses of a calcium-, aluminum-rich inclusion (CAI) and the characteristic mineral bands mantling the CAI reveal that the outer parts of this primitive object have a large range of oxygen isotope compositions. The variations are systematic; the relative abundance of (16)O first decreases toward the CAI margin, approaching a planetary-like isotopic composition, then shifts to extremely (16)O-rich compositions through the surrounding rim. The variability implies that CAIs probably formed from several oxygen reservoirs. The observations support early and short-lived fluctuations of the environment in which CAIs formed, either because of transport of the CAIs themselves to distinct regions of the solar nebula or because of varying gas composition near the proto-Sun.

  16. Computer-Assisted, Programmed Text, and Lecture Modes of Instruction in Three Medical Training Courses: Comparative Evaluation. Final Report.

    Science.gov (United States)

    Deignan, Gerard M.; And Others

    This report contains a comparative analysis of the differential effectiveness of computer-assisted instruction (CAI), programmed instructional text (PIT), and lecture methods of instruction in three medical courses--Medical Laboratory, Radiology, and Dental. The summative evaluation includes (1) multiple regression analyses conducted to predict…

  17. An experimental study of fuel injection strategies in CAI gasoline engine

    Energy Technology Data Exchange (ETDEWEB)

    Hunicz, J.; Kordos, P. [Department of Combustion Engines and Transport, Lublin University of Technology, Nadbystrzycka 36, 20-618 Lublin (Poland)

    2011-01-15

    Combustion of gasoline in a direct injection controlled auto-ignition (CAI) single-cylinder research engine was studied. CAI operation was achieved with the use of the negative valve overlap (NVO) technique and internal exhaust gas re-circulation (EGR). Experiments were performed with single injection and with split injection, where some amount of fuel was injected close to top dead centre (TDC) during the NVO interval and the second injection was applied with variable timing. Additionally, combustion at variable fuel-rail pressure was examined. The investigation showed that when fuel was injected into the recompressed exhaust, fuel reforming took place. This process was identified via an analysis of the exhaust-fuel mixture composition after the NVO interval. It was found that with a single fuel injection in the NVO phase, its advance determined the heat release rate and auto-ignition timing, and had a strong influence on NO{sub X} emission. However, delaying the single injection to the intake stroke resulted in deterioration of cycle-to-cycle variability. Application of split injection showed benefits of this strategy over single injection. Examination of different fuel mass split ratios and variable second injection timing resulted in further optimisation of mixture formation. With an equal share of the fuel mass injected in the first injection during NVO and in the second injection at the beginning of compression, the lowest emission level and an improvement in cyclic variability were observed. (author)

  18. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  19. Computer Assisted Instruction

    Science.gov (United States)

    Higgins, Paul

    1976-01-01

    Methodology for developing a computer assisted instruction (CAI) lesson (scripting, programing, and testing) is reviewed. A project done by Informatics Education Ltd. (IEL) for the Department of National Defense (DND) is used as an example. (JT)

  20. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  1. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  2. Batch Computed Tomography Analysis of Projectiles

    Science.gov (United States)

    2016-05-01

    ARL-TR-7681, May 2016, US Army Research Laboratory: Batch Computed Tomography Analysis of Projectiles, by Michael C Golt and Matthew S Bratcher, Weapons and Materials Research... Abstract fragment: "...values to account for projectile variability in the ballistic evaluation of armor." Subject terms: computed tomography, CT, BS41, projectiles

  3. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available The basic statistical methods used in the genetic analysis of human traits are described: segregation analysis, linkage analysis, and allelic association studies. Software supporting the implementation of these methods has been developed.

  4. Numerical Investigation Into Effect of Fuel Injection Timing on CAI/HCCI Combustion in a Four-Stroke GDI Engine

    Science.gov (United States)

    Cao, Li; Zhao, Hua; Jiang, Xi; Kalian, Navin

    2006-02-01

    The Controlled Auto-Ignition (CAI) combustion, also known as Homogeneous Charge Compression Ignition (HCCI), was achieved by trapping residuals with early exhaust valve closure in conjunction with direct injection. Multi-cycle 3D engine simulations have been carried out for parametric study on four different injection timings in order to better understand the effects of injection timings on in-cylinder mixing and CAI combustion. The full engine cycle simulation including complete gas exchange and combustion processes was carried out over several cycles in order to obtain the stable cycle for analysis. The combustion models used in the present study are the Shell auto-ignition model and the characteristic-time combustion model, which were modified to take the high level of EGR into consideration. A liquid sheet breakup spray model was used for the droplet breakup processes. The analyses show that the injection timing plays an important role in affecting the in-cylinder air/fuel mixing and mixture temperature, which in turn affects the CAI combustion and engine performance.

  5. The Impact of Different Support Vectors on GOSAT-2 CAI-2 L2 Cloud Discrimination

    Directory of Open Access Journals (Sweden)

    Yu Oishi

    2017-11-01

    Full Text Available Greenhouse gases Observing SATellite-2 (GOSAT-2) will be launched in fiscal year 2018. GOSAT-2 will be equipped with two sensors: the Thermal and Near-infrared Sensor for Carbon Observation-Fourier Transform Spectrometer 2 (TANSO-FTS-2) and the TANSO-Cloud and Aerosol Imager 2 (CAI-2). CAI-2 is a push-broom imaging sensor that has forward- and backward-looking bands to observe the optical properties of aerosols and clouds and to monitor the status of urban air pollution and transboundary air pollution over oceans, such as PM2.5 (particles less than 2.5 micrometers in diameter). CAI-2 has important applications for cloud discrimination in each viewing direction. The Cloud and Aerosol Unbiased Decision Intellectual Algorithm (CLAUDIA1), which applies sequential threshold tests to features, is used for GOSAT CAI L2 cloud flag processing. If CLAUDIA1 is used with CAI-2, it is necessary to optimize the thresholds in accordance with CAI-2. However, CLAUDIA3 with support vector machines (SVM), a supervised pattern recognition method, was developed, and we therefore applied CLAUDIA3 to GOSAT-2 CAI-2 L2 cloud discrimination processing. Thus, CLAUDIA3 can automatically find the optimized boundary between clear and cloudy areas. Improvements in CLAUDIA3 using CAI (CLAUDIA3-CAI) continue to be made. In this study, we examined the impact of various support vectors (SV) on GOSAT-2 CAI-2 L2 cloud discrimination by analyzing (1) the impact of the choice of different time periods for the training data and (2) the impact of different generation procedures for SV on the cloud discrimination efficiency. To generate SV for CLAUDIA3-CAI from MODIS data, there are two points at which features corresponding to CAI bands can be extracted. One procedure is equivalent to generating SV using CAI data. Another procedure generates SV for MODIS cloud discrimination at the beginning, and then extracts the decision function, thresholds, and SV corresponding to CAI bands. Our results indicated the following
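
    As a rough illustration of the SVM-based discrimination behind CLAUDIA3, the sketch below trains a support vector classifier on synthetic two-class "clear"/"cloudy" feature vectors with scikit-learn; the feature values stand in for real CAI-2 band observations and are invented.

```python
# Minimal sketch of SVM-based cloud discrimination on synthetic per-pixel
# band features labelled clear (0) or cloudy (1); not the CLAUDIA3 code.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# toy reflectance-like features standing in for CAI-2 band observations
clear = rng.normal([0.05, 0.10, 0.08], 0.02, size=(n, 3))
cloudy = rng.normal([0.45, 0.50, 0.40], 0.10, size=(n, 3))
X = np.vstack([clear, cloudy])
y = np.array([0] * n + [1] * n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
# The fitted support vectors play the role of the "SV" whose choice is studied above.
print("support vectors per class:", model.named_steps["svc"].n_support_)
```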

  6. The Effectiveness of Computer-Assisted Instruction to Teach Physical Examination to Students and Trainees in the Health Sciences Professions: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott

    2017-01-01

    To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach. A systematic review and meta-analysis of literature published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses included the DerSimonian-Laird random-effects model. In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: -2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: -5.30 to 6.01). The evidence does not suggest a strong consistent preference for either CAI or F2F instruction to teach students/trainees PE. Further research is needed to identify conditions under which knowledge and skill acquisition outcomes favor one mode of instruction over the other.
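
    For readers unfamiliar with the DerSimonian-Laird random-effects model cited above, the following sketch shows the pooling computation on invented per-study mean differences and variances; it is not the authors' analysis.

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling of mean differences.
# Per-study effects and variances below are invented for illustration only.
import numpy as np

effects = np.array([4.0, 7.5, -1.2, 6.3, 2.1])      # per-study mean differences
variances = np.array([4.0, 9.0, 6.2, 5.5, 3.8])     # per-study sampling variances

w = 1.0 / variances                                  # fixed-effect weights
fixed_mean = np.sum(w * effects) / np.sum(w)
q = np.sum(w * (effects - fixed_mean) ** 2)          # Cochran's Q
df = len(effects) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)                        # between-study variance

w_star = 1.0 / (variances + tau2)                    # random-effects weights
pooled = np.sum(w_star * effects) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
print(f"pooled MD = {pooled:.2f}, "
      f"95% CI = ({pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f})")
```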

  7. Structural basis of Na+-independent and cooperative substrate/product antiport in CaiT

    NARCIS (Netherlands)

    Schulze, Sabrina; Köster, Stefan; Geldmacher, Ulrike; Terwisscha van Scheltinga, Anke C.; Kühlbrandt, Werner

    2010-01-01

    Transport of solutes across biological membranes is performed by specialized secondary transport proteins in the lipid bilayer, and is essential for life. Here we report the structures of the sodium-independent carnitine/butyrobetaine antiporter CaiT from Proteus mirabilis (PmCaiT) at 2.3-Å and from

  8. Impact analysis on a massively parallel computer

    International Nuclear Information System (INIS)

    Zacharia, T.; Aramayo, G.A.

    1994-01-01

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  9. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  10. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  11. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  12. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  13. A computational description of simple mediation analysis

    Directory of Open Access Journals (Sweden)

    Caron, Pier-Olivier

    2018-04-01

    Full Text Available Simple mediation analysis is an increasingly popular statistical analysis in psychology and in other social sciences. However, there are very few detailed accounts of the computations within the model. Articles more often focus on explaining mediation analysis conceptually rather than mathematically. Thus, the purpose of the current paper is to introduce the computational modelling within simple mediation analysis, accompanied by examples in R. Firstly, mediation analysis will be described. Then, the method to simulate data in R (with standardized coefficients) will be presented. Finally, the bootstrap method, the Sobel test and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), will be developed. The R code to implement the computations presented is offered, as well as a script to carry out a power analysis and a complete example.
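
    The paper's worked examples are in R; purely as an illustration of the a-path/b-path estimation and the Sobel test it describes, here is a small Python sketch on simulated data (sample size and coefficients are arbitrary assumptions).

```python
# Minimal sketch of simple mediation with simulated data and a Sobel test.
# The coefficients and sample size are invented; this is not the paper's R code.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)              # mediator: a-path = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)    # outcome: b-path = 0.4, direct = 0.2

fit_a = sm.OLS(m, sm.add_constant(x)).fit()                        # M ~ X
fit_b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()  # Y ~ X + M

a, se_a = fit_a.params[1], fit_a.bse[1]
b, se_b = fit_b.params[2], fit_b.bse[2]
indirect = a * b
sobel_se = np.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)
print(f"indirect effect = {indirect:.3f}, Sobel z = {indirect / sobel_se:.2f}")
```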

  14. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  15. Computer assisted functional analysis. Computer gestuetzte funktionelle Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, H A.E.; Roesler, H

    1982-01-01

    The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.

  16. DFT computational analysis of piracetam

    Science.gov (United States)

    Rajesh, P.; Gunasekaran, S.; Seshadri, S.; Gnanasambandan, T.

    2014-11-01

    Density functional theory calculations with B3LYP using the 6-31G(d,p) and 6-31++G(d,p) basis sets have been used to determine ground state molecular geometries. The first order hyperpolarizability (β0) and related properties (β, α0 and Δα) of piracetam are calculated using the B3LYP/6-31G(d,p) method within the finite-field approach. The stability of the molecule has been analyzed using NBO/NLMO analysis. The calculation of the first hyperpolarizability shows that the molecule is attractive for future applications in non-linear optics. The molecular electrostatic potential (MEP) at a point in the space around a molecule gives an indication of the net electrostatic effect produced at that point by the total charge distribution of the molecule. The calculated HOMO and LUMO energies show that charge transfer occurs within these molecules. Mulliken population analysis of atomic charges is also presented. On the basis of the vibrational analysis, the thermodynamic properties of the title compound at different temperatures have been calculated. Finally, the UV-Vis spectra and electronic absorption properties are explained and illustrated from the frontier molecular orbitals.

  17. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
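
    The GRESS compiler itself is not shown here, but the underlying "computer calculus" idea, carrying derivatives through a computation so that sensitivities emerge alongside values, can be illustrated with forward-mode automatic differentiation via dual numbers; the toy model function is an assumption.

```python
# Minimal sketch of forward-mode automatic differentiation with dual numbers,
# illustrating the concept behind derivative-generating compilers; not GRESS.
from dataclasses import dataclass
import math


@dataclass
class Dual:
    val: float   # function value
    der: float   # derivative with respect to the chosen input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__


def dsin(x: Dual) -> Dual:
    # chain rule: d/dk sin(x(k)) = cos(x) * dx/dk
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)


def model(k: Dual) -> Dual:
    # toy response: f(k) = k * sin(k) + 2k (an assumed stand-in for a real model)
    return k * dsin(k) + 2.0 * k


k = Dual(1.3, 1.0)          # seed derivative dk/dk = 1
out = model(k)
print(f"f(k) = {out.val:.4f}, sensitivity df/dk = {out.der:.4f}")
```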

  18. Computer-Assisted Instruction: Authoring Languages. ERIC Digest.

    Science.gov (United States)

    Reeves, Thomas C.

    One of the most perplexing tasks in producing computer-assisted instruction (CAI) is the authoring process. Authoring is generally defined as the process of turning the flowcharts, control algorithms, format sheets, and other documentation of a CAI program's design into computer code that will operationalize the simulation on the delivery system.…

  19. Turbo Pascal Computer Code for PIXE Analysis

    International Nuclear Information System (INIS)

    Darsono

    2002-01-01

    To optimal utilization of the 150 kV ion accelerator facilities and to govern the analysis technique using ion accelerator, the research and development of low energy PIXE technology has been done. The R and D for hardware of the low energy PIXE installation in P3TM have been carried on since year 2000. To support the R and D of PIXE accelerator facilities in harmonize with the R and D of the PIXE hardware, the development of PIXE software for analysis is also needed. The development of database of PIXE software for analysis using turbo Pascal computer code is reported in this paper. This computer code computes the ionization cross-section, the fluorescence yield, and the stopping power of elements also it computes the coefficient attenuation of X- rays energy. The computer code is named PIXEDASIS and it is part of big computer code planed for PIXE analysis that will be constructed in the near future. PIXEDASIS is designed to be communicative with the user. It has the input from the keyboard. The output shows in the PC monitor, which also can be printed. The performance test of the PIXEDASIS shows that it can be operated well and it can provide data agreement with data form other literatures. (author)

  20. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  1. Computer graphics in reactor safety analysis

    International Nuclear Information System (INIS)

    Fiala, C.; Kulak, R.F.

    1989-01-01

    This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the modelling of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. The graphics output used in actual safety analysis are used to illustrate the capabilities of each code. 5 refs., 10 figs

  2. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but requires more computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recent increasing interest in reducing this margin makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
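
    A minimal sketch of the sampling-based approach: an uncertain input parameter is sampled, a (here, fake) criticality calculation is rerun for each sample, and the spread of k_eff is read off. The surrogate model and all numbers are illustrative assumptions, not a real criticality code.

```python
# Minimal sketch of sampling-based k_eff uncertainty propagation.
# The "criticality calculation" below is a fake surrogate for illustration only.
import numpy as np

rng = np.random.default_rng(42)


def fake_keff(enrichment: float) -> float:
    """Stand-in for one Monte Carlo criticality run; returns k_eff for one input."""
    return 0.90 + 0.02 * enrichment + rng.normal(0.0, 0.0005)  # includes MC noise


nominal, sigma = 3.0, 0.05          # assumed enrichment (wt%) and its 1-sigma uncertainty
samples = rng.normal(nominal, sigma, size=200)
keffs = np.array([fake_keff(e) for e in samples])

print(f"mean k_eff = {keffs.mean():.5f}")
print(f"uncertainty in k_eff (1 sigma) = {keffs.std(ddof=1):.5f}")
```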

  3. ASTEC: Controls analysis for personal computers

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  4. Temporal fringe pattern analysis with parallel computing

    International Nuclear Information System (INIS)

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-01-01

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution periods were reduced by 1.6 times when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis
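
    As an illustration of the single-program multiple-data idea applied to a frame sequence, the sketch below farms the same per-frame computation out to a pool of worker processes; the phase-extraction step is a toy stand-in for the authors' fringe analysis.

```python
# Minimal SPMD-style sketch: identical per-frame analysis run by worker processes.
# The per-frame phase extraction is a toy stand-in, not the authors' algorithm.
import numpy as np
from multiprocessing import Pool


def analyse_frame(frame: np.ndarray) -> float:
    """Toy per-frame computation standing in for fringe-pattern phase analysis."""
    spectrum = np.fft.rfft2(frame)
    return float(np.angle(spectrum[1, 1]))   # phase of one spatial frequency


def main() -> None:
    rng = np.random.default_rng(0)
    frames = [rng.random((256, 256)) for _ in range(64)]   # synthetic sequence
    with Pool(processes=4) as pool:                        # "virtual processors"
        phases = pool.map(analyse_frame, frames)
    print("first few phases:", np.round(phases[:4], 3))


if __name__ == "__main__":
    main()
```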

  5. A computer program for activation analysis

    International Nuclear Information System (INIS)

    Rantanen, J.; Rosenberg, R.J.

    1983-01-01

    A computer program for calculating the results of activation analysis is described. The program comprises two gamma spectrum analysis programs, STOAV and SAMPO and one program for calculating elemental concentrations, KVANT. STOAV is based on a simple summation of channels and SAMPO is based on fitting of mathematical functions. The programs are tested by analyzing the IAEA G-1 test spectra. In the determination of peak location SAMPO is somewhat better than STOAV and in the determination of peak area SAMPO is more than twice as accurate as STOAV. On the other hand, SAMPO is three times as expensive as STOAV with the use of a Cyber 170 computer. (author)
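
    The two peak-area strategies contrasted above, summation of channels versus fitting of mathematical functions, can be sketched on a synthetic spectrum as follows; the peak, background, and window choices are illustrative assumptions rather than the STOAV or SAMPO implementations.

```python
# Minimal sketch: peak area by channel summation vs. Gaussian-plus-background fit.
# The synthetic spectrum and all parameters are invented for illustration only.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
channels = np.arange(400, 500)
true_area, centroid, width = 5000.0, 450.0, 3.0
background = 200.0 - 0.1 * (channels - 400)
peak = true_area / (width * np.sqrt(2 * np.pi)) * np.exp(
    -0.5 * ((channels - centroid) / width) ** 2)
counts = rng.poisson(background + peak).astype(float)

# 1) summation of channels in a window, minus an estimated flat background
window = (channels > 440) & (channels < 460)
bg_est = 0.5 * (counts[channels <= 440][-5:].mean() + counts[channels >= 460][:5].mean())
area_sum = counts[window].sum() - bg_est * window.sum()

# 2) least-squares fit of a Gaussian plus a linear background
def model(x, area, mu, sig, b0, b1):
    return (area / (sig * np.sqrt(2 * np.pi))
            * np.exp(-0.5 * ((x - mu) / sig) ** 2) + b0 + b1 * x)

popt, _ = curve_fit(model, channels, counts, p0=[3000.0, 450.0, 3.0, 200.0, 0.0])
print(f"true area {true_area:.0f}, summation {area_sum:.0f}, fit {popt[0]:.0f}")
```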

  6. CAD/CAM/CAI Application for High-Precision Machining of Internal Combustion Engine Pistons

    Directory of Open Access Journals (Sweden)

    V. V. Postnov

    2014-07-01

    Full Text Available CAD/CAM/CAI application solutions for the machining of internal combustion engine pistons were analyzed. A low-volume production technology for internal combustion engine pistons was proposed. A fixture for a CNC turning center was designed.

  7. Safety analysis of control rod drive computers

    International Nuclear Information System (INIS)

    Ehrenberger, W.; Rauch, G.; Schmeil, U.; Maertz, J.; Mainka, E.U.; Nordland, O.; Gloee, G.

    1985-01-01

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety relevant tasks. The experience gained with the control rod positioning processor confirms that computers are not less reliable than conventional instrumentation and control system for comparable tasks. The examination and evaluation of computers for safety relevant tasks can be done with programme analysis or statistical evaluation of the operating experience. Programme analysis is recommended for seldom used and well structured programmes. For programmes with a long, cumulated operating time a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process controlling computers or microprocessors can be qualified for safety relevant tasks without undue effort. (orig./HP) [de

  8. Surface computing and collaborative analysis work

    CERN Document Server

    Brown, Judith; Gossage, Stevenson; Hack, Chris

    2013-01-01

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...

  9. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  10. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis, containing hands-on problem sets that can be worked out in MS Excel or ArcGIS, as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data; Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results; Construct testable hypotheses that require inferential statistical analysis; Process spatial data, extract explanatory variables, conduct statisti...

  11. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  12. Computational analysis of a multistage axial compressor

    Science.gov (United States)

    Mamidoju, Chaithanya

    Turbomachines are used extensively in Aerospace, Power Generation, and Oil & Gas Industries. The efficiency of these machines is often an important factor and has led to the continuous effort to improve the design to achieve better efficiency. The axial flow compressor is a major component in a gas turbine, with the turbine's overall performance depending strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, Quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15 stage axial compressor is analyzed using a 3-D Navier Stokes CFD solver in a parallel computing environment. Methodology is described for steady state (frozen rotor stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, tip clearance and numerical issues such as turbulence model choice, advection model choice, and parallel processing performance are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations of the flow features observed in the computational study are given. The total pressure rise versus mass flow rate was computed.

  13. Computation system for nuclear reactor core analysis

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system and the aspects common to the computational modules are discussed here; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals

  14. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  15. Plasma geometric optics analysis and computation

    International Nuclear Information System (INIS)

    Smith, T.M.

    1983-01-01

    Important practical applications in the generation, manipulation, and diagnosis of laboratory thermonuclear plasmas have created a need for elaborate computational capabilities in the study of high frequency wave propagation in plasmas. A reduced description of such waves suitable for digital computation is provided by the theory of plasma geometric optics. The existing theory is beset by a variety of special cases in which the straightforward analytical approach fails, and has been formulated with little attention to problems of numerical implementation of that analysis. The standard field equations are derived for the first time from kinetic theory. A discussion of certain terms previously, and erroneously, omitted from the expansion of the plasma constitutive relation is given. A powerful but little known computational prescription for determining the geometric optics field in the neighborhood of caustic singularities is rigorously developed, and a boundary layer analysis for the asymptotic matching of the plasma geometric optics field across caustic singularities is performed for the first time with considerable generality. A proper treatment of birefringence is detailed, wherein a breakdown of the fundamental perturbation theory is identified and circumvented. A general ray tracing computer code suitable for applications to radiation heating and diagnostic problems is presented and described

  16. Analysis of electronic circuits using digital computers

    International Nuclear Information System (INIS)

    Tapu, C.

    1968-01-01

    Various programmes have been proposed for studying electronic circuits with the help of computers. It is shown here how it is possible to use the programme ECAP, developed by I.B.M., for studying the behaviour of an operational amplifier from different points of view: direct current, alternating current and transient state analysis, optimisation of the gain in open loop, and study of the reliability. (author) [fr

  17. Computational Chemical Synthesis Analysis and Pathway Design

    Directory of Open Access Journals (Sweden)

    Fan Feng

    2018-06-01

    Full Text Available With the idea of retrosynthetic analysis, which was raised in the 1960s, chemical synthesis analysis and pathway design have been transformed from a complex problem to a regular process of structural simplification. This review aims to summarize the developments in computer-assisted synthetic analysis and design in recent years, and how machine-learning algorithms have contributed to them. The LHASA system started the pioneering work of designing semi-empirical reaction modes in computers, with the rule-based and network-searching work that followed not only expanding the databases, but also building new approaches to representing reaction rules. Programs like ARChem Route Designer replaced hand-coded reaction modes with automatically extracted rules, and programs like Chematica changed traditional designing into network searching. Afterward, with the help of machine learning, two-step models which combine reaction rules and statistical methods became the mainstream. Recently, fully data-driven learning methods using deep neural networks, which do not require any prior knowledge, were applied to this field. Up to now, however, these methods still cannot replace experienced human organic chemists due to their relatively low accuracies. Future new algorithms, with the aid of powerful computational hardware, will make this topic promising and with good prospects.

  18. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as the prompt reconstruction, the data streaming, iterative executions of calibration and alignment, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  19. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as the prompt reconstruction, the data streaming, iterative executions of calibration and alignment, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work

  20. A multielement isotopic study of refractory FUN and F CAIs: Mass-dependent and mass-independent isotope effects

    Science.gov (United States)

    Kööp, Levke; Nakashima, Daisuke; Heck, Philipp R.; Kita, Noriko T.; Tenner, Travis J.; Krot, Alexander N.; Nagashima, Kazuhide; Park, Changkun; Davis, Andrew M.

    2018-01-01

    Calcium-aluminum-rich inclusions (CAIs) are the oldest dated objects that formed inside the Solar System. Among these are rare, enigmatic objects with large mass-dependent fractionation effects (F CAIs), which sometimes also have large nucleosynthetic anomalies and a low initial abundance of the short-lived radionuclide 26Al (FUN CAIs). We have studied seven refractory hibonite-rich CAIs and one grossite-rich CAI from the Murchison (CM2) meteorite for their oxygen, calcium, and titanium isotopic compositions. The 26Al-26Mg system was also studied in seven of these CAIs. We found mass-dependent heavy isotope enrichment in all measured elements, but never simultaneously in the same CAI. The data are hard to reconcile with a single-stage melt evaporation origin and may require reintroduction or reequilibration for magnesium, oxygen and titanium after evaporation for some of the studied CAIs. The initial 26Al/27Al ratios inferred from model isochrons span a range from <1 × 10^-6 to canonical (∼5 × 10^-5). The CAIs show a mutual exclusivity relationship between inferred incorporation of live 26Al and the presence of resolvable anomalies in 48Ca and 50Ti. Furthermore, a relationship exists between 26Al incorporation and Δ17O in the hibonite-rich CAIs (i.e., 26Al-free CAIs have resolved variations in Δ17O, while CAIs with resolved 26Mg excesses have Δ17O values close to -23‰). Only the grossite-rich CAI has a relatively enhanced Δ17O value (∼-17‰) in spite of a near-canonical 26Al/27Al. We interpret these data as indicating that fractionated hibonite-rich CAIs formed over an extended time period and sampled multiple stages in the isotopic evolution of the solar nebula, including: (1) an 26Al-poor nebula with large positive and negative anomalies in 48Ca and 50Ti and variable Δ17O; (2) a stage of 26Al-admixture, during which anomalies in 48Ca and 50Ti had been largely diluted and a Δ17O value of ∼-23‰ had been achieved in the CAI formation region; and (3

  1. Computer-Assisted Mathematics Instruction for Students with Specific Learning Disability: A Review of the Literature

    Science.gov (United States)

    Stultz, Sherry L.

    2017-01-01

    This review was conducted to evaluate the current body of scholarly research regarding the use of computer-assisted instruction (CAI) to teach mathematics to students with specific learning disability (SLD). For many years, computers have been utilized for educational purposes. However, the effectiveness of CAI for teaching mathematics to this specific…

  2. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works, as well as how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author’s website and SpringerLink.

  3. Aerodynamic analysis of Pegasus - Computations vs reality

    Science.gov (United States)

    Mendenhall, Michael R.; Lesieutre, Daniel J.; Whittaker, C. H.; Curry, Robert E.; Moulton, Bryan

    1993-01-01

    Pegasus, a three-stage, air-launched, winged space booster was developed to provide fast and efficient commercial launch services for small satellites. The aerodynamic design and analysis of Pegasus was conducted without benefit of wind tunnel tests using only computational aerodynamic and fluid dynamic methods. Flight test data from the first two operational flights of Pegasus are now available, and they provide an opportunity to validate the accuracy of the predicted pre-flight aerodynamic characteristics. Comparisons of measured and predicted flight characteristics are presented and discussed. Results show that the computational methods provide reasonable aerodynamic design information with acceptable margins. Post-flight analyses illustrate certain areas in which improvements are desired.

  4. The enhancement of students’ mathematical representation in junior high school using cognitive apprenticeship instruction (CAI)

    Science.gov (United States)

    Yusepa, B. G. P.; Kusumah, Y. S.; Kartasasmita, B. G.

    2018-03-01

    This study aims to get an in-depth understanding of the enhancement of students’ mathematical representation. This study is experimental research with a pretest-posttest control group design. The subjects of this study are eighth-grade students from junior high schools in Bandung: one high-level and one middle-level school. In each school, two parallel groups were chosen as a control group and an experimental group. The experimental group was given cognitive apprenticeship instruction (CAI) treatment while the control group was given conventional learning. The results show that the enhancement of mathematical representation for students who received CAI treatment was better than for those taught conventionally, as observed overall, by mathematical prior knowledge (MPK), and by school level. It can be concluded that CAI can be used as a good alternative learning model to enhance students’ mathematical representation.

  5. Computed image analysis of neutron radiographs

    International Nuclear Information System (INIS)

    Dinca, M.; Anghel, E.; Preda, M.; Pavelescu, M.

    2008-01-01

    Similar to X-radiography, there is in practice a nondestructive technique that uses neutrons as the penetrating particle, named neutron radiology. When the information is registered on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β,γ) creating a latent image, the technique is named neutron radiography. A radiographic industrial film that contains the image of the internal structure of an object, obtained by neutron radiography, must be subsequently analyzed to obtain qualitative and quantitative information about the structural integrity of that object. It is possible to perform a computed analysis of a film using a facility with the following main components: an illuminator for the film, a CCD video camera and a computer (PC) with suitable software. The qualitative analysis aims to reveal possible anomalies of the structure due to manufacturing processes or induced by working processes (for example, the irradiation activity in the case of the nuclear fuel). The quantitative determination is based on measurements of some image parameters: dimensions and optical densities. The illuminator has been built specially to perform this application but can also be used for simple visual observation. The illuminated area is 9x40 cm. The frame of the system is an Abbe comparator of Carl Zeiss Jena type, which has been adapted for this application. The video camera captures the image, which is stored and processed by the computer. A special program, SIMAG-NG, has been developed at INR Pitesti which, besides the program SMTV II of the special acquisition module SM 5010, can analyze the images of a film. The major application of the system was the quantitative analysis of a film that contains the images of some nuclear fuel pins beside a dimensional standard. The system was used to measure the length of the pellets of the TRIGA nuclear fuel. (authors)

  6. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classification in the context of the social sciences. It also covers various real-life examples such as t

  7. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  8. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using an automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
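
    As a small illustration of the stochastic perturbation technique driven by symbolic differentiation (here with SymPy rather than MAPLE), the sketch below expands an assumed response function about the mean of a random input and reads off approximate moments.

```python
# Minimal sketch of second-order stochastic perturbation via symbolic calculus.
# The response function f(b) is an arbitrary illustrative assumption.
import sympy as sp

b, b0, sigma = sp.symbols("b b0 sigma", positive=True)
f = sp.exp(-b) / b            # assumed response of the system to random input b

d1 = sp.diff(f, b).subs(b, b0)      # first derivative at the mean
d2 = sp.diff(f, b, 2).subs(b, b0)   # second derivative at the mean

mean_f = f.subs(b, b0) + sp.Rational(1, 2) * d2 * sigma**2   # E[f] to 2nd order
var_f = d1**2 * sigma**2                                     # Var[f] to 1st order

print("E[f]   ~", sp.simplify(mean_f))
print("Var[f] ~", sp.simplify(var_f))
```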

  9. Thermal and chemical evolution in the early solar system as recorded by FUN CAIs: Part I - Petrology, mineral chemistry, and isotopic composition of Allende FUN CAI CMS-1

    Science.gov (United States)

    Williams, C. D.; Ushikubo, T.; Bullock, E. S.; Janney, P. E.; Hines, R. R.; Kita, N. T.; Hervig, R. L.; MacPherson, G. J.; Mendybaev, R. A.; Richter, F. M.; Wadhwa, M.

    2017-03-01

    Detailed petrologic, geochemical and isotopic analyses of a new FUN CAI from the Allende CV3 meteorite (designated CMS-1) indicate that it formed by extensive melting and evaporation of primitive precursor material(s). The precursor material(s) condensed in a 16O-rich region (δ17O and δ18O ∼ -49‰) of the inner solar nebula dominated by gas of solar composition at total pressures of ∼10^-3 to 10^-6 bar. Subsequent melting of the precursor material(s) was accompanied by evaporative loss of magnesium, silicon and oxygen resulting in large mass-dependent isotope fractionations in these elements (δ25Mg = 30.71-39.26‰, δ29Si = 14.98-16.65‰, and δ18O = -41.57 to -15.50‰). This evaporative loss resulted in a bulk composition similar to that of compact Type A and Type B CAIs, but very distinct from the composition of the original precursor condensate(s). Kinetic fractionation factors and the measured mass-dependent fractionation of silicon and magnesium in CMS-1 suggest that ∼80% of the silicon and ∼85% of the magnesium were lost from its precursor material(s) through evaporative processes. These results suggest that the precursor material(s) of normal and FUN CAIs condensed in similar environments, but subsequently evolved under vastly different conditions such as total gas pressure. The chemical and isotopic differences between normal and FUN CAIs could be explained by sorting of early solar system materials into distinct physical and chemical regimes, in conjunction with discrete heating events, within the protoplanetary disk.

  10. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the addition of new computers to the network and the removal of old computers from it are considered. Meanwhile, the computers on the network are equipped with antivirus software. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis to control the spread of computer virus.
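
    The abstract does not give the model equations; purely as an illustrative sketch of this kind of compartment model, the following Python code integrates a generic susceptible-infected virus model with recruitment of new computers and removal of old ones, using assumed parameter values and an assumed reproduction-number expression for this sketch model (not the paper's).

      # Illustrative sketch only; the paper's equations and parameters are not
      # reproduced here. S = susceptible computers, I = infected computers,
      # A = recruitment of new computers, d = removal of old computers,
      # beta = infection rate, gamma = cure rate via antivirus software.
      import numpy as np
      from scipy.integrate import odeint

      A, d, beta, gamma = 5.0, 0.01, 0.0005, 0.2   # assumed values

      def rhs(y, t):
          S, I = y
          dS = A - beta * S * I - d * S + gamma * I
          dI = beta * S * I - (d + gamma) * I
          return [dS, dI]

      t = np.linspace(0.0, 200.0, 1000)
      sol = odeint(rhs, [400.0, 1.0], t)

      # For this sketch, the virus-free equilibrium is S0 = A/d and
      # R0 = beta * S0 / (d + gamma); R0 > 1 suggests an endemic state.
      R0 = beta * (A / d) / (d + gamma)
      print(f"R0 = {R0:.2f}, infected at t=200: {sol[-1, 1]:.1f}")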

  11. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Maragni, M.G.

    1992-01-01

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied with the aim of qualifying these methods at IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce the computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and the KENO-IV code gives conservative results when the generalized geometry option is not used. (author)

  12. Computational advances in transition phase analysis

    International Nuclear Information System (INIS)

    Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.

    1994-01-01

    In this paper, historical perspective and recent advances are reviewed on computational technologies to evaluate a transition phase of core disruptive accidents in liquid-metal fast reactors. An analysis of the transition phase requires treatment of multi-phase multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort began when development of the SIMMER series of computer codes was initiated in the late 1970s in the USA. Successful applications of the latest SIMMER-II in the USA, western Europe and Japan have proved its effectiveness, but, at the same time, several areas that require further research have been identified. Based on the experience and lessons learned during the SIMMER-II application through the 1980s, a new project of SIMMER-III development is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described with emphasis on recent advances in multi-phase multi-component fluid dynamics technologies and their expected implication on a future reliable transition phase analysis. (author)

  13. Stable Magnesium Isotope Variation in Melilite Mantle of Allende Type B1 CAI EK 459-5-1

    Science.gov (United States)

    Kerekgyarto, A. G.; Jeffcoat, C. R.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2014-01-01

    Ca-Al-rich inclusions (CAIs) are the earliest formed crystalline material in our solar system and they record early Solar System processes. Here we present petrographic and delta Mg-25 data of melilite mantles in a Type B1 CAI that records early solar nebular processes.

  14. Gender Role, Gender Identity and Sexual Orientation in CAIS ("XY-Women") Compared With Subfertile and Infertile 46,XX Women.

    Science.gov (United States)

    Brunner, Franziska; Fliegner, Maike; Krupp, Kerstin; Rall, Katharina; Brucker, Sara; Richter-Appelt, Hertha

    2016-01-01

    The perception of gender development of individuals with complete androgen insensitivity syndrome (CAIS) as unambiguously female has recently been challenged in both qualitative data and case reports of male gender identity. The aim of the mixed-method study presented was to examine the self-perception of CAIS individuals regarding different aspects of gender and to identify commonalities and differences in comparison with subfertile and infertile XX-chromosomal women with diagnoses of Mayer-Rokitansky-Küster-Hauser syndrome (MRKHS) and polycystic ovary syndrome (PCOS). The study sample comprised 11 participants with CAIS, 49 with MRKHS, and 55 with PCOS. Gender identity was assessed by means of a multidimensional instrument, which showed significant differences between the CAIS group and the XX-chromosomal women. Other-than-female gender roles and neither-female-nor-male sexes/genders were reported only by individuals with CAIS. The percentage with a not exclusively androphile sexual orientation was unexceptionally high in the CAIS group compared to the prevalence in "normative" women and the clinical groups. The findings support the assumption made by Meyer-Bahlburg ( 2010 ) that gender outcome in people with CAIS is more variable than generally stated. Parents and professionals should thus be open to courses of gender development other than typically female in individuals with CAIS.

  15. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed assuming the blood flow as laminar with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software, coupled with Solidworks, a modeling software, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branches and angle-shaped geometries, had previously been analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.

  16. The Use of Modular Computer-Based Lessons in a Modification of the Classical Introductory Course in Organic Chemistry.

    Science.gov (United States)

    Stotter, Philip L.; Culp, George H.

    An experimental course in organic chemistry utilized computer-assisted instructional (CAI) techniques. The CAI lessons provided tutorial drill and practice and simulated experiments and reactions. The Conversational Language for Instruction and Computing was used, along with a CDC 6400-6600 system; students scheduled and completed the lessons at…

  17. Intelligent Computer-Assisted Instruction: A Review and Assessment of ICAI Research and Its Potential for Education.

    Science.gov (United States)

    Dede, Christopher J.; And Others

    The first of five sections in this report places intelligent computer-assisted instruction (ICAI) in its historical context through discussions of traditional computer-assisted instruction (CAI) linear and branching programs; TICCIT and PLATO IV, two CAI demonstration projects funded by the National Science Foundation; generative programs, the…

  18. Cognitive Assessment Interview (CAI): Validity as a co-primary measure of cognition across phases of schizophrenia.

    Science.gov (United States)

    Ventura, Joseph; Subotnik, Kenneth L; Ered, Arielle; Hellemann, Gerhard S; Nuechterlein, Keith H

    2016-04-01

    Progress has been made in developing interview-based measures for the assessment of cognitive functioning, such as the Cognitive Assessment Interview (CAI), as co-primary measures that complement objective neurocognitive assessments and daily functioning. However, a few questions remain, including whether the relationships with objective cognitive measures and daily functioning are high enough to justify the CAI as a co-primary measure and whether patient-only assessments are valid. Participants were first-episode schizophrenia patients (n=60) and demographically-similar healthy controls (n=35), chronic schizophrenia patients (n=38) and demographically similar healthy controls (n=19). Participants were assessed at baseline with an interview-based measure of cognitive functioning (CAI), a test of objective cognitive functioning, functional capacity, and role functioning, and the first-episode patients were assessed again 6 months later (n=28). CAI ratings were correlated with objective cognitive functioning, functional capacity, and functional outcomes in first-episode schizophrenia patients at similar magnitudes as in chronic patients. Comparisons of first-episode and chronic patients with healthy controls indicated that the CAI sensitively detected deficits in schizophrenia. The relationships of CAI Patient-Only ratings with objective cognitive functioning, functional capacity, and daily functioning were comparable to CAI Rater scores that included informant information. These results confirm in an independent sample the relationship of the CAI ratings with objectively measured cognition, functional capacity, and role functioning. Comparison of schizophrenia patients with healthy controls further validates the CAI as a co-primary measure of cognitive deficits. Also, CAI change scores were strongly related to objective cognitive change, indicating sensitivity to change. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Computer-aided Fault Tree Analysis

    International Nuclear Information System (INIS)

    Willie, R.R.

    1978-08-01

    A computer-oriented methodology for deriving minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense
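
    The abstract above describes the cut-set machinery in general terms; as a toy illustration of what a minimal cut set is (not of FTAP's actual algorithm), the following Python sketch enumerates the minimal cut sets of a small hypothetical fault tree by brute force.

      # Toy sketch, not FTAP: brute-force minimal cut sets of a small fault tree.
      # A cut set is a set of basic events whose occurrence forces the top event;
      # it is minimal if no proper subset also does so.
      from itertools import combinations

      events = ["pump_A", "pump_B", "valve", "power"]

      def top_event(state):
          # Hypothetical tree: TOP = power OR (valve AND (pump_A OR pump_B))
          return state["power"] or (state["valve"] and (state["pump_A"] or state["pump_B"]))

      def is_cut_set(subset):
          return top_event({e: (e in subset) for e in events})

      minimal_cut_sets = []
      for size in range(1, len(events) + 1):
          for combo in combinations(events, size):
              s = set(combo)
              if is_cut_set(s) and not any(c < s for c in minimal_cut_sets):
                  minimal_cut_sets.append(s)

      print(minimal_cut_sets)   # e.g. [{'power'}, {'pump_A', 'valve'}, {'pump_B', 'valve'}]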

  20. Consumption of fa cai Nostoc soup: a potential for BMAA exposure from Nostoc cyanobacteria in China?

    Science.gov (United States)

    Roney, Britton R; Renhui, Li; Banack, Sandra Anne; Murch, Susan; Honegger, Rosmarie; Cox, Paul Alan

    2009-01-01

    Grown in arid regions of western China, the cyanobacterium Nostoc flagelliforme--called fa cai in Mandarin and fat choy in Cantonese--is wild-harvested and used to make soup consumed during New Year's celebrations. High prices, up to $125 USD/kg, led to overharvesting in Inner Mongolia, Ningxia, Gansu, Qinghai, and Xinjiang. Degradation of arid ecosystems, desertification, and conflicts between Nostoc harvesters and Mongol herdsmen concerned the Chinese environmental authorities, leading to a government ban of Nostoc commerce. This ban stimulated increased marketing of a substitute made from starch. We analysed samples purchased throughout China as well as in Chinese markets in the United States and the United Kingdom. Some were counterfeits consisting of dyed starch noodles. A few samples from California contained Nostoc flagelliforme but were adulterated with starch noodles. Other samples, including those from the United Kingdom, consisted of pure Nostoc flagelliforme. A recent survey of markets in Cheng Du showed no real Nostoc flagelliforme to be marketed. Real and artificial fa cai differ in the presence of beta-N-methylamino-L-alanine (BMAA). Given its status as a high-priced luxury food, the government ban on collection and marketing, and the replacement of real fa cai with starch substitutes consumed only on special occasions, it is anticipated that dietary exposure to BMAA from fa cai will be reduced in the future in China.

  1. CAIS/ACSI 2001: Beyond the Web: Technologies, Knowledge and People.

    Science.gov (United States)

    Canadian Journal of Information and Library Science, 2000

    2000-01-01

    Presents abstracts of papers presented at the 29th Annual Conference of the Canadian Association for Information Science (CAIS) held in Quebec on May 27-29, 2001. Topics include: professional development; librarian/library roles; information technology uses; virtual libraries; information seeking behavior; literacy; information retrieval;…

  2. Calcium-aluminum-rich inclusions with fractionation and unknown nuclear effects (FUN CAIs)

    DEFF Research Database (Denmark)

    Krot, Alexander N.; Nagashima, Kazuhide; Wasserburg, Gerald J.

    2014-01-01

    We present a detailed characterization of the mineralogy, petrology, and oxygen isotopic compositions of twelve FUN CAIs, including C1 and EK1-4-1 from Allende (CV), that were previously shown to have large isotopic fractionation patterns for magnesium and oxygen, and large isotopic anomalies...

  3. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  4. An ion microprobe study of CAIs from CO3 meteorites. [Abstract only

    Science.gov (United States)

    Russell, S. S.; Greenwood, R. C.; Fahey, A. J.; Huss, G. R.; Wasserburg, G. J.

    1994-01-01

    When attempting to interpret the history of Ca, Al-rich inclusions (CAIs) it is often difficult to distinguish between primary features inherited from the nebula and those produced during secondary processing on the parent body. We have undertaken a systematic study of CAIs from 10 CO chondrites, believed to represent a metamorphic sequence with the goal of distinguishing primary and secondary features. ALHA 77307 (3.0), Colony (3.0), Kainsaz (3.1), Felix (3.2), ALH 82101 (3.3), Ornans (3.3), Lance (3.4), ALHA 77003 (3.5), Warrenton (3.6), and Isna (3.7) were examined by Scanning Electron Microscopy (SEM) and optical microscopy. We have identified 141 CAIs within these samples, and studied in detail the petrology of 34 inclusions. The primary phases in the lower petrologic types are spinel, melilite, and hibonite. Perovskite, FeS, ilmenite, anorthite, kirschsteinite, and metallic Fe are present as minor phases. Melilite becomes less abundant in higher petrologic types and was not detected in chondrites of type 3.5 and above, confirming previous reports that this mineral easily breaks down during heating. Iron, an element that would not be expected to condense at high temperatures, has a lower abundance in spinel from low-petrologic-type meteorites than those of higher grade, and CaTiO3 is replaced by FeTiO3 in meteorites of higher petrologic type. The abundance of CAIs is similar in each meteorite. Eight inclusions have been analyzed by ion probe. The results are summarized. The results obtained to date show that CAIs in CO meteorites, like those from other meteorite classes, contain Mg* and that Mg in some inclusions has been redistributed.

  5. Analysis on the security of cloud computing

    Science.gov (United States)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology, which is the fusion of computer technology and Internet development; it will lead a revolution in IT and the information field. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy, resulting in safety problems; this is the difficult point in improving the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspective of cloud computing users and service providers.

  6. Incremental ALARA cost/benefit computer analysis

    International Nuclear Information System (INIS)

    Hamby, P.

    1987-01-01

    Commonwealth Edison Company has developed and is testing an enhanced Fortran Computer Program to be used for cost/benefit analysis of Radiation Reduction Projects at its six nuclear power facilities and Corporate Technical Support Groups. This paper describes a Macro-Driven IBM Mainframe Program comprised of two different types of analyses-an Abbreviated Program with fixed costs and base values, and an extended Engineering Version for a detailed, more thorough and time-consuming approach. The extended engineering version breaks radiation exposure costs down into two components-Health-Related Costs and Replacement Labor Costs. According to user input, the program automatically adjusts these two cost components and applies the derivation to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment cost. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations

  7. Computing in Qualitative Analysis: A Healthy Development?

    Science.gov (United States)

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  8. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, compared to traditional hosting, the flexibility of cloud computing comes at the cost of less predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
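
    As a minimal sketch of the two-part idea described above (a demand distribution plus a constrained resource pool), the following Python example uses the SimPy discrete event simulation library with assumed arrival and service distributions; the request types, parameters and metrics of the original model are not reproduced here.

      # Sketch only: exponential arrivals contend for a fixed pool of virtual
      # servers; the mean queueing delay is reported as a simple effectiveness
      # metric. All parameter values are assumptions.
      import random
      import simpy

      SIM_TIME, N_SERVERS = 1000.0, 4
      ARRIVAL_MEAN, SERVICE_MEAN = 1.0, 3.5     # time units between/for requests

      waits = []

      def request(env, servers):
          arrived = env.now
          with servers.request() as req:
              yield req                         # wait for a free virtual server
              waits.append(env.now - arrived)
              yield env.timeout(random.expovariate(1.0 / SERVICE_MEAN))

      def generator(env, servers):
          while True:
              yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
              env.process(request(env, servers))

      random.seed(42)
      env = simpy.Environment()
      servers = simpy.Resource(env, capacity=N_SERVERS)
      env.process(generator(env, servers))
      env.run(until=SIM_TIME)

      print(f"requests served: {len(waits)}, mean wait: {sum(waits) / len(waits):.2f}")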

  9. Secondary School Students' Attitudes towards Mathematics Computer--Assisted Instruction Environment in Kenya

    Science.gov (United States)

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.

    2012-01-01

    This paper reports the results of research conducted in six classes (Form IV) with 205 students with a sample of 94 respondents. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) effectiveness of CAI in Mathematics instruction. The results indicated…

  10. Computer-Assisted Instruction: A Case Study of Two Charter Schools

    Science.gov (United States)

    Keengwe, Jared; Hussein, Farhan

    2013-01-01

    The purpose of this study was to examine the relationship in achievement gap between English language learners (ELLs) utilizing computer-assisted instruction (CAI) in the classroom, and ELLs relying solely on traditional classroom instruction. The study findings showed that students using CAI to supplement traditional lectures performed better…

  11. A Comparison of Computer-Assisted Instruction and Tutorials in Hematology and Oncology.

    Science.gov (United States)

    Garrett, T. J.; And Others

    1987-01-01

    A study comparing the effectiveness of computer-assisted instruction (CAI) and small group instruction found no significant difference in medical student achievement in oncology but higher achievement through small-group instruction in hematology. Students did not view CAI as more effective, but saw it as a supplement to traditional methods. (MSE)

  12. An Evaluation of Computer-Aided Instruction in an Introductory Biostatistics Course.

    Science.gov (United States)

    Forsythe, Alan B.; Freed, James R.

    1979-01-01

    Evaluates the effectiveness of computer assisted instruction for teaching biostatistics to first year students at the UCLA School of Dentistry. Results do not demonstrate the superiority of CAI but do suggest that CAI compares favorably to conventional lecture and programed instruction methods. (RAO)

  13. Computer-Assisted Instruction to Teach DOS Commands: A Pilot Study.

    Science.gov (United States)

    McWeeney, Mark G.

    1992-01-01

    Describes a computer-assisted instruction (CAI) program used to teach DOS commands. Pretest and posttest results for 65 graduate students using the program are reported, and it is concluded that the CAI program significantly aided the students. Sample screen displays for the program and several questions from the pre/posttest are included. (nine…

  14. Can cloud computing benefit health services? - a SWOT analysis.

    Science.gov (United States)

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare.

  15. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed by each code. The results obtained from the logic/probabilistic analyses, as well as the computation times, are compared

  16. Northwest Africa 10758: A New CV3 Chondrite Bearing a Giant CAI with Hibonite-Rich Wark-Lovering Rim

    Science.gov (United States)

    Ross, D. K.; Simon, J. I.; Zolensky, M.

    2017-01-01

    Northwest Africa (NWA) 10758 is a newly identified carbonaceous chondrite that is a Bali-like oxidized CV3. The large Ca-Al rich inclusion (CAI) in this sample is approx. 2.4 x 1.4 cm. The CAI is transitional in composition between type A and type B, with interior mineralogy dominated by melilite, plus less abundant spinel and Al-Ti rich diopside, and only very minor anorthite (Fig. 1A). This CAI is largely free of secondary alteration in the exposed section we examined, with almost no nepheline, sodalite or Ca-Fe silicates. The Wark-Lovering (WL) rim on this CAI is dominated by hibonite, with lower abundances of spinel and perovskite, and with hibonite locally overlain by melilite plus perovskite (as in Fig. 1B). Note that the example shown in 1B is exceptional. Around most of the CAI, hibonite + spinel + perovskite form the WL rim, without overlying melilite. The WL rim can be unusually thick, ranging from approx.20 microns up to approx. 150 microns. A well-developed, stratified accretionary rim infills embayments of the CAI, and thins over protuberances in the convoluted CAI surface.

  18. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Science.gov (United States)

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for Rosensweig's model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from Rosensweig's model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a

  19. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  20. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and its applications in handling real life problems. The applications are mostly undertaken from real life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  1. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    2008-06-01

    Full Text Available A prominent feature of Parkinson's disease (PD is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment in PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.

  2. Computational Analysis of Pharmacokinetic Behavior of Ampicillin

    Directory of Open Access Journals (Sweden)

    Mária Ďurišová

    2016-07-01

    Full Text Available The objective of this study was to perform a computational analysis of the pharmacokinetic behavior of ampicillin, using data from the literature. A method based on the theory of dynamic systems was used for modeling purposes. The method used has been introduced to pharmacokinetics with the aim to contribute to the knowledge base in pharmacokinetics by including a modeling method which enables researchers to develop mathematical models of various pharmacokinetic processes in an identical way, using identical model structures. A few examples of a successful use of the modeling method considered here in pharmacokinetics can be found in full-text articles available free of charge at the website of the author, and in the example given in this study. The modeling method employed in this study can be used to develop a mathematical model of the pharmacokinetic behavior of any drug, under the condition that the pharmacokinetic behavior of the drug under study can be at least partially approximated using linear models.
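
    The dynamic-systems method itself is not reproduced here; purely as an illustration of the kind of linear model the last sentence refers to, the following Python sketch evaluates a textbook one-compartment model with first-order absorption and elimination, with all parameter values assumed rather than taken from the study.

      # Illustrative sketch, not the paper's method: one-compartment model with
      # first-order absorption (ka) and elimination (ke); dose, volume of
      # distribution V and the rate constants are assumed values.
      import numpy as np

      dose, V = 500.0, 20.0        # mg, litres
      ka, ke = 1.2, 0.6            # 1/h

      t = np.linspace(0.0, 12.0, 121)      # hours
      # Analytic solution of dA_gut/dt = -ka*A_gut, dA_central/dt = ka*A_gut - ke*A_central
      conc = (dose / V) * ka / (ka - ke) * (np.exp(-ke * t) - np.exp(-ka * t))

      print(f"Cmax = {conc.max():.2f} mg/L at t = {t[np.argmax(conc)]:.1f} h")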

  3. Multiple Nebular Gas Reservoirs Recorded by Oxygen Isotope Variation in a Spinel-rich CAI in CO3 MIL 090019

    Science.gov (United States)

    Simon, J. I.; Simon, S. B.; Nguyen, A. N.; Ross, D. K.; Messenger, S.

    2017-01-01

    We conducted NanoSIMS O-isotopic imaging of a primitive spinel-rich CAI spherule (27-2) from the MIL 090019 CO3 chondrite. Inclusions such as 27-2 are proposed to record inner nebula processes during an epoch of rapid solar nebula evolution. Mineralogical and textural analyses suggest that this CAI formed by high temperature reactions, partial melting, and condensation. This CAI exhibits radial O-isotopic heterogeneity among multiple occurrences of the same mineral, reflecting interactions with distinct nebular O-isotopic reservoirs.

  4. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  5. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    Science.gov (United States)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of a breast lesion. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on benign biopsy reduction. The study has used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on the masses with lower levels of suspicion, rather than increasing the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy the best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) is developed by Almen Laboratories and was used to achieve the results.
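
    As a generic illustration of the kind of correlation analysis mentioned above (not the study's actual data or methods), the following Python sketch uses synthetic data to compare two hypothetical candidate scoring methods against reference LOS ratings by rank correlation.

      # Synthetic data only: compare hypothetical automated LOS scoring methods
      # against reference ratings using Spearman rank correlation.
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(0)
      reference = rng.integers(0, 6, size=50).astype(float)   # reference LOS, 0-5

      candidates = {
          "method_A (low noise)":  reference + rng.normal(0.0, 0.5, 50),
          "method_B (high noise)": reference + rng.normal(0.0, 1.5, 50),
      }

      for name, scores in candidates.items():
          rho, p = spearmanr(reference, scores)
          print(f"{name:22s} rho = {rho:.2f} (p = {p:.1e})")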

  6. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  7. Architectural analysis for wirelessly powered computing platforms

    NARCIS (Netherlands)

    Kapoor, A.; Pineda de Gyvez, J.

    2013-01-01

    We present a design framework for wirelessly powered generic computing platforms that takes into account various system parameters in response to a time-varying energy source. These parameters are the charging profile of the energy source, computing speed (fclk), digital supply voltage (VDD), energy

  8. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous as they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As only a few researchers in that field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  9. Computer vision syndrome (CVS) - Thermographic Analysis

    Science.gov (United States)

    Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.

    2017-01-01

    The use of computers has grown exponentially in the last decades; the possibility of carrying out several tasks for both professional and leisure purposes has contributed to their wide acceptance by users. The consequences and impact on visual health of uninterrupted tasks in front of computer screens or displays have grabbed researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to great efforts, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of them are: blurred vision, visual fatigue and Dry Eye Syndrome (DES) due to inadequate lubrication of the ocular surface when blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies on healthy human eyes during exposure to displays of computers, with the main purpose of comparing the existing differences in temperature variations of healthy ocular surfaces.

  10. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to make their lives more convenient, but at the same time there are many network information security problems that deserve attention. This paper analyzes the information security of computer networks based on "big data" analysis, and puts forward some solutions.

  11. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  12. Bird community structure in riparian environments in Cai River, Rio Grande do Sul, Brazil

    OpenAIRE

    Jaqueline Brummelhaus; Marcia Suelí Bohn; Maria Virginia Petry

    2012-01-01

    Urbanization produces changes in riparian environments, causing effects in the structure of bird communities, which present different responses to the impacts. We compare species richness, abundance, and composition of birds in riparian environments with different characteristics in Cai River, Rio Grande do Sul, Brazil. We carried out observations in woodland, grassland, and urban environments, between September 2007 and August 2008. We listed 130 bird species, 29 species unique to woodland e...

  13. Computer science: Data analysis meets quantum physics

    Science.gov (United States)

    Schramm, Steven

    2017-10-01

    A technique that combines machine learning and quantum computing has been used to identify the particles known as Higgs bosons. The method could find applications in many areas of science. See Letter p.375

  14. Numerical investigation of CAI Combustion in the Opposed- Piston Engine with Direct and Indirect Water Injection

    Science.gov (United States)

    Pyszczek, R.; Mazuro, P.; Teodorczyk, A.

    2016-09-01

    This paper is focused on the CAI combustion control in a turbocharged 2-stroke Opposed-Piston (OP) engine. The barrel type OP engine arrangement is of particular interest for the authors because of its robust design, high mechanical efficiency and relatively easy incorporation of a Variable Compression Ratio (VCR). The other advantage of such design is that combustion chamber is formed between two moving pistons - there is no additional cylinder head to be cooled which directly results in an increased thermal efficiency. Furthermore, engine operation in a Controlled Auto-Ignition (CAI) mode at high compression ratios (CR) raises a possibility of reaching even higher efficiencies and very low emissions. In order to control CAI combustion such measures as VCR and water injection were considered for indirect ignition timing control. Numerical simulations of the scavenging and combustion processes were performed with the 3D CFD multipurpose AVL Fire solver. Numerous cases were calculated with different engine compression ratios and different amounts of directly and indirectly injected water. The influence of the VCR and water injection on the ignition timing and engine performance was determined and their application in the real engine was discussed.

  15. Analysis On Security Of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Muhammad Zunnurain Hussain

    2017-01-01

    Full Text Available In this paper the author discusses the security issues and challenges faced by the industry in securing cloud computing and how these problems can be tackled. Cloud computing is a modern technique of sharing resources, such as data and files, without launching one's own infrastructure, using third-party resources instead to avoid huge investment. It is very challenging these days to secure the communication between two users, even though people use different encryption techniques.

  16. Schottky signal analysis: tune and chromaticity computation

    CERN Document Server

    Chanon, Ondine

    2016-01-01

    Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, after explaining the different components of a Schottky signal, an algorithm to compute the betatron tune is presented, followed by some ideas to compute machine chromaticity. The tests have been performed with offline and/or online LHC data.
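
    The report's Schottky-based algorithm is not reproduced here; as a loosely related illustration of tune estimation in general, the following Python sketch recovers a fractional betatron tune from simulated turn-by-turn position data by locating the dominant peak of an FFT spectrum, with the signal and tune value invented for the example.

      # Generic sketch, not the report's algorithm: estimate a fractional tune
      # as the dominant non-DC peak of the spectrum of simulated turn-by-turn data.
      import numpy as np

      n_turns, true_tune = 4096, 0.31                  # assumed fractional tune
      turns = np.arange(n_turns)
      rng = np.random.default_rng(7)
      signal = np.cos(2 * np.pi * true_tune * turns) + 0.2 * rng.standard_normal(n_turns)

      spectrum = np.abs(np.fft.rfft(signal * np.hanning(n_turns)))
      freqs = np.fft.rfftfreq(n_turns, d=1.0)          # units of 1/turn, 0 to 0.5

      estimate = freqs[np.argmax(spectrum[1:]) + 1]    # skip the DC bin
      print(f"estimated fractional tune: {estimate:.4f}")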

  17. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    Usage of the computer code MLCOSP (Multiple Correlation and Spectrum), developed for a hybrid computer installed in JAERI, is described. Functions of the hybrid computer and its terminal devices are utilized ingeniously in the code to reduce the complexity of the data handling which occurs in analysis of multivariable experimental data and to perform the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting with a data recorder. The computed results are displayed in figures, and hardcopies are taken when necessary. Series-messages to the code are shown on the terminal, so man-machine communication is possible. Furthermore, data can be put in through a keyboard, so case studies according to the results of analysis are possible. (auth.)

  18. The Impact of Computer Assisted Instruction As It Relates to Learning Disabled Adults in California Community Colleges.

    Science.gov (United States)

    Brower, Mary Jo

    A study was conducted to determine the advantages and disadvantages of using computer-assisted instruction (CAI) with learning disabled (LD) adults attending California community colleges. A questionnaire survey of the directors of the LD programs solicited information on the availability of CAI for LD adults, methods of course advertisement,…

  19. Computer-Assisted Linguistic Analysis of the Peshitta

    NARCIS (Netherlands)

    Roorda, D.; Talstra, Eep; Dyk, Janet; van Keulen, Percy; Sikkel, Constantijn; Bosman, H.J.; Jenner, K.D.; Bakker, Dirk; Volkmer, J.A.; Gutman, Ariel; van Peursen, Wido Th.

    2014-01-01

    CALAP (Computer-Assisted Linguistic Analysis of the Peshitta), a joint research project of the Peshitta Institute Leiden and the Werkgroep Informatica at the Vrije Universiteit Amsterdam (1999-2005). CALAP concerned the computer-assisted analysis of the Peshitta to Kings (Janet Dyk and Percy van

  20. Run 2 analysis computing for CDF and D0

    International Nuclear Information System (INIS)

    Fuess, S.

    1995-11-01

    Two large experiments at the Fermilab Tevatron collider will use upgraded detectors for the upcoming period of running. The associated analysis software is also expected to change, both to account for higher data rates and to embrace new computing paradigms. A discussion is given of the problems facing current and future High Energy Physics (HEP) analysis computing, and several issues are explored in detail

  1. Frequency modulation television analysis: Threshold impulse analysis. [with computer program

    Science.gov (United States)

    Hodge, W. H.

    1973-01-01

    A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.

  2. Computational Analysis of SAXS Data Acquisition.

    Science.gov (United States)

    Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S

    2015-09-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals.
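
    The paper's recursive forward model (built on integrals of spherical Bessel functions) is not sketched here; the following Python example only illustrates the definition quoted above, estimating the pair distribution function as a histogram of distances between points sampled from an assumed uniform spherical density.

      # Sketch of the definition only: pair distribution function of a uniform
      # ball of radius R, estimated from sampled points. The density choice and
      # sample size are assumptions for illustration.
      import numpy as np
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(1)
      R, n = 1.0, 2000

      # Rejection-sample n points uniformly inside a ball of radius R.
      pts = rng.uniform(-R, R, size=(4 * n, 3))
      pts = pts[np.linalg.norm(pts, axis=1) <= R][:n]

      distances = pdist(pts)                    # all pairwise distances
      hist, edges = np.histogram(distances, bins=100, range=(0.0, 2 * R), density=True)

      # For a uniform ball the pair distance distribution peaks near r = 1.05*R.
      print("mode of p(r) at r =", round(edges[np.argmax(hist)], 2))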

  3. Interactive Computer Lessons for Introductory Economics: Guided Inquiry-From Supply and Demand to Women in the Economy.

    Science.gov (United States)

    Miller, John; Weil, Gordon

    1986-01-01

    The interactive feature of computers is used to incorporate a guided inquiry method of learning introductory economics, extending the Computer Assisted Instruction (CAI) method beyond drills. (Author/JDH)

  4. Computational and Physical Analysis of Catalytic Compounds

    Science.gov (United States)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important in order to exploit their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as Gamess and Chemcraft have been used in an effort to compute the efficiencies of the catalytic compounds and the bonding energy changes during the optimization convergence. The result illustrates how the metal oxides stabilize and the steps that this takes. The plot of energy (kcal/mol) versus computation step (N) shows that the energy of the titania converges by the 7th iteration, whereas the silica converges by the 9th iteration.

  5. Classification and Analysis of Computer Network Traffic

    DEFF Research Database (Denmark)

    Bujlow, Tomasz

    2014-01-01

    various classification modes (decision trees, rulesets, boosting, softening thresholds) regarding the classification accuracy and the time required to create the classifier. We showed how to use our VBS tool to obtain per-flow, per-application, and per-content statistics of traffic in computer networks...

  6. Computer programs simplify optical system analysis

    Science.gov (United States)

    1965-01-01

    The optical ray-trace computer program performs geometrical ray tracing. The energy-trace program calculates the relative monochromatic flux density on a specific target area. This program uses the ray-trace program as a subroutine to generate a representation of the optical system.

  7. Analysis of airways in computed tomography

    DEFF Research Database (Denmark)

    Petersen, Jens

    Chronic Obstructive Pulmonary Disease (COPD) is a major cause of death and disability world-wide. It affects lung function through destruction of lung tissue, known as emphysema, and inflammation of the airways, leading to thickened airway walls and a narrowed airway lumen. Computed Tomography (CT) imaging...

  8. Affect and Learning : a computational analysis

    NARCIS (Netherlands)

    Broekens, Douwe Joost

    2007-01-01

    In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation

  9. Adapting computational text analysis to social science (and vice versa

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Full Text Available Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models to train data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.
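
    As a toy contrast of the two working styles described above (purely illustrative; the corpus, labels and model choices are invented), the following Python sketch runs an unsupervised topic model for exploration and a supervised classifier for prediction on the same tiny document set.

      # Toy illustration with invented data: unsupervised exploration (LDA topics)
      # versus supervised prediction (Naive Bayes) over the same document-term matrix.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.decomposition import LatentDirichletAllocation
      from sklearn.naive_bayes import MultinomialNB

      docs = [
          "tax policy vote election senate",
          "vote campaign election policy debate",
          "team goal match season coach",
          "coach player season match win",
      ]
      labels = ["politics", "politics", "sports", "sports"]

      vec = CountVectorizer()
      X = vec.fit_transform(docs)

      # Unsupervised: which word clusters (themes) emerge without labels?
      lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
      terms = vec.get_feature_names_out()
      for k, topic in enumerate(lda.components_):
          print(f"topic {k}:", [terms[i] for i in topic.argsort()[-3:]])

      # Supervised: predict a known label for a new document.
      clf = MultinomialNB().fit(X, labels)
      print(clf.predict(vec.transform(["election debate vote"])))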

  10. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    Newman, M.J.

    1978-08-01

    The development of a general purpose computer system, THESEUS, is described; its initial use has been for magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  11. Interface between computational fluid dynamics (CFD) and plant analysis computer codes

    International Nuclear Information System (INIS)

    Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.

    1993-01-01

    Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will directly contribute to modeling and accuracy improvements throughout the plant system and should result in significant reduction of design conservatisms that have been applied to such analyses in the past

  12. Computational analysis of ozonation in bubble columns

    International Nuclear Information System (INIS)

    Quinones-Bolanos, E.; Zhou, H.; Otten, L.

    2002-01-01

    This paper presents a new computational ozonation model based on the principle of computational fluid dynamics along with the kinetics of ozone decay and microbial inactivation to predict the performance of ozone disinfection in fine bubble columns. The model can be represented using a mixture two-phase flow model to simulate the hydrodynamics of the water flow and using two transport equations to track the concentration profiles of ozone and microorganisms along the height of the column, respectively. The applicability of this model was then demonstrated by comparing the simulated ozone concentrations with experimental measurements obtained from a pilot scale fine bubble column. One distinct advantage of this approach is that it does not require the prerequisite assumptions such as plug flow condition, perfect mixing, tanks-in-series, uniform radial or longitudinal dispersion in predicting the performance of disinfection contactors without carrying out expensive and tedious tracer studies. (author)
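
    As a hedged illustration only (not the authors' exact model), transport equations of the kind described often take a one-dimensional advection-reaction form along the column height z, with ozone decaying and microorganisms inactivated by Chick-Watson-type kinetics; all symbols below are generic placeholders, not quantities defined in the paper:

        u \frac{dC_{O_3}}{dz} = k_L a \, (C^{*}_{O_3} - C_{O_3}) - k_d \, C_{O_3},
        \qquad
        u \frac{dN}{dz} = -k_i \, C_{O_3} \, N

    where u is the water velocity, k_L a the gas-liquid mass-transfer coefficient, C*_{O3} the saturation ozone concentration, k_d the ozone decay constant, k_i the inactivation rate constant, and N the microorganism concentration.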

  13. OXYGEN ISOTOPIC COMPOSITIONS OF THE ALLENDE TYPE C CAIs: EVIDENCE FOR ISOTOPIC EXCHANGE DURING NEBULAR MELTING AND ASTEROIDAL THERMAL METAMORPHISM

    Energy Technology Data Exchange (ETDEWEB)

    Krot, A N; Chaussidon, M; Yurimoto, H; Sakamoto, N; Nagashima, K; Hutcheon, I D; MacPherson, G J

    2008-02-21

    Based on the mineralogy and petrography, coarse-grained, igneous, anorthite-rich (Type C) calcium-aluminum-rich inclusions (CAIs) in the CV3 carbonaceous chondrite Allende have been recently divided into three groups: (i) CAIs with melilite and Al,Ti-diopside of massive and lacy textures (coarse grains with numerous rounded inclusions of anorthite) in a fine-grained anorthite groundmass (6-1-72, 100, 160), (ii) CAI CG5 with massive melilite, Al,Ti-diopside and anorthite, and (iii) CAIs associated with chondrule material: either containing chondrule fragments in their peripheries (ABC, TS26) or surrounded by chondrule-like, igneous rims (93) (Krot et al., 2007a,b). Here, we report in situ oxygen isotopic measurements of primary (melilite, spinel, Al,Ti-diopside, anorthite) and secondary (grossular, monticellite, forsterite) minerals in these CAIs. Spinel (Δ¹⁷O = -25‰ to -20‰), massive and lacy Al,Ti-diopside (Δ¹⁷O = -20‰ to -5‰) and fine-grained anorthite (Δ¹⁷O = -15‰ to -2‰) in 100, 160 and 6-1-72 are ¹⁶O-enriched relative to spinel and coarse-grained Al,Ti-diopside and anorthite in ABC, 93 and TS26 (Δ¹⁷O ranges from -20‰ to -15‰, from -15‰ to -5‰, and from -5‰ to 0‰, respectively). In 6-1-72, massive and lacy Al,Ti-diopside grains are ¹⁶O-depleted (Δ¹⁷O ~ -13‰) relative to spinel (Δ¹⁷O = -23‰). Melilite is the most ¹⁶O-depleted mineral in all Allende Type C CAIs. In CAI 100, melilite and secondary grossular, monticellite and forsterite (minerals replacing melilite) are similarly ¹⁶O-depleted, whereas grossular in CAI 160 is ¹⁶O-enriched (Δ¹⁷O = -10‰ to -6‰) relative to melilite (Δ¹⁷O = -5‰ to -3‰). We infer
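
    For readers unfamiliar with the notation, Δ¹⁷O is conventionally defined as the deviation of δ¹⁷O from the terrestrial fractionation line (the slope factor is commonly taken near 0.52, though the exact value varies slightly between studies):

        \Delta^{17}\mathrm{O} = \delta^{17}\mathrm{O} - 0.52 \times \delta^{18}\mathrm{O}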

  14. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  15. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for this purpose. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  16. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for this purpose. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  17. Accident sequence analysis of human-computer interface design

    International Nuclear Information System (INIS)

    Fan, C.-F.; Chen, W.-H.

    2000-01-01

    It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points

  18. The Cognitive Assessment Interview (CAI): development and validation of an empirically derived, brief interview-based measure of cognition.

    Science.gov (United States)

    Ventura, Joseph; Reise, Steven P; Keefe, Richard S E; Baade, Lyle E; Gold, James M; Green, Michael F; Kern, Robert S; Mesholam-Gately, Raquelle; Nuechterlein, Keith H; Seidman, Larry J; Bilder, Robert M

    2010-08-01

    Practical, reliable "real world" measures of cognition are needed to supplement neurocognitive performance data to evaluate possible efficacy of new drugs targeting cognitive deficits associated with schizophrenia. Because interview-based measures of cognition offer one possible approach, data from the MATRICS initiative (n=176) were used to examine the psychometric properties of the Schizophrenia Cognition Rating Scale (SCoRS) and the Clinical Global Impression of Cognition in Schizophrenia (CGI-CogS). We used classical test theory methods and item response theory to derive the 10-item Cognitive Assessment Interview (CAI) from the SCoRS and CGI-CogS ("parent instruments"). Sources of information for CAI ratings included the patient and an informant. Validity analyses examined the relationship between the CAI and objective measures of cognitive functioning, intermediate measures of cognition, and functional outcome. The rater's score from the newly derived CAI (10 items) correlate highly (r=.87) with those from the combined set of the SCoRS and CGI-CogS (41 items). Both the patient (r=.82) and the informant (r=.95) data were highly correlated with the rater's score. The CAI was modestly correlated with objectively measured neurocognition (r=-.32), functional capacity (r=-.44), and functional outcome (r=-.32), which was comparable to the parent instruments. The CAI allows for expert judgment in evaluating a patient's cognitive functioning and was modestly correlated with neurocognitive functioning, functional capacity, and functional outcome. The CAI is a brief, repeatable, and potentially valuable tool for rating cognition in schizophrenia patients who are participating in clinical trials. Copyright 2010 Elsevier B.V. All rights reserved.

  19. Computer aided information system for a PWR

    International Nuclear Information System (INIS)

    Vaidian, T.A.; Karmakar, G.; Rajagopal, R.; Shankar, V.; Patil, R.K.

    1994-01-01

    The computer aided information system (CAIS) is designed with a view to improving the performance of the operator. CAIS assists the plant operator in an advisory and support role, thereby reducing the workload level and potential human errors. The CAIS as explained here has been designed for a PWR of type KLT-40 used in Floating Nuclear Power Stations (FNPS). However, the underlying philosophy evolved in designing the CAIS can be suitably adapted for other types of nuclear power plants too (BWR, PHWR). Operator information is divided into three broad categories: a) continuously available information, b) automatically available information and c) on-demand information. Two touch screens are provided on the main control panel. One is earmarked for continuously available information and the other is dedicated to automatically available information. Both screens can be used at the operator's discretion for on-demand information. The automatically available information screen overrides the on-demand information screens. In addition to the above, CAIS has the features of event sequence recording, disturbance recording and information documentation. The CAIS design ensures that the operator is not overburdened with excess and unnecessary information, but at the same time adequate and well formatted information is available. (author). 5 refs., 4 figs

  20. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
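
    As an illustration of why such resampling workloads parallelize so easily (a sketch under assumed toy data, not the authors' code), the permutation test below distributes independent label shuffles across local cores with multiprocessing; on a computer cluster or cloud instance the same independent tasks would simply be farmed out to more workers.

        # Minimal sketch: an embarrassingly parallel permutation test.
        import numpy as np
        from multiprocessing import Pool

        def perm_stat(args):
            """One permutation: shuffle the pooled sample and recompute the mean difference."""
            seed, x, y = args
            rng = np.random.default_rng(seed)
            pooled = np.concatenate([x, y])
            rng.shuffle(pooled)
            return pooled[:len(x)].mean() - pooled[len(x):].mean()

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            x = rng.normal(0.0, 1.0, 50)          # toy "expression" data, group 1
            y = rng.normal(0.4, 1.0, 50)          # toy "expression" data, group 2
            observed = x.mean() - y.mean()
            tasks = [(i, x, y) for i in range(10_000)]
            with Pool() as pool:                  # uses all local cores; a cluster or cloud
                null = pool.map(perm_stat, tasks) # scheduler would distribute this instead
            p_value = np.mean(np.abs(null) >= abs(observed))
            print(f"observed diff = {observed:.3f}, permutation p = {p_value:.4f}")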

  1. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing and policy planners with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the time period 2004-2013. We ranked authors in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their co-authors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
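
    A minimal sketch of the kind of co-authorship network construction described (not the authors' pipeline, and with made-up author lists): each paper contributes edges between all pairs of its authors, edge weights count co-authored papers, and authors are then ranked by weighted degree.

        # Minimal sketch: build and rank a co-authorship network with networkx.
        from itertools import combinations
        import networkx as nx

        papers = [                      # placeholder author lists, purely illustrative
            ["Kim", "Lee", "Park"],
            ["Kim", "Park"],
            ["Ahn", "Jung"],
            ["Lee", "Ahn", "Kim"],
        ]

        G = nx.Graph()
        for authors in papers:
            for a, b in combinations(sorted(set(authors)), 2):
                # accumulate the number of co-authored papers as the edge weight
                w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
                G.add_edge(a, b, weight=w + 1)

        # Rank authors by weighted degree (a crude stand-in for the paper-count rank).
        rank = sorted(G.degree(weight="weight"), key=lambda kv: kv[1], reverse=True)
        print(rank)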

  2. Isogeometric analysis : a calculus for computational mechanics

    NARCIS (Netherlands)

    Benson, D.J.; Borst, de R.; Hughes, T.J.R.; Scott, M.A.; Verhoosel, C.V.; Topping, B.H.V.; Adam, J.M.; Pallarés, F.J.; Bru, R.; Romero, M.L.

    2010-01-01

    The first paper on isogeometric analysis appeared only five years ago [1], and the first book appeared last year [2]. Progress has been rapid. Isogeometric analysis has been applied to a wide variety of problems in solids, fluids and fluid-structure interactions. Superior accuracy to traditional

  3. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  4. Isotopic analysis of plutonium by computer controlled mass spectrometry

    International Nuclear Information System (INIS)

    1974-01-01

    Isotopic analysis of plutonium chemically purified by ion exchange is achieved using a thermal ionization mass spectrometer. Data acquisition from and control of the instrument is done automatically with a dedicated system computer in real time with subsequent automatic data reduction and reporting. Separation of isotopes is achieved by varying the ion accelerating high voltage with accurate computer control

  5. Computer Programme for the Dynamic Analysis of Tall Regular ...

    African Journals Online (AJOL)

    The traditional method of dynamic analysis of tall rigid frames assumes the shear frame model. Models that allow joint rotations with/without the inclusion of the column axial loads give improved results but pose much more computational difficulty. In this work a computer program Natfrequency that determines the dynamic ...

  6. Computer training, present and future.

    Science.gov (United States)

    Smith, G. A.

    1972-01-01

    The products of educational firms today lead toward a multimedia approach to the education and training of commercial programmers and systems analysts. Computer-assisted instruction or CAI is a relatively new medium to augment the other media. The government use of computers is discussed together with the importance of computer pretests. Pretests can aid in determining a person's ability to absorb a particular instructional level. The material presented in a number of computer courses is listed.

  7. Computer use and carpal tunnel syndrome: A meta-analysis.

    Science.gov (United States)

    Shiri, Rahman; Falah-Hassani, Kobra

    2015-02-15

    Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, computer/typewriter use (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90), computer/typewriter use ≥1 vs. <1 h/day, and computer/typewriter use ≥4 vs. <4 h/day were not associated with CTS. In a meta-analysis of studies that used office workers as the comparison group, CTS was associated with computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and with years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for both types of studies. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse usage, might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study, are needed. Copyright © 2014 Elsevier B.V. All rights reserved.
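
    For readers who want to see the arithmetic behind a pooled OR like those quoted above, the sketch below applies DerSimonian-Laird random-effects pooling to a handful of made-up study odds ratios; the input values are placeholders, not the studies analyzed in this meta-analysis.

        # Minimal sketch: DerSimonian-Laird random-effects pooling of odds ratios.
        import math

        studies = [  # (OR, lower 95% CI, upper 95% CI) -- illustrative values only
            (1.2, 0.8, 1.8),
            (1.6, 1.0, 2.6),
            (0.9, 0.6, 1.4),
            (1.4, 0.9, 2.2),
        ]

        y = [math.log(or_) for or_, lo, hi in studies]                  # log odds ratios
        se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]
        w = [1 / s**2 for s in se]                                      # fixed-effect weights

        y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        Q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))       # heterogeneity statistic
        k = len(studies)
        tau2 = max(0.0, (Q - (k - 1)) / (sum(w) - sum(wi**2 for wi in w) / sum(w)))

        w_re = [1 / (s**2 + tau2) for s in se]                          # random-effects weights
        y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
        se_re = 1 / math.sqrt(sum(w_re))

        print("pooled OR = %.2f (95%% CI %.2f-%.2f)" %
              (math.exp(y_re), math.exp(y_re - 1.96 * se_re), math.exp(y_re + 1.96 * se_re)))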

  8. Tolerance analysis through computational imaging simulations

    Science.gov (United States)

    Birch, Gabriel C.; LaCasse, Charles F.; Stubbs, Jaclynn J.; Dagel, Amber L.; Bradley, Jon

    2017-11-01

    The modeling and simulation of non-traditional imaging systems require holistic consideration of the end-to-end system. We demonstrate this approach through a tolerance analysis of a random scattering lensless imaging system.

  9. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  10. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    International Nuclear Information System (INIS)

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project

  11. Complete genome sequence of Defluviimonas alba cai42T, a microbial exopolysaccharides producer.

    Science.gov (United States)

    Zhao, Jie-Yu; Geng, Shuang; Xu, Lian; Hu, Bing; Sun, Ji-Quan; Nie, Yong; Tang, Yue-Qin; Wu, Xiao-Lei

    2016-12-10

    Defluviimonas alba cai42T, isolated from the oil-production water in Xinjiang Oilfield in China, has a strong ability to produce exopolysaccharides (EPS). We hereby present its complete genome sequence, which consists of a circular chromosome and three plasmids. The strain characteristically contains various genes encoding enzymes involved in EPS biosynthesis, modification, and export. According to the genomic and physiochemical data, it is predicted that the strain has the potential to be utilized in industrial production of microbial EPS. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. On native Danish learners' challenges in distinguishing /tai/, /cai/ and /zai/

    DEFF Research Database (Denmark)

    Sloos, Marjoleine; Zhang, Chun

    2015-01-01

    University participated in an ABX experiment. They were auditorily presented pairs of the critical stimuli tai-cai-zai, te-ce-ze and tuo-cuo-zuo combined with all four tones and alternated with fillers. The subjects indicated for each pair which of the two words matched the pinyin description. The expected...... results show that beginner learners perform on chance level regarding the distinction between t and z and between c and z. The reason is that in Danish, which has an aspiration contrast between plosives (like Chinese) /th/ is variably pronounced as affricated /ts/ and many speakers are unaware...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  14. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  15. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  16. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  17. Computer System Analysis for Decommissioning Management of Nuclear Reactor

    International Nuclear Information System (INIS)

    Nurokhim; Sumarbagiono

    2008-01-01

    Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully. A computer-based system needs to be developed to support nuclear reactor decommissioning. Some computer systems have been studied for the management of nuclear power reactors. The software systems COSMARD and DEXUS, developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, calculation modules for the stages of the decommissioning phases, and spatial data system development for virtual reality. (author)

  18. System Matrix Analysis for Computed Tomography Imaging

    Science.gov (United States)

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482

  19. Computational analysis of sequence selection mechanisms.

    Science.gov (United States)

    Meyerguz, Leonid; Grasso, Catherine; Kleinberg, Jon; Elber, Ron

    2004-04-01

    Mechanisms leading to gene variations are responsible for the diversity of species and are important components of the theory of evolution. One constraint on gene evolution is that of protein foldability; the three-dimensional shapes of proteins must be thermodynamically stable. We explore the impact of this constraint and calculate properties of foldable sequences using 3660 structures from the Protein Data Bank. We seek a selection function that receives sequences as input, and outputs survival probability based on sequence fitness to structure. We compute the number of sequences that match a particular protein structure with energy lower than the native sequence, the density of the number of sequences, the entropy, and the "selection" temperature. The mechanism of structure selection for sequences longer than 200 amino acids is approximately universal. For shorter sequences, it is not. We speculate on concrete evolutionary mechanisms that show this behavior.

  20. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  1. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two personal computers were successfully networked together to form a small scale cluster. Each of the processors involved is a multicore processor with four cores, giving the cluster eight processing cores in total. The cluster runs an Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem, using a simple MPI Hello program written in the C language. Additionally, a performance test was done to show that the calculation performance of the cluster is much better than that of a single-CPU computer. In this performance test, the same code was run using a single node, 2 processors, 4 processors, and 8 processors. The results show that with additional processors, the time required to solve the problem decreases; the calculation time is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small scale cluster computer using common hardware, capable of higher computing power than a single-CPU computer, which can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics analysis.
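
    As an illustration of the two tests described (a hedged sketch, not the study's code, and written in Python with mpi4py rather than the C program mentioned), the script below prints a per-rank greeting as a communication check and then splits a fixed amount of numerical work across the ranks as a toy performance test.

        # Minimal sketch: MPI "hello" plus a toy timing test with mpi4py.
        # Run with, e.g.:  mpiexec -n 8 python cluster_check.py   (file name is hypothetical)
        from mpi4py import MPI
        import time

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        print(f"Hello from rank {rank} of {size}")          # communication test

        # Toy performance test: divide a fixed amount of work across the ranks.
        N = 10_000_000
        start, stop = rank * N // size, (rank + 1) * N // size
        t0 = time.time()
        local = sum(i * i for i in range(start, stop))
        total = comm.reduce(local, op=MPI.SUM, root=0)
        if rank == 0:
            print(f"sum of squares = {total}, wall time on {size} ranks: {time.time() - t0:.2f}s")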

  2. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process

  3. Computational Analysis of Spray Jet Flames

    Science.gov (United States)

    Jain, Utsav

    There is a boost in the utilization of renewable sources of energy, but because of high-energy-density applications, combustion will never be obsolete. Spray combustion is a type of multiphase combustion which has tremendous engineering applications in different fields, varying from energy conversion devices to rocket propulsion systems. Developing accurate computational models for turbulent spray combustion is vital for improving the design of combustors and making them energy efficient. Flamelet models have been extensively used for gas phase combustion because of their relatively low computational cost to model the turbulence-chemistry interaction using a low dimensional manifold approach. This framework is designed for gas phase non-premixed combustion and its implementation is not very straightforward for multiphase and multi-regime combustion such as spray combustion. This is because of the use of a conserved scalar and various flamelet-related assumptions. Mixture fraction has been popularly employed as a conserved scalar and hence used to parameterize the characteristics of gaseous flamelets. However, for spray combustion, the mixture fraction is not monotonic and does not give a unique mapping in order to parameterize the structure of spray flames. In order to develop a flamelet type model for spray flames, a new variable called the mixing variable is introduced which acts as an ideal conserved scalar and takes into account the convection and evaporation of fuel droplets. In addition to the conserved scalar, it has been observed that though gaseous flamelets can be characterized by the conserved scalar and its dissipation, this might not be true for spray flamelets. Droplet dynamics has a significant influence on the spray flamelet and because of effects such as flame penetration of droplets and oscillation of droplets across the stagnation plane, it becomes important to accommodate their influence in the flamelet formulation. In order to recognize the

  4. Computational analysis of thresholds for magnetophosphenes

    International Nuclear Information System (INIS)

    Laakso, Ilkka; Hirata, Akimasa

    2012-01-01

    In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes, visual sensations of flashing light not caused by light. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way for determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m⁻² (−20% to +30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (−20% to +10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of

  5. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  6. Computed tomographic analysis of urinary calculi

    International Nuclear Information System (INIS)

    Naito, Akira; Ito, Katsuhide; Ito, Shouko

    1986-01-01

    Computed tomography (CT) was employed in an effort to analyze the chemical composition of urinary calculi. Twenty-three surgically removed calculi were scanned in a water bath (in vitro study). Fourteen of them were scanned in the body (in vivo study). The calculi consisted of four types: mixed calcium oxalate and phosphate, mixed calcium carbonate and phosphate, magnesium ammonium phosphate, and uric acid. The in vitro study showed that the mean and maximum CT values of uric acid stones were significantly lower than those of the other three types of stones. This indicated that stones with less than 450 HU are composed of uric acid. In the in vivo study, CT did not help to differentiate the three types of urinary calculi, except for uric acid stones. Regarding the mean CT values, there was no correlation between the in vitro and in vivo studies. An experiment with commercially available drugs showed that CT values of urinary calculi were not dependent upon the composition, but dependent upon the density of the calculi. (Namekawa, K.)

  7. Analysis of computational vulnerabilities in digital repositories

    Directory of Open Access Journals (Sweden)

    Valdete Fernandes Belarmino

    2015-04-01

    Full Text Available Objective. Demonstrates the results of research that aimed to analyze the computational vulnerabilities of digital repositories in public universities. Argues the relevance of information in contemporary societies as an invaluable resource, emphasizing scientific information as an essential element of scientific progress. Characterizes the emergence of digital repositories and highlights their use in the academic environment to preserve, promote, disseminate and encourage scientific production. Describes the main software for the construction of digital repositories. Method. The investigation identified and analyzed the vulnerabilities to which the digital repositories are exposed by running penetration tests, discriminating the levels of risk and the types of vulnerabilities. Results. From a sample of 30 repositories, 20 could be examined; it was identified that 5% of the repositories have critical vulnerabilities, 85% high, 25% medium and 100% low. Conclusions. This demonstrates the necessity of adopting actions for these environments that promote information security, minimizing the incidence of external and/or internal attacks.

  8. Classification and Analysis of Computer Network Traffic

    OpenAIRE

    Bujlow, Tomasz

    2014-01-01

    Traffic monitoring and analysis can be done for multiple different reasons: to investigate the usage of network resources, assess the performance of network applications, adjust Quality of Service (QoS) policies in the network, log the traffic to comply with the law, or create realistic models of traffic for academic purposes. We define the objective of this thesis as finding a way to evaluate the performance of various applications in a high-speed Internet infrastructure. To satisfy the obje...

  9. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data are a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics models developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.

  10. An approach to quantum-computational hydrologic inverse analysis.

    Science.gov (United States)

    O'Malley, Daniel

    2018-05-02

    Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
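
    A minimal, heavily hedged sketch of the problem form involved: a quantum annealer such as the D-Wave machine samples low-energy states of a QUBO (quadratic unconstrained binary optimization) objective. The toy below builds a 4-variable QUBO with made-up coefficients and minimizes it by brute force in place of an annealer; it illustrates the formulation only, not the authors' hydrologic mapping or the D-Wave API.

        # Minimal sketch: exhaustive minimisation of a tiny QUBO.
        from itertools import product

        Q = {  # illustrative QUBO coefficients: Q[(i, j)] multiplies x_i * x_j
            (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0, (3, 3): -1.0,
            (0, 1): 2.0, (1, 2): 0.5, (2, 3): 2.0,
        }

        def energy(x):
            """QUBO objective x^T Q x for a binary vector x."""
            return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

        best = min(product([0, 1], repeat=4), key=energy)
        print("lowest-energy bit string:", best, "energy:", energy(best))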

  11. A Braça da Rede, uma Técnica Caiçara de Medir

    Directory of Open Access Journals (Sweden)

    Gilberto Chieus Jr.

    2009-08-01

    Full Text Available This article describes how the caiçaras of the city of Ubatuba, on the northern coast of São Paulo state, measure their fishing nets. Before analyzing their measuring technique, we give a brief overview of caiçara culture and its transformations. We then present some historical moments in the construction of the metre. Next, we show how the caiçaras measure their nets, the problems that arose in Brazil during the implementation of the decimal metric system, and the resistance of certain communities that use other standards for their measurements, ignoring the current metric system because of their cultural context. This whole discussion is framed in a historical perspective of Ethnomathematics.

  12. Cafts: computer aided fault tree analysis

    International Nuclear Information System (INIS)

    Poucet, A.

    1985-01-01

    The fault tree technique has become a standard tool for the analysis of safety and reliability of complex systems. In spite of the costs, which may be high for a complete and detailed analysis of a complex plant, the fault tree technique is popular and its benefits are fully recognized. Due to this, applications of these codes have mostly been restricted to simple academic examples and rarely concern complex, real-world systems. In this paper an interactive approach to fault tree construction is presented. The aim is not to replace the analyst, but to offer him an intelligent tool which can assist him in modeling complex systems. Using the CAFTS method, the analyst interactively constructs a fault tree in two phases: (1) In a first phase he generates an overall failure logic structure of the system, the macrofault tree. In this phase, CAFTS features an expert system approach to assist the analyst. It makes use of a knowledge base containing generic rules on the behavior of subsystems and components; (2) In a second phase the macrofault tree is further refined and transformed into a fully detailed and quantified fault tree. In this phase a library of plant-specific component failure models is used

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  14. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  15. Student Study Choices in the Principles of Economics: A Case Study of Computer Usage

    OpenAIRE

    Grimes, Paul W.; Sanderson, Patricia L.; Ching, Geok H.

    1996-01-01

    Principles of Economics students at Mississippi State University were provided the opportunity to use computer assisted instruction (CAI) as a supplemental study activity. Students were free to choose the extent of their computer work. Throughout the course, weekly surveys were conducted to monitor the time each student spent with their textbook, computerized tutorials, workbook, class notes, and study groups. The surveys indicated that only a minority of the students actively pursued CAI....

  16. Ca-Fe and Alkali-Halide Alteration of an Allende Type B CAI: Aqueous Alteration in Nebular or Asteroidal Settings

    Science.gov (United States)

    Ross, D. K.; Simon, J. I.; Simon, S. B.; Grossman, L.

    2012-01-01

    Ca-Fe and alkali-halide alteration of CAIs is often attributed to aqueous alteration by fluids circulating on asteroidal parent bodies after the various chondritic components have been assembled, although debate continues about the roles of asteroidal vs. nebular modification processes [1-7]. Here we report detailed observations of alteration products in a large Type B2 CAI, TS4 from Allende, one of the oxidized subgroup of CV3s, and propose a speculative model for aqueous alteration of CAIs in a nebular setting. Ca-Fe alteration in this CAI consists predominantly of end-member hedenbergite, end-member andradite, and compositionally variable, magnesian high-Ca pyroxene. These phases are strongly concentrated in an unusual "nodule" enclosed within the interior of the CAI (Fig. 1). The Ca, Fe-rich nodule superficially resembles a clast that pre-dated and was engulfed by the CAI, but closer inspection shows that relic spinel grains are enclosed in the nodule, and corroded CAI primary phases interfinger with the Fe-rich phases at the nodule's margins. This CAI also contains abundant sodalite and nepheline (alkali-halide) alteration that occurs around the rims of the CAI, but also penetrates more deeply into the CAI. The two types of alteration (Ca-Fe and alkali-halide) are adjacent, and very fine-grained Fe-rich phases are associated with sodalite-rich regions. Both types of alteration appear to be replacive; if that is true, it would require substantial introduction of Fe, and transport of elements (Ti, Al and Mg) out of the nodule, and introduction of Na and Cl into alkali-halide rich zones. Parts of the CAI have been extensively metasomatized.

  17. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Computer-based image analysis methods for chest computed tomography (CT) used in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, the density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary function, disease progression, and mortality. Each method has merits that can be exploited in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologists for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
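
    As a rough illustration of the simpler measures listed above (a sketch, not a validated clinical method), the snippet below computes a mean lung attenuation, a density histogram, and a density-mask-style volume fraction from a synthetic Hounsfield-unit array; the array and the -250 HU threshold are placeholders, not values from the literature.

        # Minimal sketch: whole-lung CT statistics and a density-mask index.
        import numpy as np

        rng = np.random.default_rng(0)
        lung_hu = rng.normal(-800, 120, size=(64, 64, 64))   # placeholder lung voxels (HU)

        mean_ct = lung_hu.mean()                              # mean CT value of the lung
        hist, edges = np.histogram(lung_hu, bins=50)          # density histogram
        high_att_fraction = (lung_hu > -250).mean()           # density-mask style index

        print(f"mean lung attenuation: {mean_ct:.1f} HU")
        print(f"high-attenuation volume fraction: {100 * high_att_fraction:.2f}%")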

  18. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  19. Computer code for qualitative analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Yule, H.P.

    1979-01-01

    Computer code QLN1 provides complete analysis of gamma-ray spectra observed with Ge(Li) detectors and is used at both the National Bureau of Standards and the Environmental Protection Agency. It locates peaks, resolves multiplets, identifies component radioisotopes, and computes quantitative results. The qualitative-analysis (or component identification) algorithms feature thorough, self-correcting steps which provide accurate isotope identification in spite of errors in peak centroids, energy calibration, and other typical problems. The qualitative-analysis algorithm is described in this paper

  20. A single-chip computer analysis system for liquid fluorescence

    International Nuclear Information System (INIS)

    Zhang Yongming; Wu Ruisheng; Li Bin

    1998-01-01

    The single-chip computer analysis system for liquid fluorescence is an intelligent analytical instrument based on the principle that liquids containing hydrocarbons give out several characteristic fluorescences when irradiated by strong light. Besides a single-chip computer, the system makes use of the keyboard and the calculation and printing functions of a CASIO printing calculator. It combines optics, mechanics and electronics in one unit and is small, light and practical, so it can be used for surface water sample analysis in oil fields and impurity analysis of other materials

  1. A Computational Discriminability Analysis on Twin Fingerprints

    Science.gov (United States)

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical twins and fraternal twins) recently collected from a twin population, using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of fingerprints of twins is significantly different from that between genuine prints of the same finger. Twins' fingerprints are discriminable, with a 1.5%-1.7% higher EER than non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.
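
    For context, the equal error rate (EER) quoted above is the operating point where the false accept and false reject rates are equal. The sketch below (not the authors' matcher) estimates an EER from two synthetic score distributions standing in for same-finger and twin comparisons; all scores are placeholders.

        # Minimal sketch: estimate an equal error rate from match scores.
        import numpy as np

        rng = np.random.default_rng(1)
        genuine = rng.normal(0.75, 0.10, 2000)    # same-finger comparison scores (toy)
        impostor = rng.normal(0.45, 0.10, 2000)   # twin / non-mate comparison scores (toy)

        thresholds = np.linspace(0, 1, 1001)
        far = np.array([(impostor >= t).mean() for t in thresholds])  # false accept rate
        frr = np.array([(genuine < t).mean() for t in thresholds])    # false reject rate

        idx = np.argmin(np.abs(far - frr))        # threshold where the two rates cross
        print(f"EER ~ {100 * (far[idx] + frr[idx]) / 2:.2f}% at threshold {thresholds[idx]:.2f}")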

  2. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  3. Computer-Aided Communication Satellite System Analysis and Optimization.

    Science.gov (United States)

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  4. Microstructures of Hibonite From an ALH A77307 (CO3.0) CAI: Evidence for Evaporative Loss of Calcium

    Science.gov (United States)

    Han, Jangmi; Brearley, Adrian J.; Keller, Lindsay P.

    2014-01-01

    Hibonite is a comparatively rare, primary phase found in some CAIs from different chondrite groups and is also common in Wark-Lovering rims [1]. Hibonite is predicted to be one of the earliest refractory phases to form by equilibrium condensation from a cooling gas of solar composition [2] and, therefore, can be a potential recorder of very early solar system processes. In this study, we describe the microstructures of hibonite from one CAI in ALH A77307 (CO3.0) using FIB/TEM techniques in order to reconstruct its formational history.

  5. Calcium and Titanium Isotope Fractionation in CAIS: Tracers of Condensation and Inheritance in the Early Solar Protoplanetary Disk

    Science.gov (United States)

    Simon, J. I.; Jordan, M. K.; Tappa, M. J.; Kohl, I. E.; Young, E. D.

    2016-01-01

    The chemical and isotopic compositions of calcium-aluminum-rich inclusions (CAIs) can be used to understand the conditions present in the protoplanetary disk where they formed. The isotopic compositions of these early-formed nebular materials are largely controlled by chemical volatility. The isotopic effects of evaporation/sublimation, which are well explained by both theory and experimental work, lead to enrichments of the heavy isotopes that are often exhibited by the moderately refractory elements Mg and Si. Less well understood are the isotopic effects of condensation, which limits our ability to determine whether a CAI is a primary condensate and/or retains any evidence of its primordial formation history.

  6. Using the Computer to Improve Basic Skills.

    Science.gov (United States)

    Bozeman, William; Hierstein, William J.

    These presentations offer information on the benefits of using computer-assisted instruction (CAI) for remedial education. First, William J. Hierstein offers a summary of the Computer Assisted Basic Skills Project conducted by Southeastern Community College at the Iowa State Penitentiary. Hierstein provides background on the funding for the…

  7. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
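
    The tutorial's own examples are in MATLAB and R; as a hedged companion sketch in Python (with a made-up toy risk model), the same embarrassingly parallel pattern looks like this:

```python
# Sketch of an embarrassingly parallel Monte Carlo risk simulation.
# The toy loss model and all parameter values are hypothetical illustrations.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def simulate_batch(args):
    """Run one independent batch of the simulation model."""
    seed, n_runs = args
    rng = np.random.default_rng(seed)
    # Toy model: annual loss = event count (Poisson) x event severity (lognormal).
    frequency = rng.poisson(2.0, size=n_runs)
    severity = rng.lognormal(mean=10.0, sigma=1.0, size=n_runs)
    return frequency * severity

if __name__ == "__main__":
    n_workers, runs_per_worker = 4, 250_000
    tasks = [(seed, runs_per_worker) for seed in range(n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        losses = np.concatenate(list(pool.map(simulate_batch, tasks)))
    print(f"mean annual loss: {losses.mean():.0f}")
    print(f"99th percentile : {np.percentile(losses, 99):.0f}")
```

    Because each batch is independent, the only coordination needed is distributing seeds and concatenating results, which is what makes this class of problems "embarrassingly parallel".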

  8. Computer-Aided Qualitative Data Analysis with Word

    Directory of Open Access Journals (Sweden)

    Bruno Nideröst

    2002-05-01

    Full Text Available Despite some fragmentary references in the literature about qualitative methods, it is little known that Word can be successfully used for computer-aided Qualitative Data Analysis (QDA). Based on several Word standard operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to gain first experience with computer-aided analysis before investing time and money in a specialized QDA program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations towards computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225
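
    To illustrate the elementary QDA operations the article names (code-and-retrieve and frequency counts), a small sketch in Python rather than Word; the coded interview segments are invented examples:

```python
# Sketch of elementary QDA operations: code-and-retrieve and frequency counts.
# The coded segments are invented examples, not data from the article.
from collections import Counter

segments = [
    {"codes": ["motivation"], "text": "I started the course to change careers."},
    {"codes": ["barriers", "time"], "text": "Work left me little time to study."},
    {"codes": ["motivation", "family"], "text": "My family encouraged me to enrol."},
]

def retrieve(code):
    """Code-and-retrieve: return all segments tagged with a given code."""
    return [s["text"] for s in segments if code in s["codes"]]

# Frequency counts over all assigned codes.
counts = Counter(code for s in segments for code in s["codes"])

print(retrieve("motivation"))
print(counts.most_common())
```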

  9. Computer programs for analysis of geophysical data

    Energy Technology Data Exchange (ETDEWEB)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution.

  10. Computer programs for analysis of geophysical data

    International Nuclear Information System (INIS)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution

  11. Introducing remarks upon the analysis of computer systems performance

    International Nuclear Information System (INIS)

    Baum, D.

    1980-05-01

    Some of the basic ideas of analytical techniques to study the behaviour of computer systems are presented. Single systems as well as networks of computers are viewed as stochastic dynamical systems which may be modelled by queueing networks. Therefore this report primarily serves as an introduction to probabilistic methods for qualitative analysis of systems. It is supplemented by an application example of Chandy's collapsing method. (orig.) [de
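
    To make the queueing-model idea concrete, a minimal sketch of the standard M/M/1 formulas one would apply to a single service station; the arrival and service rates are illustrative values, not taken from the report:

```python
# Sketch of basic M/M/1 queueing metrics for a single service station.
# Arrival and service rates are illustrative values only.
def mm1_metrics(arrival_rate, service_rate):
    rho = arrival_rate / service_rate          # server utilisation
    if rho >= 1.0:
        raise ValueError("queue is unstable: utilisation >= 1")
    l = rho / (1.0 - rho)                      # mean number in system
    w = 1.0 / (service_rate - arrival_rate)    # mean time in system
    lq = rho ** 2 / (1.0 - rho)                # mean number waiting
    wq = rho / (service_rate - arrival_rate)   # mean waiting time
    return {"utilisation": rho, "L": l, "W": w, "Lq": lq, "Wq": wq}

print(mm1_metrics(arrival_rate=8.0, service_rate=10.0))
```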

  12. Computer-aided visualization and analysis system for sequence evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.

    2004-05-11

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  13. Strategic Analysis of Autodesk and the Move to Cloud Computing

    OpenAIRE

    Kewley, Kathleen

    2012-01-01

    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  14. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.

  15. A SURVEY ON DOCUMENT CLUSTERING APPROACH FOR COMPUTER FORENSIC ANALYSIS

    OpenAIRE

    Monika Raghuvanshi*, Rahul Patel

    2016-01-01

    In a forensic analysis, large numbers of files are examined. Much of the information is in unstructured format, so it is quite a difficult task for a computer forensic examiner to perform such an analysis. Performing the forensic analysis of documents within a limited period of time therefore requires a special approach such as document clustering. This paper reviews different document clustering algorithms and methodologies, for example k-means, k-medoid, single link, complete link, and average link, in accordance...
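
    A hedged sketch of the clustering step on toy documents, showing k-means and average-linkage clustering of TF-IDF vectors; scikit-learn and SciPy are assumed available, and the documents, cluster counts and parameters are invented:

```python
# Sketch of document clustering as used in forensic triage: TF-IDF vectors
# clustered with k-means and with average-linkage hierarchical clustering.
# Documents, k, and parameters are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster

docs = [
    "invoice payment bank transfer",
    "bank account transfer receipt",
    "meeting agenda project schedule",
    "project schedule milestones meeting",
]

X = TfidfVectorizer().fit_transform(docs)

# Partitional clustering (k-means).
km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Hierarchical clustering with average linkage on the dense vectors.
Z = linkage(X.toarray(), method="average", metric="cosine")
hc_labels = fcluster(Z, t=2, criterion="maxclust")

print("k-means labels:     ", list(km_labels))
print("average-link labels:", list(hc_labels))
```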

  16. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  17. PIXAN: the Lucas Heights PIXE analysis computer package

    International Nuclear Information System (INIS)

    Clayton, E.

    1986-11-01

    To fully utilise the multielement capability and short measurement time of PIXE it is desirable to have an automated computer evaluation of the measured spectra. Because of the complex nature of PIXE spectra, a critical step in the analysis is the data reduction, in which the areas of characteristic peaks in the spectrum are evaluated. In this package the computer program BATTY is presented for such an analysis. The second step is to determine element concentrations, knowing the characteristic peak areas in the spectrum. This requires a knowledge of the expected X-ray yield for that element in the sample. The computer program THICK provides that information for both thick and thin PIXE samples. Together, these programs form the package PIXAN used at Lucas Heights for PIXE analysis

  18. The effects of computer-assisted instruction on the mathematics performance and classroom behavior of children with ADHD.

    Science.gov (United States)

    Mautone, Jennifer A; DuPaul, George J; Jitendra, Asha K

    2005-08-01

    The present study examines the effects of computer-assisted instruction (CAI) on the mathematics performance and classroom behavior of three second- through fourth-grade students with ADHD. A controlled case study is used to evaluate the effects of the computer software on participants' mathematics performance and on-task behavior. Participants' mathematics achievement improves and their on-task behavior increases during the CAI sessions relative to independent seatwork conditions. In addition, students and teachers consider CAI to be an acceptable intervention for some students with ADHD who are having difficulty with mathematics. Implications of these results for practice and research are discussed.

  19. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.
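
    The generic idea — propagating known input uncertainties through a code to estimate output uncertainty — can be sketched as a simple Monte Carlo wrapper; the "fuel model" below is a made-up placeholder standing in for a code such as FRAP, and all distributions are invented:

```python
# Sketch of Monte Carlo uncertainty propagation around a computer code.
# The "model" is a made-up placeholder, not FRAP; inputs are assumed distributions.
import numpy as np

def fuel_model(power, gap_conductance, conductivity):
    """Placeholder response: a fictitious centreline-temperature surrogate."""
    return 600.0 + 8.0 * power / gap_conductance + 50.0 / conductivity

rng = np.random.default_rng(42)
n = 10_000

# Known (assumed) input uncertainties, sampled independently.
power = rng.normal(20.0, 1.0, n)                 # kW/m
gap = rng.lognormal(np.log(0.5), 0.10, n)        # arbitrary units
conductivity = rng.normal(3.0, 0.15, n)          # W/m-K

outputs = fuel_model(power, gap, conductivity)
print(f"mean output : {outputs.mean():.1f}")
print(f"std. dev.   : {outputs.std(ddof=1):.1f}")
print(f"95% interval: {np.percentile(outputs, [2.5, 97.5]).round(1)}")
```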

  20. Conceptual design of pipe whip restraints using interactive computer analysis

    International Nuclear Information System (INIS)

    Rigamonti, G.; Dainora, J.

    1975-01-01

    Protection against pipe break effects necessitates a complex interaction between failure mode analysis, piping layout, and structural design. Many iterations are required to finalize structural designs and equipment arrangements. The magnitude of the pipe break loads transmitted by the pipe whip restraints to structural embedments precludes the application of conservative design margins. A simplified analytical formulation of the nonlinear dynamic problems associated with pipe whip has been developed and applied using interactive computer analysis techniques. In the dynamic analysis, the restraint and the associated portion of the piping system are modeled using the finite element lumped mass approach to properly reflect the dynamic characteristics of the piping/restraint system. The analysis is performed as a series of piecewise linear increments. Each of these linear increments is terminated by either formation of plastic conditions or closing/opening of gaps. The stiffness matrix is modified to reflect the changed stiffness characteristics of the system and the analysis is restarted using the previous boundary conditions. The formation of yield hinges is related to the plastic moment of the section and unloading paths are automatically considered. The conceptual design of the piping/restraint system is performed using interactive computer analysis. The application of the simplified analytical approach with interactive computer analysis results in an order of magnitude reduction in engineering time and computer cost. (Auth.)

  1. Computer aided plant engineering: An analysis and suggestions for computer use

    International Nuclear Information System (INIS)

    Leinemann, K.

    1979-09-01

    To identify requirements and boundary conditions for computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for integration of CAD subsystems in plant engineering should be a central database, which is described by characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer-aided plant engineering are briefly illustrated by two examples. The analysis leads to the conclusion that an interactive graphic system for manipulation of net-like structured data, usable for various subtasks, should be the basis for computer-aided plant engineering. (orig.) [de

  2. Investigating the computer analysis of eddy current NDT data

    International Nuclear Information System (INIS)

    Brown, R.L.

    1979-01-01

    The objective of this activity was to investigate and develop techniques for computer analysis of eddy current nondestructive testing (NDT) data. A single frequency commercial eddy current tester and a precision mechanical scanner were interfaced with a PDP-11/34 computer to obtain and analyze eddy current data from samples of 316 stainless steel tubing containing known discontinuities. Among the data analysis techniques investigated were: correlation, Fast Fourier Transforms (FFT), clustering, and Adaptive Learning Networks (ALN). The results were considered encouraging. ALN, for example, correctly identified 88% of the defects and non-defects from a group of 153 signal indications
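
    As a small illustration of one of the listed techniques — extracting FFT-based features from an eddy current signal trace prior to classification — a sketch with a synthetic signal; the sampling rate, thresholds and feature choices are assumptions, and this is not the Adaptive Learning Network itself:

```python
# Sketch of FFT-based feature extraction from an eddy current scan signal.
# The signal is synthetic; the features are illustrative inputs for a classifier.
import numpy as np

fs = 1000.0                        # samples per second (hypothetical)
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)

# Synthetic probe signal: slow drift + a short defect-like transient + noise.
signal = 0.2 * np.sin(2 * np.pi * 2 * t) + 0.05 * rng.standard_normal(t.size)
signal[480:520] += np.hanning(40) * 1.0        # simulated defect indication

magnitude = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)

# Simple features one might feed to a classifier such as an ALN.
features = {
    "rms": float(np.sqrt(np.mean(signal ** 2))),
    "peak_freq_hz": float(freqs[np.argmax(magnitude[1:]) + 1]),
    "band_energy_10_100hz": float(np.sum(magnitude[(freqs >= 10) & (freqs <= 100)] ** 2)),
}
print(features)
```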

  3. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  4. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  5. Analysis of the computed tomography in the acute abdomen

    International Nuclear Information System (INIS)

    Hochhegger, Bruno; Moraes, Everton; Haygert, Carlos Jesus Pereira; Antunes, Paulo Sergio Pase; Gazzoni, Fernando; Lopes, Luis Felipe Dias

    2007-01-01

    Introduction: This study aims to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal and prospective study in which patients with a diagnosis of acute abdomen were analyzed. A total of 105 cases of acute abdomen were obtained and, after application of the exclusion criteria, 28 patients were included in the study. Results: Computed tomography changed the diagnostic hypothesis of the physicians in 50% of the cases (p 0.05); 78.57% of the patients had a surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). An accurate diagnosis by computed tomography, when compared to the anatomopathologic examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was done dividing the patients into surgical and nonsurgical groups, an accuracy of 89.28% was obtained (p 0.0001). A difference of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen managed without computed tomography. Conclusion: Computed tomography correlates with the anatomopathology and has great accuracy for surgical indication; it increases the physicians' confidence, reduces the hospitalization time, reduces the number of surgeries and is cost-effective. (author)

  6. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box, " you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white.This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  7. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence approaches constitute a relatively new interdisciplinary field of research with many promising application areas. Although the computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization in order to describe the common features of these algorithms. Then, two quantification indices, that is, the variation rate and the progress rate, are defined, respectively, to indicate the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, the sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing the martingale theory into the Markov chain analysis.

  8. Analysis of Biosignals During Immersion in Computer Games.

    Science.gov (United States)

    Yeo, Mina; Lim, Seokbeen; Yoon, Gilwon

    2017-11-17

    The number of computer game users is increasing as computers and various IT devices connected to the Internet are commonplace among all ages. In this research, in order to relate behavioral activity to its associated biosignals, biosignal changes before, during, and after computer games were measured and analyzed for 31 subjects. For this purpose, a device to measure electrocardiogram, photoplethysmogram and skin temperature was developed such that the effect of motion artifacts could be minimized. The device was made wearable for convenient measurement. The game selected for the experiments was League of Legends™. Analysis of pulse transit time, heart rate variability and skin temperature showed increased sympathetic nerve activity during computer games, while the parasympathetic nerves became less active. Interestingly, the sympathetic predominance group showed less change in heart rate variability as compared to the normal group. The results can be valuable for studying internet gaming disorder.
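
    A hedged sketch of the kind of time-domain heart rate variability summary used in such analyses, computed from R-R intervals; the interval series below is synthetic and does not come from the study's recordings:

```python
# Sketch of time-domain heart rate variability (HRV) measures from R-R intervals.
# The R-R series is synthetic, not data from the study.
import numpy as np

rng = np.random.default_rng(7)
rr_ms = 800.0 + 40.0 * rng.standard_normal(300)    # R-R intervals in milliseconds

mean_hr = 60_000.0 / rr_ms.mean()                  # beats per minute
sdnn = rr_ms.std(ddof=1)                           # overall variability
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))      # short-term (vagally mediated) variability

print(f"mean HR : {mean_hr:.1f} bpm")
print(f"SDNN    : {sdnn:.1f} ms")
print(f"RMSSD   : {rmssd:.1f} ms")
```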

  9. Two years since SSAMS: Status of {sup 14}C AMS at CAIS

    Energy Technology Data Exchange (ETDEWEB)

    Ravi Prasad, G.V.; Cherkinsky, Alexander; Culp, Randy A.; Dvoracek, Doug K.

    2015-10-15

    The NEC 250 kV single stage AMS accelerator (SSAMS) was installed two years ago at the Center for Applied Isotope Studies (CAIS), University of Georgia. The accelerator is primarily being used for radiocarbon measurements to test the authenticity of natural and bio-based samples, while all other samples, such as geological, atmospheric, marine and archaeological ones, are run on the 500 kV, NEC 1.5SDH-1 model tandem accelerator, which has been operating since 2001. The data obtained over a six-month period for OXI, OXII, ANU sucrose and FIRI-D are discussed. The mean value of ANU sucrose was observed to be slightly lower than the consensus value. The processed blanks on SSAMS produce a lower apparent age compared to the tandem accelerator, as expected.

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  11. Hunting and use of terrestrial fauna used by Caiçaras from the Atlantic Forest coast (Brazil

    Directory of Open Access Journals (Sweden)

    Alves Rômulo RN

    2009-11-01

    Full Text Available Abstract Background The Brazilian Atlantic Forest is considered one of the hotspots for conservation, comprising remnants of rain forest along the eastern Brazilian coast. Its native inhabitants on the Southeastern coast include the Caiçaras (descendants from Amerindians and European colonizers), with a deep knowledge of the natural resources used for their livelihood. Methods We studied the use of the terrestrial fauna in three Caiçara communities, through open-ended interviews with 116 native residents. Data were checked through systematic observations and collection of zoological material. Results The dependence on the terrestrial fauna by Caiçaras is especially for food and medicine. The main species used are Didelphis spp., Dasyprocta azarae, Dasypus novemcinctus, and small birds (several species of Turdidae). Contrasting with a high dependency on terrestrial fauna resources by native Amazonians, the Caiçaras do not show a constant dependency on these resources. Nevertheless, the occasional hunting of native animals represents a complementary source of animal protein. Conclusion Indigenous or local knowledge on native resources is important in order to promote local development in a sustainable way, and can help to conserve biodiversity, particularly if the resource is sporadically used and not commercially exploited.

  12. From Corporate Social Responsibility, through Entrepreneurial Orientation, to Knowledge Sharing: A Study in Cai Luong (Renovated Theatre) Theatre Companies

    Science.gov (United States)

    Tuan, Luu Trong

    2015-01-01

    Purpose: This paper aims to examine the role of antecedents such as corporate social responsibility (CSR) and entrepreneurial orientation in the chain effect to knowledge sharing among members of Cai Luong theatre companies in the Vietnamese context. Knowledge sharing contributes to the depth of the knowledge pool of both the individuals and the…

  13. Changes in flavour and microbial diversity during natural fermentation of suan-cai, a traditional food made in Northeast China.

    Science.gov (United States)

    Wu, Rina; Yu, Meiling; Liu, Xiaoyu; Meng, Lingshuai; Wang, Qianqian; Xue, Yating; Wu, Junrui; Yue, Xiqing

    2015-10-15

    We measured changes in the main physical and chemical properties, flavour compounds and microbial diversity in suan-cai during natural fermentation. The results showed that the pH and concentration of soluble protein initially decreased but were then maintained at a stable level; the concentration of nitrite increased in the initial fermentation stage and after reaching a peak it decreased significantly to a low level by the end of fermentation. Suan-cai was rich in 17 free amino acids. All of the free amino acids increased in concentration to different degrees, except histidine. Total free amino acids reached their highest levels in the mid-fermentation stage. The 17 volatile flavour components identified at the start of fermentation increased to 57 by the mid-fermentation stage; esters and aldehydes were in the greatest diversity and abundance, contributing most to the aroma of suan-cai. Bacteria were more abundant and diverse than fungi in suan-cai; 14 bacterial species were identified from the genera Leuconostoc, Bacillus, Pseudomonas and Lactobacillus. The predominant fungal species identified were Debaryomyces hansenii, Candida tropicalis and Penicillium expansum. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Multiple Nebular Gas Reservoirs Recorded by Oxygen Isotope Variation in a Spinel-Rich CAI in CO3 MIL 090019

    Science.gov (United States)

    Simon, J. I.; Simon, S. B.; Nguyen, A. N.; Ross, D. K.; Messenger, S.

    2017-07-01

    We conducted NanoSIMS ion imaging studies of a primitive spinel-rich CAI from the MIL 090019 CO3 chondrite. It records radial O-isotopic heterogeneity among multiple occurrences of the same mineral, reflecting distinct nebular O-isotopic reservoirs.

  15. Dietary Changes over Time in a Caiçara Community from the Brazilian Atlantic Forest

    Directory of Open Access Journals (Sweden)

    Priscila L. MacCord

    2006-12-01

    Full Text Available Because they are occurring at an accelerated pace, changes in the livelihoods of local coastal communities, including nutritional aspects, have been a subject of interest in human ecology. The aim of this study is to explore the dietary changes, particularly in the consumption of animal protein, that have taken place in Puruba Beach, a rural community of caiçaras on the São Paulo Coast, Brazil, over the 10-yr period from 1992-1993 to 2002-2003. Data were collected during six months in 1992-1993 and during the same months in 2002-2003 using the 24-hr recall method. We found an increasing dependence on external products in the most recent period, along with a reduction in fish consumption and in the number of fish species eaten. These changes, possibly associated with other nonmeasured factors such as overfishing and unplanned tourism, may cause food delocalization and a reduction in the use of natural resources. Although the consequences for conservation efforts in the Atlantic Forest and the survival of the caiçaras must still be evaluated, these local inhabitants may be finding a way to reconcile both the old and the new dietary patterns by keeping their houses in the community while looking for sources of income other than natural resources. The prospect shown here may reveal facets that can influence the maintenance of this and other communities undergoing similar processes by, for example, shedding some light on the ecological and economical processes that may occur within their environment and in turn affect the conservation of the resources upon which the local inhabitants depend.

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  17. SALP-PC, a computer program for fault tree analysis on personal computers

    International Nuclear Information System (INIS)

    Contini, S.; Poucet, A.

    1987-01-01

    The paper presents the main characteristics of the SALP-PC computer code for fault tree analysis. The program has been developed in Fortran 77 on an Olivetti M24 personal computer (IBM compatible) in order to reach a high degree of portability. It is composed of six processors implementing the different phases of the analysis procedure. This particular structure presents some advantages like, for instance, the restart facility and the possibility to develop an event tree analysis code. The set of allowed logical operators, i.e. AND, OR, NOT, K/N, XOR, INH, together with the possibility to define boundary conditions, makes the SALP-PC code a powerful tool for risk assessment. (orig.)
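
    To illustrate the gate types listed (AND, OR, K/N), a minimal sketch that evaluates a small fault tree by Monte Carlo simulation over independent basic events; the tree structure and the basic-event probabilities are invented and have nothing to do with SALP-PC's input format:

```python
# Sketch of fault tree evaluation by Monte Carlo over independent basic events.
# The tree structure and basic-event probabilities are invented examples.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Basic event failure probabilities (hypothetical).
p = {"pump_a": 0.02, "pump_b": 0.02, "valve": 0.01,
     "sensor1": 0.05, "sensor2": 0.05, "sensor3": 0.05}
events = {name: rng.random(n) < prob for name, prob in p.items()}

# Gates: AND, 2-out-of-3 (K/N), and OR at the top.
pumps_fail = events["pump_a"] & events["pump_b"]                   # AND gate
sensors_fail = (events["sensor1"].astype(int) + events["sensor2"]
                + events["sensor3"]) >= 2                          # 2/3 (K/N) gate
top_event = pumps_fail | events["valve"] | sensors_fail            # OR gate

print(f"estimated top-event probability: {top_event.mean():.5f}")
```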

  18. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or computer implementation--of numerical algorithms, depending on the background and interests of students. Designed for upper-division undergraduates in mathematics or computer science classes, the textbook assumes that students have prior knowledge of linear algebra and calculus, although these topics are reviewed in the text. Short discussions of the history of numerical methods are interspersed throughout the chapters. The book a...

  19. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  20. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  1. Practical computer analysis of switch mode power supplies

    CERN Document Server

    Bennett, Johnny C

    2006-01-01

    When designing switch-mode power supplies (SMPSs), engineers need much more than simple "recipes" for analysis. Such plug-and-go instructions are not at all helpful for simulating larger and more complex circuits and systems. Offering more than merely a "cookbook," Practical Computer Analysis of Switch Mode Power Supplies provides a thorough understanding of the essential requirements for analyzing SMPS performance characteristics. It demonstrates the power of the circuit averaging technique when used with powerful computer circuit simulation programs. The book begins with SMPS fundamentals and the basics of circuit averaging models, reviewing most basic topologies and explaining all of their various modes of operation and control. The author then discusses the general analysis requirements of power supplies and how to develop the general types of SMPS models, demonstrating the use of SPICE for analysis. He examines the basic first-order analyses generally associated with SMPS performance along with more pra...

  2. The role of the computer in automated spectral analysis

    International Nuclear Information System (INIS)

    Rasmussen, S.E.

    This report describes how a computer can be an extremely valuable tool for routine analysis of spectra, which is a time-consuming process. A number of general-purpose algorithms that are available for the various phases of the analysis can be implemented, if these algorithms are designed to cope with all the variations that may occur. Since this is basically impossible, one must find a compromise between obscure errors and program complexity. This is usually possible with human interaction at critical points. In spectral analysis this is possible if the user scans the data on an interactive graphics terminal, makes the necessary changes and then returns control to the computer for completion of the analysis.

  3. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  4. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  5. MULGRES: a computer program for stepwise multiple regression analysis

    Science.gov (United States)

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for the most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
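
    A minimal sketch of the stepwise-deletion (backward elimination) idea on synthetic data, using p-values as the deletion criterion; the 0.05 threshold and the data are hypothetical choices, and this is not the MULGRES source deck:

```python
# Sketch of stepwise deletion (backward elimination) for multiple regression.
# Synthetic data; the 0.05 deletion threshold is an illustrative choice.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
X = pd.DataFrame({
    "x1": rng.standard_normal(n),
    "x2": rng.standard_normal(n),
    "x3": rng.standard_normal(n),      # irrelevant predictor
})
y = 2.0 * X["x1"] - 1.0 * X["x2"] + rng.standard_normal(n)

predictors = list(X.columns)
while predictors:
    model = sm.OLS(y, sm.add_constant(X[predictors])).fit()
    pvals = model.pvalues.drop("const")
    worst = pvals.idxmax()
    if pvals[worst] < 0.05:            # all remaining predictors are significant
        break
    predictors.remove(worst)           # delete the least significant variable

print("retained predictors:", predictors)
```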

  6. Conversation Analysis in Computer-Assisted Language Learning

    Science.gov (United States)

    González-Lloret, Marta

    2015-01-01

    The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…

  7. Computational content analysis of European Central Bank statements

    NARCIS (Netherlands)

    Milea, D.V.; Almeida, R.J.; Sharef, N.M.; Kaymak, U.; Frasincar, F.

    2012-01-01

    In this paper we present a framework for the computational content analysis of European Central Bank (ECB) statements. Based on this framework, we provide two approaches that can be used in a practical context. Both approaches use the content of ECB statements to predict upward and downward movement

  8. Componential analysis of kinship terminology a computational perspective

    CERN Document Server

    Pericliev, V

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  9. HAMOC: a computer program for fluid hammer analysis

    International Nuclear Information System (INIS)

    Johnson, H.G.

    1975-12-01

    A computer program has been developed for fluid hammer analysis of piping systems attached to a vessel which has undergone a known rapid pressure transient. The program is based on the characteristics method for solution of the partial differential equations of motion and continuity. Column separation logic is included for situations in which pressures fall to saturation values
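
    A hedged sketch of the characteristics-method scheme for a single pipe with an upstream reservoir and an instantaneously closed downstream valve; the geometry, wave speed and friction values are invented, and this is not the HAMOC program:

```python
# Sketch of the method of characteristics for water hammer in a single pipe
# with an upstream reservoir and instantaneous downstream valve closure.
# All pipe data are invented illustrations; this is not the HAMOC program.
import numpy as np

g, L, a, D, f = 9.81, 600.0, 1200.0, 0.5, 0.018    # SI units (hypothetical pipe)
A = np.pi * D ** 2 / 4.0
N = 20                                             # number of reaches
dx = L / N
dt = dx / a                                        # Courant number = 1
B = a / (g * A)                                    # characteristic impedance term
R = f * dx / (2.0 * g * D * A ** 2)                # friction term

H0, Q0 = 100.0, 0.4                                # reservoir head (m), steady flow (m3/s)
x = np.linspace(0.0, L, N + 1)
H = H0 - (f * x / (2.0 * g * D * A ** 2)) * Q0 * abs(Q0)   # steady-state head line
Q = np.full(N + 1, Q0)

Hmax_valve = H[-1]
for _ in range(4 * N):                             # two wave round-trips (4 L / a)
    CP = H[:-1] + B * Q[:-1] - R * Q[:-1] * np.abs(Q[:-1])   # C+ from node i-1
    CM = H[1:] - B * Q[1:] + R * Q[1:] * np.abs(Q[1:])       # C- from node i+1
    Hn, Qn = H.copy(), Q.copy()
    Hn[1:-1] = 0.5 * (CP[:-1] + CM[1:])
    Qn[1:-1] = (CP[:-1] - CM[1:]) / (2.0 * B)
    Hn[0], Qn[0] = H0, (H0 - CM[0]) / B            # upstream reservoir boundary
    Qn[-1], Hn[-1] = 0.0, CP[-1]                   # closed valve boundary
    H, Q = Hn, Qn
    Hmax_valve = max(Hmax_valve, H[-1])

print(f"maximum head at the closed valve: {Hmax_valve:.1f} m")
```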

  10. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
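
    A hedged sketch of the core computation — estimating a focal coefficient under every combination of control variables and summarising the resulting modeling distribution; the data are synthetic and this is not the authors' statistical software:

```python
# Sketch of multimodel (model-robustness) analysis: fit the focal predictor
# with every subset of controls and summarise the distribution of its estimates.
# Data are synthetic; this is not the authors' software.
from itertools import combinations
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "focal": rng.standard_normal(n),
    "c1": rng.standard_normal(n),
    "c2": rng.standard_normal(n),
    "c3": rng.standard_normal(n),
})
y = 0.5 * df["focal"] + 0.3 * df["c1"] + rng.standard_normal(n)

controls = ["c1", "c2", "c3"]
estimates = []
for k in range(len(controls) + 1):
    for subset in combinations(controls, k):
        X = sm.add_constant(df[["focal", *subset]])
        estimates.append(sm.OLS(y, X).fit().params["focal"])

estimates = np.array(estimates)
print(f"{len(estimates)} model specifications")
print(f"focal coefficient: mean {estimates.mean():.3f}, "
      f"range [{estimates.min():.3f}, {estimates.max():.3f}]")
```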

  11. Informational-computer system for the neutron spectra analysis

    International Nuclear Information System (INIS)

    Berzonis, M.A.; Bondars, H.Ya.; Lapenas, A.A.

    1979-01-01

    In this article, the basic principles of the build-up of the informational-computer system for neutron spectra analysis on the basis of measured reaction rates are given. The basic data files of the system and the software and hardware needed for the system's operation are described.

  12. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric power system. The program uses the symmetrical components method to compute all phase and sequence quantities for any bus or branch of a given power network ...
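
    As a small illustration of the symmetrical components method such programs rely on, a sketch that converts a set of unbalanced phase currents into zero-, positive- and negative-sequence components; the example phasors are invented and unrelated to SCAP's data format:

```python
# Sketch of the symmetrical components transform used in short circuit studies.
# The unbalanced phase currents below are invented example phasors (per unit).
import numpy as np

a = np.exp(2j * np.pi / 3)                    # 120-degree rotation operator
A = np.array([[1, 1, 1],
              [1, a ** 2, a],
              [1, a, a ** 2]])                # phase = A @ [I0, I1, I2]

I_abc = np.array([10.0 + 0.0j,
                  2.0 * np.exp(-1j * 2.4),
                  2.5 * np.exp(1j * 2.6)])    # unbalanced phase currents a, b, c

# Solve A @ I_012 = I_abc for the zero-, positive- and negative-sequence currents.
I0, I1, I2 = np.linalg.solve(A, I_abc)

for name, value in (("zero", I0), ("positive", I1), ("negative", I2)):
    print(f"{name:>8}-sequence: |I| = {abs(value):.2f} pu, "
          f"angle = {np.degrees(np.angle(value)):.1f} deg")
```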

  13. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pasccci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  14. COALA--A Computational System for Interlanguage Analysis.

    Science.gov (United States)

    Pienemann, Manfred

    1992-01-01

    Describes a linguistic analysis computational system that responds to highly complex queries about morphosyntactic and semantic structures contained in large sets of language acquisition data by identifying, displaying, and analyzing sentences that meet the defined linguistic criteria. (30 references) (Author/CB)

  15. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    Brauer, F.P.; Fager, J.E.

    1976-01-01

    A mini-computer-based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system.

  16. Building a Prototype of LHC Analysis Oriented Computing Centers

    Science.gov (United States)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype Analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept, with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and on storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical networking for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for an exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We will report about the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  17. Building a Prototype of LHC Analysis Oriented Computing Centers

    International Nuclear Information System (INIS)

    Bagliesi, G; Boccali, T; Della Ricca, G; Donvito, G; Paganoni, M

    2012-01-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype Analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept, with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and on storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical networking for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for an exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We will report about the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  18. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  20. The Range of Initial 10Be/9Be Ratios in the Early Solar System: A Re-Assessment Based on Analyses of New CAIs and Melilite Composition Glass Standards

    Science.gov (United States)

    Dunham, E.; Wadhwa, M.; Liu, M.-C.

    2017-07-01

    We report a more accurate range of initial 10Be/9Be in CAIs including FUN CAI CMS-1 from Allende (CV3) and a new CAI from NWA 5508 (CV3) using melilite composition glass standards; we suggest 10Be is largely produced by irradiation in the nebula.

  1. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Science.gov (United States)

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  2. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

    In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role of theories for human cognitions. Many computational models have simulated results of controlled psychological experiments successfully. However, there have been only a few attempts to apply the models to complex realistic phenomena. We call such a situation ``open-ended situation''. In this study, MAC/FAC (``many are called, but few are chosen''), proposed by [Forbus 95], that models two stages of analogical reasoning was applied to our open-ended psychological experiment. In our experiment, subjects were presented a cue story, and retrieved cases that had been learned in their everyday life. Following this, they rated inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of the MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support the MAC/FAC's theoretical assumption - different similarities are involved on the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognitions.
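
    As a rough illustration of the first (MAC) stage described above, the sketch below scores memory cases against a cue by a normalised dot product of predicate-count content vectors; the predicates, case names and scoring details are hypothetical, and the structural (FAC) stage of MAC/FAC is not reproduced.

```python
# Hedged sketch of a MAC-style content-vector comparison, assuming content
# vectors are simple predicate counts compared by a cosine-style dot product.
from collections import Counter
from math import sqrt

def content_vector(predicates):
    """Count how often each predicate occurs in a case description."""
    return Counter(predicates)

def dot_similarity(v1, v2):
    """Normalised dot product between two content vectors."""
    dot = sum(v1[k] * v2[k] for k in v1.keys() & v2.keys())
    norm = sqrt(sum(c * c for c in v1.values())) * sqrt(sum(c * c for c in v2.values()))
    return dot / norm if norm else 0.0

cue = content_vector(["cause", "flow", "pressure", "greater", "flow"])
memory = {
    "water-flow story": content_vector(["cause", "flow", "pressure", "greater"]),
    "heat-flow story": content_vector(["cause", "flow", "temperature", "greater"]),
    "unrelated story": content_vector(["buy", "sell", "price"]),
}

# Rank stored cases by content similarity to the cue, most similar first.
for name, vec in sorted(memory.items(), key=lambda kv: -dot_similarity(cue, kv[1])):
    print(f"{name}: {dot_similarity(cue, vec):.2f}")
```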

  3. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Directory of Open Access Journals (Sweden)

    Seyhan Yazar

    Full Text Available A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  4. Developing Computer-Assisted Instruction Multimedia For Educational Technology Course of Coastal Area Students

    Science.gov (United States)

    Idris, Husni; Nurhayati, Nurhayati; Satriani, Satriani

    2018-05-01

    This research aims to a) identify instructional software (interactive multimedia CDs) by developing Computer-Assisted Instruction (CAI) multimedia that is eligible to be used in the instruction of the Educational Technology course; b) analyze the role of instructional software (interactive multimedia CDs) in the Educational Technology course through the development of Computer-Assisted Instruction (CAI) multimedia to improve the quality of education and instructional activities. This is Research and Development (R&D). It employed the descriptive procedural model of development, which outlines the steps to be taken to develop a product, which is instructional multimedia. The number of subjects of the research trial or respondents for each stage was 20 people. To maintain development quality, an expert in materials outside the materials under study, an expert in materials who is also an Educational Technology lecturer, a small group of 3 students, a medium-sized group of 10 students, and 20 students participating in the field testing took part in this research. Then, data collection instruments were developed in two stages, namely: a) developing the instruments; and b) trying out the instruments. Data on students’ responses were collected using questionnaires and analyzed using descriptive statistics with percentage and categorization techniques. Based on the data analysis results, it is revealed that the Computer-Assisted Instruction (CAI) multimedia developed and tried out among students during the preliminary field testing falls into the “Good” category, with the aspects of instruction, materials, and media falling into the “Good” category. Subsequently, results of the main field testing among students also suggest that it falls into the “Good” category, with the aspects of instruction, materials, and media falling into the “Good” category. Similarly, results of the operational field testing among students also suggest that it falls into the

  5. Dietetics students' ability to choose appropriate communication and counseling methods is improved by teaching behavior-change strategies in computer-assisted instruction.

    Science.gov (United States)

    Puri, Ruchi; Bell, Carol; Evers, William D

    2010-06-01

    Several models and theories have been proposed to help registered dietitians (RD) counsel and communicate nutrition information to patients. However, there is little time for students or interns to observe and/or participate in counseling sessions. Computer-assisted instruction (CAI) can be used to give students more opportunity to observe the various methods and theories of counseling. This study used CAI simulations of RD-client communications to examine whether students who worked through the CAI modules would choose more appropriate counseling methods. Modules were created based on information from experienced RD. They contained videos of RD-patient interactions and demonstrated helpful and less helpful methods of communication. Students in didactic programs in dietetics accessed the modules via the Internet. The intervention group of students received a pretest module, two tutorial modules, and a posttest module. The control group only received the pretest and posttest modules. Data were collected during three semesters in 2006 and 2007. Two sample t tests were used to compare pretest and posttest scores. The influence of other factors was measured using factorial analysis of variance. Statistical significance was set at P<0.05. Working through the CAI modules improved dietetics students' ability to choose appropriate communication and counseling methods. 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  6. Computer-assisted instruction: a library service for the community teaching hospital.

    Science.gov (United States)

    McCorkel, J; Cook, V

    1986-04-01

    This paper reports on five years of experience with computer-assisted instruction (CAI) at Winthrop-University Hospital, a major affiliate of the SUNY at Stony Brook School of Medicine. It compares CAI programs available from Ohio State University and Massachusetts General Hospital (accessed by telephone and modem), and software packages purchased from the Health Sciences Consortium (MED-CAPS) and Scientific American (DISCOTEST). The comparison documents one library's experience of the cost of these programs and the use made of them by medical students, house staff, and attending physicians. It describes the space allocated for necessary equipment, as well as the marketing of CAI. Finally, in view of the decision of the National Board of Medical Examiners to administer the Part III examination on computer (the so-called CBX) starting in 1988, the paper speculates on the future importance of CAI in the community teaching hospital.

  7. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
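
    A minimal sketch of the conventional quantities involved is shown below: an equivalent hydraulic diameter (D_h = 4A/P_w) feeding a Darcy-type wall drag per unit length. The friction-factor switch, fluid properties and cell geometry are illustrative assumptions and do not reproduce the velocity-profile-based treatment described above.

```python
def hydraulic_diameter(flow_area, wetted_perimeter):
    """Conventional equivalent hydraulic diameter D_h = 4A / P_w."""
    return 4.0 * flow_area / wetted_perimeter

def wall_drag_per_length(rho, mu, velocity, flow_area, wetted_perimeter):
    """Darcy-Weisbach wall drag force per unit length for one cell (sketch only)."""
    d_h = hydraulic_diameter(flow_area, wetted_perimeter)
    re = rho * velocity * d_h / mu
    # Assumed friction-factor switch: laminar (64/Re) below Re = 2300, Blasius above.
    f = 64.0 / re if re < 2300.0 else 0.316 * re ** -0.25
    dp_dx = f * rho * velocity ** 2 / (2.0 * d_h)     # pressure gradient, Pa/m
    return dp_dx * flow_area                          # drag force per metre, N/m

# Water-like properties in a 1 cm^2 cell with a 4 cm wetted perimeter at 2 m/s.
print(f"{wall_drag_per_length(998.0, 1.0e-3, 2.0, 1.0e-4, 4.0e-2):.3f} N/m")
```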

  8. Available computer codes and data for radiation transport analysis

    International Nuclear Information System (INIS)

    Trubey, D.K.; Maskewitz, B.F.; Roussin, R.W.

    1975-01-01

    The Radiation Shielding Information Center (RSIC), sponsored and supported by the Energy Research and Development Administration (ERDA) and the Defense Nuclear Agency (DNA), is a technical institute serving the radiation transport and shielding community. It acquires, selects, stores, retrieves, evaluates, analyzes, synthesizes, and disseminates information on shielding and ionizing radiation transport. The major activities include: (1) operating a computer-based information system and answering inquiries on radiation analysis, (2) collecting, checking out, packaging, and distributing large computer codes, and evaluated and processed data libraries. The data packages include multigroup coupled neutron-gamma-ray cross sections and kerma coefficients, other nuclear data, and radiation transport benchmark problem results

  9. Computational analysis in support of the SSTO flowpath test

    Science.gov (United States)

    Duncan, Beverly S.; Trefny, Charles J.

    1994-10-01

    A synergistic approach of combining computational methods and experimental measurements is used in the analysis of a hypersonic inlet. There are four major focal points within this study which examine the boundary layer growth on a compression ramp upstream of the cowl lip of a scramjet inlet. Initially, the boundary layer growth on the NASP Concept Demonstrator Engine (CDE) is examined. The follow-up study determines the optimum diverter height required by the SSTO Flowpath test to best duplicate the CDE results. These flow field computations are then compared to the experimental measurements and the mass average Mach number is determined for this inlet.

  10. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  11. Computation system for nuclear reactor core analysis. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  12. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  13. Application of computer aided tolerance analysis in product design

    International Nuclear Information System (INIS)

    Du Hua

    2009-01-01

    This paper introduces the shortcomings of the traditional tolerance design method and the strengths of the computer aided tolerancing (CAT) method, compares the three tolerance analysis methods (Worst Case Analysis, Statistical Analysis and Monte-Carlo Simulation Analysis), and outlines the basic procedure and relevant details for CAT. The reactor pressure vessel, the core barrel, the hold-down barrel and the support plate are used as study objects to build the tolerance simulation model, based on their 3D design models. The tolerance simulation analysis is then conducted and the tolerance distribution scheme is optimized based on the analysis results. (authors)

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  15. Computational image analysis of Suspension Plasma Sprayed YSZ coatings

    Directory of Open Access Journals (Sweden)

    Michalak Monika

    2017-01-01

    Full Text Available The paper presents computational studies of microstructure- and topography-related features of suspension plasma sprayed (SPS) coatings of yttria-stabilized zirconia (YSZ). The study mainly covers the porosity assessment, provided by ImageJ software analysis. The influence of boundary conditions, defined by (i) circularity and (ii) size limits, on the computed values of porosity is also investigated. Additionally, a digital topography evaluation is performed: a confocal laser scanning microscope (CLSM) and a scanning electron microscope (SEM) operating in Shape from Shading (SFS) mode measure the surface roughness of the deposited coatings. Computed values of porosity and roughness are related to the variables of the spraying process, which influence the morphology of the coatings and determine their possible fields of application.
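
    The porosity filter described above (size and circularity limits applied to segmented pores) can be sketched as follows; the synthetic image, the thresholds and the use of scikit-image in place of ImageJ are assumptions for illustration only.

```python
import numpy as np
from skimage import measure  # assumption: scikit-image stands in for ImageJ

rng = np.random.default_rng(0)

# Synthetic binary cross-section: True = pore, False = coating material.
image = np.zeros((200, 200), dtype=bool)
yy, xx = np.ogrid[:200, :200]
for cy, cx in rng.integers(10, 190, size=(30, 2)):
    r = int(rng.integers(2, 8))
    image |= (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2

min_area, min_circularity = 10, 0.4          # hypothetical boundary conditions
kept_area = 0
for region in measure.regionprops(measure.label(image)):
    perimeter = region.perimeter
    # Circularity = 4*pi*A / P^2 (1.0 for a perfect circle).
    circularity = 4.0 * np.pi * region.area / perimeter ** 2 if perimeter > 0 else 0.0
    if region.area >= min_area and circularity >= min_circularity:
        kept_area += region.area

print(f"porosity within limits: {100.0 * kept_area / image.size:.2f}%")
```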

  16. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important and is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. The computer aided education and training system for accident management

    International Nuclear Information System (INIS)

    Yoneyama, Mitsuru; Kubota, Ryuji; Fujiwara, Tadashi; Sakuma, Hitoshi

    1999-01-01

    The education and training system for Accident Management was developed by the Japanese BWR group and Hitachi Ltd. The education and training system is composed of two systems. One is a computer-aided instruction (CAI) education system and the other is an education and training system with computer simulations. Both systems are designed to be executed on personal computers. The outlines of the CAI education system and of the education and training system with the simulator are reported below. These systems provide plant operators and technical support center staff with effective education and training for accident management. (author)

  18. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes having scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol's indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computationally expensive computer codes which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows estimation of the sensitivity indices of each scalar model input, while the 'dispersion model' allows derivation of the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates the nuclear fuel irradiation.
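
    For orientation, the sketch below estimates first-order Sobol' indices for a cheap analytical test function with a plain pick-freeze Monte Carlo estimator; the test function and estimator are standard textbook illustrations and stand in for, rather than reproduce, the joint GLM/GAM metamodelling approach of the paper.

```python
import numpy as np

def model(x):
    # Ishigami test function with three scalar inputs, each uniform on [-pi, pi].
    return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2 + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])

rng = np.random.default_rng(42)
n, d = 100_000, 3
a = rng.uniform(-np.pi, np.pi, size=(n, d))
b = rng.uniform(-np.pi, np.pi, size=(n, d))

ya, yb = model(a), model(b)
var_y = ya.var()

for i in range(d):
    ab_i = b.copy()
    ab_i[:, i] = a[:, i]                               # freeze input i, resample the rest
    s_i = np.mean(ya * (model(ab_i) - yb)) / var_y     # Saltelli-style first-order estimator
    print(f"S_{i + 1} = {s_i:.3f}")
```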

  19. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  20. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  1. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    Science.gov (United States)

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual based analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual based approach is effective in identifying trends and anomalies of the systems.
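
    A toy version of the underlying idea (summarising each node by a multivariate time series of sampled metrics and flagging nodes whose behaviour departs from the fleet) is sketched below; the metrics, the injected anomaly and the distance-to-median measure are illustrative assumptions, not the authors' similarity measure or layout method.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, n_samples, n_metrics = 20, 60, 3    # e.g. CPU load, memory, network

# Hypothetical sampled profiles for each node (values are illustrative only).
profiles = rng.normal(0.5, 0.05, size=(n_nodes, n_samples, n_metrics))
profiles[7, 30:, 0] += 0.4                   # inject a CPU-load anomaly on node 7

# Distance of each node's behaviour from the fleet-median behaviour over time.
median_profile = np.median(profiles, axis=0)
distance = np.linalg.norm(profiles - median_profile, axis=(1, 2))

# Report the three most anomalous nodes.
for node in np.argsort(distance)[::-1][:3]:
    print(f"node {node}: distance from median behaviour {distance[node]:.2f}")
```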

  2. Ubiquitous computing in sports: A review and analysis.

    Science.gov (United States)

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for a synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade. This development has propagated in applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis on new technological developments is performed. Sensors for position and motion detection, and such for equipment and physiological monitoring are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rules' compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.

  3. Gas analysis by computer-controlled microwave rotational spectrometry

    International Nuclear Information System (INIS)

    Hrubesh, L.W.

    1978-01-01

    Microwave rotational spectrometry has inherently high resolution and is thus nearly ideal for qualitative gas mixture analysis. Quantitative gas analysis is also possible by a simplified method which utilizes the ease with which molecular rotational transitions can be saturated at low microwave power densities. This article describes a computer-controlled microwave spectrometer which is used to demonstrate for the first time a totally automated analysis of a complex gas mixture. Examples are shown for a complete qualitative and quantitative analysis, in which a search of over 100 different compounds is made in less than 7 min, with sensitivity for most compounds in the 10 to 100 ppm range. This technique is expected to find increased use in view of the reduced complexity and increased reliability of microwave spectrometers and because of new energy-related applications for analysis of mixtures of small molecules.

  4. Thermohydraulic analysis of nuclear power plant accidents by computer codes

    International Nuclear Information System (INIS)

    Petelin, S.; Stritar, A.; Istenic, R.; Gregoric, M.; Jerele, A.; Mavko, B.

    1982-01-01

    The RELAP4/MOD6, BRUCH-D-06, CONTEMPT-LT-28, RELAP5/MOD1 and COBRA-4-1 codes were successfully implemented on the CYBER 172 computer in Ljubljana. Input models of NPP Krsko for the first three codes were prepared. Because of the high computer cost, only one analysis of a double-ended guillotine break of the cold leg of NPP Krsko has been done with the RELAP4 code. The BRUCH code is easier and cheaper to use, and several analyses have been done with it. A sensitivity study was performed with CONTEMPT-LT-28 for a double-ended pump suction break. These codes are intended to be used as a basis for independent safety analyses. (author)

  5. Computational Analysis on Performance of Thermal Energy Storage (TES) Diffuser

    Science.gov (United States)

    Adib, M. A. H. M.; Adnan, F.; Ismail, A. R.; Kardigama, K.; Salaam, H. A.; Ahmad, Z.; Johari, N. H.; Anuar, Z.; Azmi, N. S. N.

    2012-09-01

    Application of a thermal energy storage (TES) system reduces cost and energy consumption. The performance of the overall operation is affected by the diffuser design. In this study, computational analysis is used to determine the thermocline thickness. Three-dimensional simulations with different tank height-to-diameter ratios (HD), diffuser openings and numbers of diffuser holes are investigated. Simulations of medium-HD tanks with a double-ring octagonal diffuser show good thermocline behavior and a clear distinction between warm and cold water. The results show that the best thermocline thickness at 50% charging time occurs in the medium tank with a height-to-diameter ratio of 4.0 and a double-ring octagonal diffuser with 48 holes (9 mm opening, ~60%), which is acceptable compared with the 6 mm (~40%) and 12 mm (~80%) openings. In conclusion, the computational analysis method is very useful in studying the performance of thermal energy storage (TES).
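
    One common way to quantify thermocline thickness, sketched below, is the height interval over which the dimensionless temperature rises from 0.1 to 0.9 of its range; the synthetic profile and the 10-90% definition are assumptions for illustration and may differ from the criterion used in the simulations above.

```python
import numpy as np

# Hypothetical vertical temperature profile at 50% charging time (illustrative).
height = np.linspace(0.0, 4.0, 200)                                  # m, bottom to top
temperature = 12.0 + 8.0 / (1.0 + np.exp(-(height - 2.0) / 0.15))    # degC

# Dimensionless temperature; thermocline thickness = span between 10% and 90%.
theta = (temperature - temperature.min()) / (temperature.max() - temperature.min())
z_low = np.interp(0.1, theta, height)    # theta increases monotonically with height
z_high = np.interp(0.9, theta, height)
print(f"thermocline thickness: {z_high - z_low:.2f} m")
```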

  6. Computational Analysis on Performance of Thermal Energy Storage (TES) Diffuser

    International Nuclear Information System (INIS)

    Adib, M A H M; Ismail, A R; Kardigama, K; Salaam, H A; Ahmad, Z; Johari, N H; Anuar, Z; Azmi, N S N; Adnan, F

    2012-01-01

    Application of a thermal energy storage (TES) system reduces cost and energy consumption. The performance of the overall operation is affected by the diffuser design. In this study, computational analysis is used to determine the thermocline thickness. Three-dimensional simulations with different tank height-to-diameter ratios (HD), diffuser openings and numbers of diffuser holes are investigated. Simulations of medium-HD tanks with a double-ring octagonal diffuser show good thermocline behavior and a clear distinction between warm and cold water. The results show that the best thermocline thickness at 50% charging time occurs in the medium tank with a height-to-diameter ratio of 4.0 and a double-ring octagonal diffuser with 48 holes (9 mm opening ∼ 60%), which is acceptable compared with the 6 mm (∼40%) and 12 mm (∼80%) openings. In conclusion, the computational analysis method is very useful in studying the performance of thermal energy storage (TES).

  7. [Computers in biomedical research: I. Analysis of bioelectrical signals].

    Science.gov (United States)

    Vivaldi, E A; Maldonado, P

    2001-08-01

    A personal computer equipped with an analog-to-digital conversion card is able to input, store and display signals of biomedical interest. These signals can additionally be submitted to ad-hoc software for analysis and diagnosis. Data acquisition is based on the sampling of a signal at a given rate and amplitude resolution. The automation of signal processing involves syntactic aspects (data transduction, conditioning and reduction) and semantic aspects (feature extraction to describe and characterize the signal and diagnostic classification). The analytical approach that is at the basis of computer programming allows for the successful resolution of apparently complex tasks. Two basic principles involved are the definition of simple fundamental functions that are then iterated and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.
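
    The acquisition step described above, sampling at a given rate and amplitude resolution, can be illustrated with a short sketch; the sampling rate, converter resolution and test signal below are hypothetical values, not those of any particular acquisition card.

```python
import numpy as np

fs = 250              # assumed sampling rate in Hz
bits = 12             # assumed amplitude resolution of the A/D converter
full_scale = 5.0      # assumed +/- 2.5 V input range

t = np.arange(0.0, 2.0, 1.0 / fs)
signal = 1.2 * np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.sin(2 * np.pi * 50.0 * t)

# Quantise to the converter's step size to mimic the stored digital samples.
step = full_scale / 2 ** bits
digitised = np.round(signal / step) * step
print(f"quantisation step: {step * 1e3:.2f} mV, "
      f"max quantisation error: {np.max(np.abs(signal - digitised)) * 1e3:.2f} mV")
```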

  8. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    Science.gov (United States)

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.

    2017-04-01

    Computational singular perturbation (CSP) is a useful method for analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at micro or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and associated algorithm, that can be used not only to construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.

  9. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operating procedures on the man-machine interfaces of a control room, provides a quantified assessment, and at the same time analyzes operator error rates by means of techniques for human error rate prediction. Problems with the placement of man-machine interfaces in a control room and with the arrangement of instruments can be detected from the simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant.

  10. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer based system for data acquisition, analysis and graphic displays relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  11. A Computable OLG Model for Gender and Growth Policy Analysis

    OpenAIRE

    Pierre-Richard Agénor

    2012-01-01

    This paper develops a computable Overlapping Generations (OLG) model for gender and growth policy analysis. The model accounts for human and physical capital accumulation (both public and private), intra- and inter-generational health persistence, fertility choices, and women's time allocation between market work, child rearing, and home production. Bargaining between spouses and gender bias, in the form of discrimination in the work place and mothers' time allocation between daughters and so...

  12. Computers in activation analysis and gamma-ray spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Carpenter, B. S.; D'Agostino, M. D.; Yule, H. P. [eds.]

    1979-01-01

    Seventy-three papers are included under the following session headings: analytical and mathematical methods for data analysis; software systems for gamma-ray and x-ray spectrometry; gamma-ray spectra treatment, peak evaluation; least squares; IAEA intercomparison of methods for processing spectra; computer and calculator utilization in spectrometer systems; and applications in safeguards, fuel scanning, and environmental monitoring. Separate abstracts were prepared for 72 of those papers. (DLC)

  13. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns the time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. With respect to problems in which the time-dependent behavior is significant, it is desirable to incorporate a procedure which is workable with the mechanical model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperature for a long period of time. Special considerations are crucial if the analysis is to be extended to the large strain regime where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  17. Trend Analysis of the Brazilian Scientific Production in Computer Science

    Directory of Open Access Journals (Sweden)

    TRUCOLO, C. C.

    2014-12-01

    Full Text Available The growth in the volume and diversity of scientific information brings new challenges in understanding the reasons, the process and the real essence that propel this growth. This information can be used as the basis for the development of strategies and public policies to improve education and innovation services. Trend analysis is one of the steps in this direction. In this work, a trend analysis of the Brazilian scientific production of graduate programs in the computer science area is carried out to identify the main subjects studied by these programs, both overall and individually.

  18. A visual interface to computer programs for linkage analysis.

    Science.gov (United States)

    Chapman, C J

    1990-06-01

    This paper describes a visual approach to the input of information about human families into computer data bases, making use of the GEM graphic interface on the Atari ST. Similar approaches could be used on the Apple Macintosh or on the IBM PC AT (to which it has been transferred). For occasional users of pedigree analysis programs, this approach has considerable advantages in ease of use and accessibility. An example of such use might be the analysis of risk in families with Huntington disease using linked RFLPs. However, graphic interfaces do make much greater demands on the programmers of these systems.

  19. Advances in Computational Stability Analysis of Composite Aerospace Structures

    International Nuclear Information System (INIS)

    Degenhardt, R.; Araujo, F. C. de

    2010-01-01

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents different advances from the area of computational stability analysis of composite aerospace structures which contribute to that field. For stringer-stiffened panels, main results of the finished EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  20. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  1. Current topics in pure and computational complex analysis

    CERN Document Server

    Dorff, Michael; Lahiri, Indrajit

    2014-01-01

    The book contains 13 articles, some of which are survey articles and others research papers. Written by eminent mathematicians, these articles were presented at the International Workshop on Complex Analysis and Its Applications held at Walchand College of Engineering, Sangli. All the contributing authors are actively engaged in research fields related to the topic of the book. The workshop offered a comprehensive exposition of the recent developments in geometric functions theory, planar harmonic mappings, entire and meromorphic functions and their applications, both theoretical and computational. The recent developments in complex analysis and its applications play a crucial role in research in many disciplines.

  2. ASAS: Computational code for Analysis and Simulation of Atomic Spectra

    Directory of Open Access Journals (Sweden)

    Jhonatha R. dos Santos

    2017-01-01

    Full Text Available The laser isotopic separation process is based on the selective photoionization principle and, because of this, it is necessary to know the absorption spectrum of the desired atom. Computational resources have become indispensable for the planning of experiments and the analysis of the acquired data. The ASAS (Analysis and Simulation of Atomic Spectra) software presented here is a helpful tool to be used in studies involving atomic spectroscopy. The input for the simulations is user-friendly and essentially needs a database containing the energy levels and spectral lines of the atoms to be studied.

  3. Critical Data Analysis Precedes Soft Computing Of Medical Data

    DEFF Research Database (Denmark)

    Keyserlingk, Diedrich Graf von; Jantzen, Jan; Berks, G.

    2000-01-01

    extracted. The factors had different relationships (loadings) to the symptoms. Although the factors were gained only by computations, they seemed to express some modular features of the language disturbances. This phenomenon, that factors represent superior aspects of data, is well known in factor analysis...... the deficits in communication. Sets of symptoms corresponding to the traditional symptoms in Broca and Wernicke aphasia may be represented in the factors, but the factor itself does not represent a syndrome. It is assumed that this kind of data analysis shows a new approach to the understanding of language...

  4. Establishment of computer code system for nuclear reactor design - analysis

    International Nuclear Information System (INIS)

    Subki, I.R.; Santoso, B.; Syaukat, A.; Lee, S.M.

    1996-01-01

    The establishment of a computer code system for nuclear reactor design analysis is described in this paper. This establishment is an effort to provide the capability to run various codes, from nuclear data to reactor design, and to promote the capability for nuclear reactor design analysis, particularly from the neutronics and safety points of view. It is also an effort to enhance the coordination of nuclear code application and development across the various research centres in Indonesia. Very promising results have been obtained with the help of IAEA technical assistance. (author). 6 refs, 1 fig., 1 tab

  5. Analysis and computation of microstructure in finite plasticity

    CERN Document Server

    Hackl, Klaus

    2015-01-01

    This book addresses the need for a fundamental understanding of the physical origin, the mathematical behavior, and the numerical treatment of models which include microstructure. Leading scientists present their efforts involving mathematical analysis, numerical analysis, computational mechanics, material modelling and experiment. The mathematical analyses are based on methods from the calculus of variations, while in the numerical implementation global optimization algorithms play a central role. The modeling covers all length scales, from the atomic structure up to macroscopic samples. The development of the models was guided by experiments on single crystals and polycrystals, and the results are checked against experimental data.

  6. Computer-Assisted Instruction and Continuing Motivation.

    Science.gov (United States)

    Mosley, Mary Lou; And Others

    Effects of two feedback conditions--comment and no comment--on the motivation of sixth grade students to continue with computer assisted instruction (CAI) were investigated, and results for boys and for girls were compared. Subjects were 62 students--29 boys and 33 girls--from a suburban elementary school who were randomly assigned to the comment…

  7. Computer assessment of interview data using latent semantic analysis.

    Science.gov (United States)

    Dam, Gregory; Kaufmann, Stefan

    2008-02-01

    Clinical interviews are a powerful method for assessing students' knowledge and conceptual development. However, the analysis of the resulting data is time-consuming and can create a "bottleneck" in large-scale studies. This article demonstrates the utility of computational methods in supporting such an analysis. Thirty-four 7th-grade student explanations of the causes of Earth's seasons were assessed using latent semantic analysis (LSA). Analyses were performed on transcriptions of student responses during interviews administered prior to (n = 21) and after (n = 13) receiving earth science instruction. An instrument that uses LSA technology was developed to identify misconceptions and assess conceptual change in students' thinking. Its accuracy, as determined by comparing its classifications to the independent coding performed by four human raters, reached 90%. Techniques for adapting LSA technology to support the analysis of interview data, as well as some limitations, are discussed.
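
    The general LSA workflow, projecting texts into a low-dimensional semantic space and scoring a response by cosine similarity to labelled reference explanations, can be sketched as below; the reference texts, the scikit-learn pipeline and the two-component space are illustrative assumptions, not the instrument developed in the study.

```python
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical reference explanations; labels and texts are illustrative only.
references = {
    "scientific": "the tilt of earth's axis changes how directly sunlight strikes each hemisphere",
    "misconception": "the earth is closer to the sun in summer and farther away in winter",
}
student_response = "summer happens because the earth moves nearer to the sun"

# Build a tiny LSA space from the reference texts plus the new response.
corpus = list(references.values()) + [student_response]
tfidf = TfidfVectorizer().fit_transform(corpus)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Score the response against each reference by cosine similarity in LSA space.
scores = cosine_similarity(lsa[-1:], lsa[:-1])[0]
for label, score in zip(references, scores):
    print(f"{label}: {score:.2f}")
print("classified as:", max(zip(scores, references))[1])
```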

  8. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation
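
    The overhead-apportionment rule described above (indirect hours distributed in proportion to each project's share of the district's direct hours) amounts to a one-line calculation; the sketch below uses made-up project names and hours for illustration only.

```python
# Illustrative project names and hour totals; not data from the report.
direct_hours = {"Surface Water": 820.0, "Ground Water": 1240.0, "General Investigations": 440.0}
overhead_hours = 300.0

total_direct = sum(direct_hours.values())
for project, hours in direct_hours.items():
    share = hours / total_direct                      # project's share of direct effort
    charged = hours + overhead_hours * share
    print(f"{project}: {hours:.0f} direct + {overhead_hours * share:.1f} overhead = {charged:.1f} h")
```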

  9. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  10. Crystal structures of coordination polymers from CaI2 and proline

    Directory of Open Access Journals (Sweden)

    Kevin Lamberts

    2015-06-01

    Full Text Available Completing our reports concerning the reaction products from calcium halides and the amino acid proline, two different solids were found for the reaction of l- and dl-proline with CaI2. The enantiopure amino acid yields the one-dimensional coordination polymer catena-poly[[aqua-μ3-l-proline-tetra-μ2-l-proline-dicalcium] tetraiodide 1.7-hydrate], {[Ca2(C5H9NO2)5(H2O)]I4·1.7H2O}n, (1), with two independent Ca2+ cations in characteristic seven- and eightfold coordination. Five symmetry-independent zwitterionic l-proline molecules bridge the metal sites into a cationic polymer. Racemic proline forms with Ca2+ cations heterochiral chains of the one-dimensional polymer catena-poly[[diaquadi-μ2-dl-proline-calcium] diiodide], {[Ca(C5H9NO2)2(H2O)2]I2}n, (2). The centrosymmetric structure is built by one Ca2+ cation that is bridged towards its symmetry equivalents by two zwitterionic proline molecules. In both structures, the iodide ions remain non-coordinating and hydrogen bonds are formed between these counter-anions, the amino groups, and the coordinating and co-crystallized water molecules. While the overall composition of (1) and (2) is in line with other structures from calcium halides and amino acids, the diversity of the carboxylate coordination geometry is quite surprising.

  11. Bird community structure in riparian environments in Cai River, Rio Grande do Sul, Brazil

    Directory of Open Access Journals (Sweden)

    Jaqueline Brummelhaus

    2012-06-01

    Full Text Available Urbanization produces changes in riparian environments, causing effects in the structure of bird communities, which present different responses to the impacts. We compare species richness, abundance, and composition of birds in riparian environments with different characteristics in Cai River, Rio Grande do Sul, Brazil. We carried out observations in woodland, grassland, and urban environments between September 2007 and August 2008. We listed 130 bird species, 29 species unique to the woodland environment, and an endangered species: Triclaria malachitacea. Bird abundance differed from woodland (n = 426 individuals) to urban environments (n = 939 individuals) (F2,6 = 7.315; P = 0.025). Species composition and feeding guilds differed significantly in the bird community structures among these three riparian environments. In the grassland and urban environments there were more generalist insectivorous species, while in the woodland environments we found more leaf and trunk insectivorous species and frugivorous species, sensitive to human impacts. Bird species can be biological quality indicators, and they perform relevant functions in ecosystems. With knowledge of bird community structure and its needs, it is possible to implement management practices for restoration of degraded riparian environments.

  12. Changes of Benthic Macroinvertebrates in Thi Vai River and Cai Mep Estuaries Under Polluted Conditions with Industrial Wastewater

    Directory of Open Access Journals (Sweden)

    Huong Nguyen Thi Thanh

    2017-06-01

    Full Text Available The pollution on the Thi Vai River has been spreading out rapidly over the last two decades, caused by the wastewater from the industrial parks on the left bank of Thi Vai River and Cai Mep Estuaries. The evaluation of the benthic macroinvertebrate changes was necessary to identify the consequences of the industrial wastewater on the water quality and aquatic ecosystem of Thi Vai River and Cai Mep Estuaries. In this study, the variables of benthic macroinvertebrates and water quality were investigated in Thi Vai River and Cai Mep Estuaries, Southern Vietnam. The monitoring data of benthic macroinvertebrates and water quality parameters covered the period from 1989 to 2015 at 6 sampling sites in Thi Vai River and Cai Mep Estuaries. The basic water quality parameters were also tested, including pH, dissolved oxygen (DO), total nitrogen, and total phosphorus. The biodiversity indices of benthic macroinvertebrates were applied for water quality assessment. The results showed that pH ranged from 6.4 – 7.6 during the monitoring. The DO concentrations were between 0.20 - 6.70 mg/L. The concentrations of total nitrogen and total phosphorus ranged from 0.03 - 5.70 mg/L and 0.024 - 1.380 mg/L, respectively. The macroinvertebrate community in the study area consisted of 36 species of polychaeta, gastropoda, bivalvia, and crustacea, of which species of polychaeta were dominant in species number. The benthic macroinvertebrate density ranged from 0 - 2.746 individuals/m2, with the main dominant species being Neanthes caudata, Prionospio malmgreni, Paraprionospio pinnata, Trichochaeta carica, Maldane sarsi, Capitella capitata, Terebellides stroemi, Euditylia polymorpha, Grandidierella lignorum, and Apseudes vietnamensis. The biodiversity index values during the monitoring period characterized the aquatic environmental conditions as mesotrophic to polytrophic. Besides, species richness positively correlated with DO, total nitrogen, and total phosphorus. The results

  13. Analysis of multigrid methods on massively parallel computers: Architectural implications

    Science.gov (United States)

    Matheson, Lesley R.; Tarjan, Robert E.

    1993-01-01

    We study the potential performance of multigrid algorithms running on massively parallel computers with the intent of discovering whether presently envisioned machines will provide an efficient platform for such algorithms. We consider the domain parallel version of the standard V cycle algorithm on model problems, discretized using finite difference techniques in two and three dimensions on block structured grids of size 10(exp 6) and 10(exp 9), respectively. Our models of parallel computation were developed to reflect the computing characteristics of the current generation of massively parallel multicomputers. These models are based on an interconnection network of 256 to 16,384 message passing, 'workstation size' processors executing in an SPMD mode. The first model accomplishes interprocessor communications through a multistage permutation network. The communication cost is a logarithmic function which is similar to the costs in a variety of different topologies. The second model allows single stage communication costs only. Both models were designed with information provided by machine developers and utilize implementation derived parameters. With the medium grain parallelism of the current generation and the high fixed cost of an interprocessor communication, our analysis suggests an efficient implementation requires the machine to support the efficient transmission of long messages (up to 1000 words), or the high initiation cost of a communication must be significantly reduced through an alternative optimization technique. Furthermore, with variable length message capability, our analysis suggests the low diameter multistage networks provide little or no advantage over a simple single stage communications network.
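
    The following is a hedged sketch, not the authors' model, of how a per-level V-cycle cost with a logarithmic communication term can be estimated; the machine parameters (alpha, beta, flop_time, words_per_exchange) are illustrative assumptions:

        import math

        # Sketch of a V-cycle cost model: per level, local relaxation work plus a
        # message-passing term with start-up cost alpha and per-word cost beta;
        # the multistage-network variant carries a log2(P) factor.
        def vcycle_time(n_points, n_procs, levels, alpha, beta, flop_time, words_per_exchange):
            total = 0.0
            for level in range(levels):
                pts = n_points / (4 ** level)            # 2-D coarsening: 4x fewer points per level
                local = (pts / n_procs) * flop_time      # relaxation work per processor
                comm = alpha * math.log2(n_procs) + beta * words_per_exchange
                total += local + comm
            return total

        # Illustrative (made-up) machine parameters
        print(vcycle_time(1e6, 4096, 10, alpha=1e-4, beta=1e-7, flop_time=1e-8, words_per_exchange=1000))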

  14. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    Full Text Available A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource to developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and to follow their spatial evolution over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that better matches experimental data. Our computational analysis produces for the first time a two dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axis. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulations taking tissue movement into account and to investigate PD patterning hypothesis.

  15. Analysis of sponge zones for computational fluid mechanics

    International Nuclear Information System (INIS)

    Bodony, Daniel J.

    2006-01-01

    The use of sponge regions, or sponge zones, which add the forcing term -σ(q - q_ref) to the right-hand side of the governing equations in computational fluid mechanics as an ad hoc boundary treatment is widespread. They are used to absorb and minimize reflections from computational boundaries and as forcing sponges to introduce prescribed disturbances into a calculation. A less common usage is as a means of extending a calculation from a smaller domain into a larger one, such as in computing the far-field sound generated in a localized region. By analogy to the penalty method of finite elements, the method is placed on a solid foundation, complete with estimates of convergence. The analysis generalizes the work of Israeli and Orszag [M. Israeli, S.A. Orszag, Approximation of radiation boundary conditions, J. Comp. Phys. 41 (1981) 115-135] and confirms their findings when applied as a special case to one-dimensional wave propagation in an absorbing sponge. It is found that the rate of convergence of the actual solution to the target solution, with an appropriate norm, is inversely proportional to the sponge strength. A detailed analysis for acoustic wave propagation in one dimension verifies the convergence rate given by the general theory. The exponential point-wise convergence derived by Israeli and Orszag in the high-frequency limit is recovered and found to hold over all frequencies. A weakly nonlinear analysis of the method when applied to Burgers' equation shows similar convergence properties. Three numerical examples are given to confirm the analysis: the acoustic extension of a two-dimensional time-harmonic point source, the acoustic extension of a three-dimensional initial-value problem of a sound pulse, and the introduction of unstable eigenmodes from linear stability theory into a two-dimensional shear layer.
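
    A minimal sketch of the sponge treatment for 1-D linear advection is given below; the sigma profile, domain and parameters are assumptions for illustration, not the configuration analysed in the paper:

        import numpy as np

        nx, L, c, cfl = 400, 10.0, 1.0, 0.4
        dx = L / nx
        dt = cfl * dx / c
        x = np.linspace(0.0, L, nx)

        # Sponge strength: zero in the interior, ramping up over the last 20% of the domain
        sigma = np.where(x > 0.8 * L, 50.0 * ((x - 0.8 * L) / (0.2 * L)) ** 2, 0.0)

        q = np.exp(-((x - 2.0) / 0.3) ** 2)   # initial Gaussian pulse
        q_ref = np.zeros_like(q)              # target (quiescent) state

        for _ in range(1500):
            dqdx = (q - np.roll(q, 1)) / dx   # first-order upwind difference for c > 0
            q = q - dt * (c * dqdx + sigma * (q - q_ref))   # advection plus sponge forcing

    The pulse is advected into the sponge layer and damped toward q_ref, so that only a weak disturbance returns into the interior; increasing the sponge strength or its length reduces the residual error, in line with the convergence estimates discussed above.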

  16. Computer image analysis of etched tracks from ionizing radiation

    Science.gov (United States)

    Blanford, George E.

    1994-01-01

    I proposed to continue a cooperative research project with Dr. David S. McKay concerning image analysis of tracks. Last summer we showed that we could measure track densities using the Oxford Instruments eXL computer and software that is attached to an ISI scanning electron microscope (SEM) located in building 31 at JSC. To reduce the dependence on JSC equipment, we proposed to transfer the SEM images to UHCL for analysis. Last summer we developed techniques to use digitized scanning electron micrographs and computer image analysis programs to measure track densities in lunar soil grains. Tracks were formed by highly ionizing solar energetic particles and cosmic rays during near surface exposure on the Moon. The track densities are related to the exposure conditions (depth and time). Distributions of the number of grains as a function of their track densities can reveal the modality of soil maturation. As part of a consortium effort to better understand the maturation of lunar soil and its relation to its infrared reflectance properties, we worked on lunar samples 67701,205 and 61221,134. These samples were etched for a shorter time (6 hours) than last summer's sample and this difference has presented problems for establishing the correct analysis conditions. We used computer counting and measurement of area to obtain preliminary track densities and a track density distribution that we could interpret for sample 67701,205. This sample is a submature soil consisting of approximately 85 percent mature soil mixed with approximately 15 percent immature, but not pristine, soil.
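
    A hedged sketch of the counting step (an assumed workflow, not the Oxford eXL software) is to threshold the digitized SEM image, label connected dark regions as candidate track pits, and divide the count by the imaged area:

        import numpy as np
        from scipy import ndimage

        def track_density(image, pixel_size_um, threshold):
            """image: 2-D grayscale array; pixel_size_um: microns per pixel."""
            pits = image < threshold                   # etched pits appear dark
            labels, n_tracks = ndimage.label(pits)     # connected-component labelling
            area_cm2 = image.size * (pixel_size_um * 1e-4) ** 2
            return n_tracks / area_cm2                 # tracks per cm^2

        # Synthetic example: a bright background with two dark "pits"
        rng = np.random.default_rng(0)
        img = rng.normal(200.0, 10.0, size=(512, 512))
        img[100:103, 100:103] = 50.0
        img[300:302, 200:202] = 40.0
        print(track_density(img, pixel_size_um=0.05, threshold=150.0))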

  17. Students' perceptions of a multimedia computer-aided instruction ...

    African Journals Online (AJOL)

    Objective. To develop an interactive multimedia-based computer-aided instruction (CAI) programme, to determine its educational worth and efficacy in a multicultural academic environment and to evaluate its usage by students with differing levels of computer literacy. Design. A prospective descriptive study evaluating ...

  18. Computational analysis of the SRS Phase III salt disposition alternatives

    International Nuclear Information System (INIS)

    Dimenna, R.A.

    2000-01-01

    In late 1997, the In-Tank Precipitation (ITP) facility was shut down and an evaluation of alternative methods to process the liquid high-level waste stored in the Savannah River Site High-Level Waste storage tanks was begun. The objective was to determine whether another process might avoid the operational difficulties encountered with ITP at a lower cost than modifying the existing process. A structured approach was used to evaluate the proposed alternatives on a common basis and to identify the best one. Results from the computational analysis were a key part of the input used to select a primary and a secondary salt disposition alternative. This paper describes the process by which the computation needs were identified, addressed, and accomplished with a limited staff under stringent schedule constraints.

  19. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1990-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems

  20. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab

  1. Computer code for general analysis of radon risks (GARR)

    International Nuclear Information System (INIS)

    Ginevan, M.

    1984-09-01

    This document presents a computer model for general analysis of radon risks that allows the user to specify a large number of possible models with a small number of simple commands. The model is written in a version of BASIC which conforms closely to the American National Standards Institute (ANSI) definition for minimal BASIC and thus is readily modified for use on a wide variety of computers and, in particular, microcomputers. Model capabilities include generation of single-year life tables from 5-year abridged data, calculation of multiple-decrement life tables for lung cancer for the general population, smokers, and nonsmokers, and a cohort lung cancer risk calculation that allows specification of level and duration of radon exposure, the form of the risk model, and the specific population assumed at risk. 36 references, 8 figures, 7 tables
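
    A hedged sketch of the cohort risk calculation described above is given below; the hazard rates, exposure values and the linear relative-risk form are illustrative assumptions, not values or models from the report:

        # Follow an exposed and an unexposed cohort year by year through a life table
        # and count the excess lung cancer cases attributable to radon exposure.
        def cohort_excess_cases(baseline_mortality, baseline_lung_ca, exposure_wlm_per_year,
                                exposure_years, risk_per_wlm, cohort_size=100_000):
            alive_exposed = alive_control = float(cohort_size)
            cases_exposed = cases_control = 0.0
            cumulative_wlm = 0.0
            for age, (q_all, q_lung) in enumerate(zip(baseline_mortality, baseline_lung_ca)):
                if age < exposure_years:
                    cumulative_wlm += exposure_wlm_per_year
                rr = 1.0 + risk_per_wlm * cumulative_wlm        # linear relative-risk model
                cases_exposed += alive_exposed * q_lung * rr
                cases_control += alive_control * q_lung
                alive_exposed *= 1.0 - q_all - q_lung * (rr - 1.0)
                alive_control *= 1.0 - q_all
            return cases_exposed - cases_control

        # Tiny illustrative run over three one-year age intervals
        print(cohort_excess_cases([0.01, 0.012, 0.015], [0.0005, 0.0006, 0.0008],
                                  exposure_wlm_per_year=4.0, exposure_years=2,
                                  risk_per_wlm=0.005))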

  2. The Radiological Safety Analysis Computer Program (RSAC-5) user's manual

    International Nuclear Information System (INIS)

    Wenzel, D.R.

    1994-02-01

    The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods

  3. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerfu...

  4. Numerical analysis of boosting scheme for scalable NMR quantum computation

    International Nuclear Information System (INIS)

    SaiToh, Akira; Kitagawa, Masahiro

    2005-01-01

    Among initialization schemes for ensemble quantum computation beginning at thermal equilibrium, the scheme proposed by Schulman and Vazirani [in Proceedings of the 31st ACM Symposium on Theory of Computing (STOC'99) (ACM Press, New York, 1999), pp. 322-329] is known for the simple quantum circuit to redistribute the biases (polarizations) of qubits and small time complexity. However, our numerical simulation shows that the number of qubits initialized by the scheme is rather smaller than expected from the von Neumann entropy because of an increase in the sum of the binary entropies of individual qubits, which indicates a growth in the total classical correlation. This result--namely, that there is such a significant growth in the total binary entropy--disagrees with that of their analysis

  5. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    Science.gov (United States)

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work is reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels and systems that employ external heat exchangers (double pipe, shell and tube and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system/operating parameters such as mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate to steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube have been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on the system performance has also been analysed. The multiparameter computer aided design (CAD) software packages prepared are thus highly versatile and permit the most appropriate choice of operating variables for the selected processes. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, thereby ascertaining the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed utilising the most updated design correlations and computer software.
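
    A minimal sketch of a del-factor calculation for a batch cycle is shown below; the Arrhenius constants and the temperature history are illustrative assumptions, not parameters from the paper:

        import math

        # del = ln(N0/N) = integral of k(T(t)) dt, with k(T) = A*exp(-E/(R*T)),
        # integrated by the trapezoidal rule over the heating/holding/cooling profile.
        def del_factor(times_s, temps_K, A=1.0e36, E=2.83e5, R=8.314):
            total = 0.0
            for i in range(1, len(times_s)):
                k0 = A * math.exp(-E / (R * temps_K[i - 1]))
                k1 = A * math.exp(-E / (R * temps_K[i]))
                total += 0.5 * (k0 + k1) * (times_s[i] - times_s[i - 1])
            return total

        # Example profile: ramp to 394 K, hold, then cool (times in seconds)
        t = [0, 600, 1200, 1800, 2400]
        T = [310.0, 360.0, 394.0, 394.0, 330.0]
        print(del_factor(t, T))   # ln(N0/N) delivered by this cycle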

  6. Analysis of pellet coating uniformity using a computer scanner.

    Science.gov (United States)

    Šibanc, Rok; Luštrik, Matevž; Dreu, Rok

    2017-11-30

    A fast method for pellet coating uniformity analysis, using a commercial computer scanner, was developed. The analysis of the individual particle coating thicknesses was based on using a transparent orange colored coating layer deposited on white pellet cores. Besides the analysis of the coating thickness, the information on pellet size and shape was obtained as well. Particle size dependent coating thickness and particle size independent coating variability were calculated by combining the information of coating thickness and pellet size. Decoupling coating thickness variation sources is unique to the presented method. For each coating experiment around 10000 pellets were analyzed, giving results with a high statistical confidence. The proposed method was employed for the performance evaluation of a classical Wurster and a swirl enhanced Wurster coater operated at different gap settings and air flow rates. Copyright © 2017 Elsevier B.V. All rights reserved.
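
    One way to decouple the two variability sources, sketched here under assumed data and a simple linear model rather than the authors' exact algorithm, is to regress coating thickness on pellet size and treat the residual scatter as the size-independent part:

        import numpy as np

        def decouple_variability(pellet_diam_um, coat_thickness_um):
            d = np.asarray(pellet_diam_um)
            h = np.asarray(coat_thickness_um)
            slope, intercept = np.polyfit(d, h, 1)        # size-dependent trend
            residuals = h - (slope * d + intercept)
            return {
                "total_cv": h.std() / h.mean(),
                "size_independent_cv": residuals.std() / h.mean(),
                "slope_um_per_um": slope,
            }

        # Synthetic example with ~10000 "pellets"
        rng = np.random.default_rng(1)
        d = rng.normal(800.0, 50.0, 10_000)
        h = 0.02 * d + rng.normal(0.0, 1.5, 10_000)
        print(decouple_variability(d, h))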

  7. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

    X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for analysis of both minor and major elements. As in many other analytical techniques, the problem of matrix effects, generally known as interelemental effects, must be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one of them. In this method the characteristic secondary X-ray intensities are measured from standard samples, and correction coefficients, if any, for interelemental effects are evaluated by mathematical calculations. This paper describes attempts to evaluate the correction coefficients for interelemental effects by multiple linear regression programmes using a computer for the quantitative analysis of stainless steel and a nickel base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the given certified values. (author)
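
    A hedged sketch of such an empirical correction fit (the model form and all numbers are assumptions, not the authors' calibration) regresses the measured analyte intensity on the analyte and interferent concentrations of the standards:

        import numpy as np

        # Columns: [C_analyte, C_interferent1, C_interferent2] for each standard (wt %)
        C = np.array([[18.0,  8.0, 2.0],
                      [20.0, 10.0, 1.5],
                      [16.5, 12.0, 2.5],
                      [19.0,  9.0, 3.0],
                      [17.0, 11.0, 1.0]])
        I_measured = np.array([1520.0, 1665.0, 1380.0, 1590.0, 1430.0])  # counts/s (synthetic)

        # Least-squares fit of I = a0 + a1*C_analyte + a2*C_int1 + a3*C_int2
        X = np.column_stack([np.ones(len(C)), C])
        coeffs, *_ = np.linalg.lstsq(X, I_measured, rcond=None)
        print(coeffs)   # a0..a3 are the empirical correction coefficients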

  8. Overview of adaptive finite element analysis in computational geodynamics

    Science.gov (United States)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  9. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide

    2002-03-01

    The SHEAT code developed at the Japan Atomic Energy Research Institute is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. At first, SHEAT was developed as a version for large computers. In addition, a personal computer version was provided to improve the operation efficiency and generality of this code in 2001. It is possible to perform the earthquake hazard analysis, display and print functions with the Graphical User Interface. With the SHEAT for PC code, seismic hazard, which is defined as an annual exceedance frequency of occurrence of earthquake ground motions at various levels of intensity at a given site, is calculated by the following two steps as is done with the large computer version. One is the modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on the historical earthquake records, active fault data and expert judgment. The other is the calculation of probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model taking into account its standard deviation. Then the seismic hazard at the site is calculated by summing the frequencies of ground motions by all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) an outline of the code, including the overall concept, logical process, code structure, data files used and special characteristics of the code, (2) functions of the subprograms and analytical models in them, (3) guidance on input and output data, (4) a sample run result, and (5) an operational manual. (author)
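
    The hazard-summation step can be sketched as below for a single illustrative source model; the attenuation relation, rates and magnitudes are made-up assumptions and not the SHEAT implementation:

        import math

        def exceedance_frequency(pga_level_g, events, site_dist_km, sigma_ln=0.6):
            """events: list of (annual_rate, magnitude) for postulated earthquakes."""
            total = 0.0
            for rate, mag in events:
                # Illustrative attenuation relation: ln(PGA[g]) = -3.5 + 0.9*M - 1.2*ln(R + 10)
                ln_median = -3.5 + 0.9 * mag - 1.2 * math.log(site_dist_km + 10.0)
                z = (math.log(pga_level_g) - ln_median) / sigma_ln
                p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))   # P[PGA > level], lognormal scatter
                total += rate * p_exceed
            return total

        events = [(0.05, 5.5), (0.01, 6.5), (0.002, 7.2)]        # annual rate, magnitude
        for level in (0.1, 0.2, 0.4):
            print(level, exceedance_frequency(level, events, site_dist_km=30.0))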

  10. Computational Fatigue Life Analysis of Carbon Fiber Laminate

    Science.gov (United States)

    Shastry, Shrimukhi G.; Chandrashekara, C. V., Dr.

    2018-02-01

    In the present scenario, many traditional materials are being replaced by composite materials for their light weight and high strength properties. Industries such as the automotive and aerospace industries are examples which use composite materials for many of their components. Replacing components which are subjected to static or impact loads is less challenging than replacing components which are subjected to dynamic loading. Replacing components with ones made of composite materials demands many stages of parametric study. One such parametric study is the fatigue analysis of the composite material. This paper focuses on the fatigue life analysis of the composite material by using computational techniques. A composite plate is considered for the study which has a hole at the center. The analysis is carried out on the (0°/90°/90°/90°/90°)s laminate sequence and the (45°/-45°)2s laminate sequence by using a computer script. The life cycles for both lay-up sequences are compared with each other. It is observed that, for the same material and geometry of the component, cross-ply laminates show a better fatigue life than angle-ply laminates.

  11. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  13. Computer content analysis of schizophrenic speech: a preliminary report.

    Science.gov (United States)

    Tucker, G J; Rosenberg, S D

    1975-06-01

    Computer analysis significantly differentiated the thematic content of the free speech of 10 schizophrenic patients from that of 10 nonschizophrenic patients and from the content of transcripts of dream material from 10 normal subjects. Schizophrenic patients used the thematic categories in factor 1 (the "schizophrenic factor") 3 times more frequently than the nonschizophrenics and 10 times more frequently than the normal subjects (p < .01). In general, the language content of the schizophrenic patient mirrored an almost agitated attempt to locate oneself in time and space and to defend against internal discomfort and confusion. The authors discuss the implications of this study for future research.

  14. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  15. Integrated computer codes for nuclear power plant severe accident analysis

    International Nuclear Information System (INIS)

    Jordanov, I.; Khristov, Y.

    1995-01-01

    This overview contains a description of the Modular Accident Analysis Program (MAAP), ICARE computer code and Source Term Code Package (STCP). STCP is used to model TMLB sample problems for Zion Unit 1 and WWER-440/V-213 reactors. Comparison is made of STCP implementation on VAX and IBM systems. In order to improve accuracy, a double precision version of MARCH-3 component of STCP is created and the overall thermal hydraulics is modelled. Results of modelling the containment pressure, debris temperature, hydrogen mass are presented. 5 refs., 10 figs., 2 tabs

  16. Integrated computer codes for nuclear power plant severe accident analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jordanov, I; Khristov, Y [Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. za Yadrena Izsledvaniya i Yadrena Energetika

    1996-12-31

    This overview contains a description of the Modular Accident Analysis Program (MAAP), ICARE computer code and Source Term Code Package (STCP). STCP is used to model TMLB sample problems for Zion Unit 1 and WWER-440/V-213 reactors. Comparison is made of STCP implementation on VAX and IBM systems. In order to improve accuracy, a double precision version of MARCH-3 component of STCP is created and the overall thermal hydraulics is modelled. Results of modelling the containment pressure, debris temperature, hydrogen mass are presented. 5 refs., 10 figs., 2 tabs.

  17. RADTRAN 5: A computer code for transportation risk analysis

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, F.L.

    1991-01-01

    RADTRAN 5 is a computer code developed at Sandia National Laboratories (SNL) in Albuquerque, NM, to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI Standard FORTRAN 77 and contains significant advances in the methodology for route-specific analysis first developed by SNL for RADTRAN 4 (Neuhauser and Kanipe, 1992). Like the previous RADTRAN codes, RADTRAN 5 contains two major modules for incident-free and accident risk analysis, respectively. All commercially important transportation modes may be analyzed with RADTRAN 5: highway by combination truck; highway by light-duty vehicle; rail; barge; ocean-going ship; cargo air; and passenger air.

  18. Computer Tomography Analysis of Fastrac Composite Thrust Chamber Assemblies

    Science.gov (United States)

    Beshears, Ronald D.

    2000-01-01

    Computed tomography (CT) inspection has been integrated into the production process for NASA's Fastrac composite thrust chamber assemblies (TCAs). CT has been proven to be uniquely qualified to detect the known critical flaw for these nozzles, liner cracks that are adjacent to debonds between the liner and overwrap. CT is also being used as a process monitoring tool through analysis of low density indications in the nozzle overwraps. 3D reconstruction of CT images to produce models of flawed areas is being used to give program engineers better insight into the location and nature of nozzle flaws.

  19. Development validation and use of computer codes for inelastic analysis

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    A finite element scheme is a system which provides routines to carry out the operations which are common to all finite element programs. The list of items that can be provided as standard by the finite element scheme is surprisingly large, and the list provided by the UNCLE finite element scheme is unusually comprehensive. This presentation covers the following: construction of the program, setting up a finite element mesh, generation of coordinates, and incorporating boundary and load conditions. Program validation was done by creep calculations performed using the CAUSE code. Program use is illustrated by calculating a typical inelastic analysis problem. This includes a computer model of the PFR intermediate heat exchanger.

  20. DYNAPO 4 - a fluid system and frames analysis computer program

    International Nuclear Information System (INIS)

    Lefter, J.D.; Ahdout, H.

    1982-01-01

    DYNAPO 4 is a user oriented specialized computer program, capable of analyzing three-dimensional linear elastic piping systems or frames for static loads, dynamic loads represented by acceleration response spectra, and transient dynamic loads represented by harmonic, second-order polynomial, and time history forcing functions. DYNAPO 4 has plotting capability, which plots the input configuration of the piping system or of the structure and also plots its deformed shape after the load is applied. DYNAPO 4 performs the analysis for ASME Section III Class 1, 2, and 3 piping, and provides the user with stress reports as per ASME and ANSI Code requirements. 3 refs

  1. Computer compensation for NMR quantitative analysis of trace components

    International Nuclear Information System (INIS)

    Nakayama, T.; Fujiwara, Y.

    1981-01-01

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as a theoretical curve of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA.
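
    A hedged sketch of fitting two overlapping Lorentzian lines plus a linear baseline by least squares is given below on synthetic data; it illustrates the idea rather than reproducing the original program:

        import numpy as np
        from scipy.optimize import curve_fit

        def two_lorentzians(x, A1, x1, w1, A2, x2, w2, b0, b1):
            L1 = A1 * w1**2 / ((x - x1)**2 + w1**2)
            L2 = A2 * w2**2 / ((x - x2)**2 + w2**2)
            return L1 + L2 + b0 + b1 * x          # two lines plus a linear baseline term

        x = np.linspace(-5.0, 5.0, 1000)
        rng = np.random.default_rng(2)
        y = two_lorentzians(x, 1.0, -0.5, 0.3, 0.05, 1.2, 0.25, 0.01, 0.0)
        y = y + rng.normal(0.0, 0.005, x.size)

        p0 = [0.8, -0.4, 0.4, 0.1, 1.0, 0.3, 0.0, 0.0]     # initial guesses
        popt, _ = curve_fit(two_lorentzians, x, y, p0=p0)
        print(popt[:3], popt[3:6])                          # major and trace-component parameters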

  2. A computer program for automatic gamma-ray spectra analysis

    International Nuclear Information System (INIS)

    Hiromura, Kazuyuki

    1975-01-01

    A computer program for automatic analysis of gamma-ray spectra obtained with a Ge(Li) detector is presented. The program includes a method of comparing successive values of the experimental data for automatic peak finding and the method of least squares for peak fitting. The peak shape in the fitting routine is a 'modified Gaussian', which consists of two different Gaussians with the same height joined at the centroid. A quadratic form is chosen as a function representing the background. A maximum of four peaks can be treated in the fitting routine by the program. Some possible improvements are also described. (auth.)
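
    The 'modified Gaussian' peak shape can be sketched as two Gaussians of equal height but different widths joined at the centroid, on top of a quadratic background; the parameterisation below is an assumption for illustration:

        import numpy as np

        def modified_gaussian(x, height, centroid, sigma_left, sigma_right):
            sigma = np.where(x < centroid, sigma_left, sigma_right)
            return height * np.exp(-0.5 * ((x - centroid) / sigma) ** 2)

        def peak_plus_background(x, height, centroid, s_left, s_right, a, b, c):
            return modified_gaussian(x, height, centroid, s_left, s_right) + a + b * x + c * x ** 2

        # Quick check of the shape on a channel axis
        x = np.arange(100.0)
        y = peak_plus_background(x, 500.0, 50.0, 2.0, 3.0, 10.0, 0.05, 0.0)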

  3. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  4. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating on their merits or weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  6. Cepstrum analysis and applications to computational fluid dynamic solutions

    Science.gov (United States)

    Meadows, Kristine R.

    1990-04-01

    A novel approach to the problem of spurious reflections introduced by artificial boundary conditions in computational fluid dynamic (CFD) solutions is proposed. Instead of attempting to derive non-reflecting boundary conditions, the approach is to accept the fact that spurious reflections occur, but to remove these reflections with cepstrum analysis, a signal processing technique which has been successfully used to remove echoes from experimental data. First, the theory of the cepstrum method is presented. This includes presentation of two types of cepstra: The Power Cepstrum and the Complex Cepstrum. The definitions of the cepstrum methods are applied theoretically and numerically to the analytical solution of sinusoidal plane wave propagation in a duct. One-D and 3-D time dependent solutions to the Euler equations are computed, and hard-wall conditions are prescribed at the numerical boundaries. The cepstrum method is applied, and the reflections from the boundaries are removed from the solutions. One-D and 3-D solutions are computed with so called nonreflecting boundary conditions, and these solutions are compared to those obtained by prescribing hard wall conditions and processing with the cepstrum.
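
    A generic power-cepstrum echo check is sketched below (a signal-processing illustration, not the paper's CFD post-processing); an echo at delay tau appears as a peak at quefrency tau:

        import numpy as np

        fs = 1000.0
        t = np.arange(0.0, 1.0, 1.0 / fs)
        signal = np.sin(2 * np.pi * 50 * t) * np.exp(-3 * t)
        echoed = signal + 0.5 * np.roll(signal, 200)        # echo delayed by 0.2 s

        spectrum = np.abs(np.fft.rfft(echoed)) ** 2
        power_cepstrum = np.abs(np.fft.irfft(np.log(spectrum + 1e-12))) ** 2
        quefrency = np.arange(power_cepstrum.size) / fs
        print(quefrency[np.argmax(power_cepstrum[50:400]) + 50])   # ~0.2 s echo delay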

  7. G-computation demonstration in causal mediation analysis

    International Nuclear Information System (INIS)

    Wang, Aolin; Arah, Onyebuchi A.

    2015-01-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings
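
    A toy version of the procedure, on simulated data and with assumed linear models rather than the authors' analysis, illustrates how nested potential outcomes yield natural direct and indirect effects:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 20_000
        A = rng.binomial(1, 0.5, n)                              # exposure
        M = 0.3 + 0.4 * A + rng.normal(0.0, 1.0, n)              # mediator
        Y = 1.0 + 0.5 * A + 0.8 * M + rng.normal(0.0, 1.0, n)    # outcome

        # "Fitted" models: ordinary least squares for the mediator and the outcome
        bM = np.linalg.lstsq(np.column_stack([np.ones(n), A]), M, rcond=None)[0]
        bY = np.linalg.lstsq(np.column_stack([np.ones(n), A, M]), Y, rcond=None)[0]

        def mean_potential_outcome(a, a_star, n_sim=100_000):
            # Draw the mediator under exposure a_star, then evaluate the outcome under exposure a
            m = bM[0] + bM[1] * a_star + rng.normal(0.0, 1.0, n_sim)
            y = bY[0] + bY[1] * a + bY[2] * m
            return y.mean()

        nde = mean_potential_outcome(1, 0) - mean_potential_outcome(0, 0)   # natural direct effect
        nie = mean_potential_outcome(1, 1) - mean_potential_outcome(1, 0)   # natural indirect effect
        print(nde, nie)   # roughly 0.5 and 0.32 for these simulated coefficients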

  8. System and software safety analysis for the ERA control computer

    International Nuclear Information System (INIS)

    Beerthuizen, P.G.; Kruidhof, W.

    2001-01-01

    The European Robotic Arm (ERA) is a seven degrees of freedom relocatable anthropomorphic robotic manipulator system, to be used in manned space operation on the International Space Station, supporting the assembly and external servicing of the Russian segment. The safety design concept and implementation of the ERA is described, in particular with respect to the central computer's software design. A top-down analysis and specification process is used to down flow the safety aspects of the ERA system towards the subsystems, which are produced by a consortium of companies in many countries. The user requirements documents and the critical function list are the key documents in this process. Bottom-up analysis (FMECA) and test, on both subsystem and system level, are the basis for safety verification. A number of examples show the use of the approach and methods used

  9. Intra-articular calcaneal fractures: Computed tomographic analysis

    International Nuclear Information System (INIS)

    Rosenberg, Z.S.; Feldman, F.; Singson, R.D.

    1987-01-01

    Computed tomography (CT) analysis of 21 intra-articular calcaneal fractures categorized according to the Essex-Lopresti classification revealed the following distribution: joint depression-type 57%, comminuted type 43%, tongue-type 0%. The posterior calcaneal facet was fractured and/or depressed in 100% of the cases while the medial facet was involved in only 25% of the cases. CT proved superior to plain films by consistently demonstrating additional fracture components within each major category suggesting subclassifications which have potential prognostic value. CT allowed more expeditious handling of acutely injured patients, and improved preoperative planning, postoperative follow-up, and detailed analysis of causes for chronic residual pain. CT further identified significant soft tissue injuries such as peroneal tendon displacement which cannot be delineated on plain films. (orig.)

  10. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2005-12-01

    The first stage of development of a program to compute probabilistic seismic hazard, based on a Graphic User Interface (GUI), is completed. The main program consists of three parts - the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The first part has been developed and the others are under development in this term. The probabilistic seismic hazard analysis needs various input data which represent attenuation formulae, a seismic zoning map, and an earthquake event catalog. The input procedure of previous programs, based on a text interface, takes much time to prepare the data, and in existing methods the data cannot be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize such artificial errors as far as possible.

  11. Computer-Aided Sustainable Process Synthesis-Design and Analysis

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan

    Process synthesis involves the investigation of chemical reactions needed to produce the desired product, selection of the separation techniques needed for downstream processing, as well as taking decisions on sequencing the involved separation operations. For an effective, efficient and flexible ... focuses on the development and application of a computer-aided framework for sustainable synthesis-design and analysis of process flowsheets by generating feasible alternatives covering the entire search space and includes analysis tools for sustainability, LCA and economics. The synthesis method is based ... -groups is that the performance of the entire process can be evaluated from the contributions of the individual process-groups towards the selected flowsheet property (for example, energy consumed). The developed flowsheet property models include energy consumption, carbon footprint, product recovery, product ...

  12. Computational Approaches for Integrative Analysis of the Metabolome and Microbiome

    Directory of Open Access Journals (Sweden)

    Jasmine Chong

    2017-11-01

    Full Text Available The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.

  13. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2006-12-01

    The development of a program to compute probabilistic seismic hazard, based on a Graphic User Interface (GUI), is completed. The main program consists of three parts - the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The probabilistic seismic hazard analysis needs various input data which represent attenuation formulae, a seismic zoning map, and an earthquake event catalog. The input procedure of previous programs, based on a text interface, takes much time to prepare the data, and in existing methods the data cannot be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize such artificial errors as far as possible.

  14. Development Of The Computer Code For Comparative Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Purwadi, Mohammad Dhandhang

    2001-01-01

    The qualitative and quantitative chemical analysis with Neutron Activation Analysis (NAA) is an important utilization of a nuclear research reactor, and its application and development should be accelerated and promoted to raise the utilization of the reactor. The application of the comparative NAA technique in the GA Siwabessy Multi Purpose Reactor (RSG-GAS) needs special software, not yet commercially available, for analyzing the spectra of multiple elements in the analysis at once. Previously, the analysis was carried out using a single-spectrum analyzer and comparing each result manually. This method significantly degrades the quality of the analysis. To solve the problem, a computer code was designed and developed for comparative NAA. Spectrum analysis in the code is carried out using a non-linear fitting method. Before the spectrum is analyzed, it is passed through a numerical filter which improves the signal-to-noise ratio for the deconvolution operation. The software was developed using the G language and named PASAN-K. The testing results of the developed software were benchmarked against the IAEA spectrum, and the software operated well with less than 10% deviation.

  15. Analysis of CERN computing infrastructure and monitoring data

    Science.gov (United States)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a large multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal to bring data sources from different services and on different abstraction levels together and to implement a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single service boundaries and the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats, selecting an efficient storage format for map reduce and external access, and will describe the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between CPU/wall fraction, latency/throughput constraints of network and disk and the effective job throughput. In this contribution we will first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.

  16. Computer-aided analysis of cutting processes for brittle materials

    Science.gov (United States)

    Ogorodnikov, A. I.; Tikhonov, I. N.

    2017-12-01

    This paper is focused on 3D computer simulation of cutting processes for brittle materials and silicon wafers. Computer-aided analysis of wafer scribing and dicing is carried out with the use of the ANSYS CAE (computer-aided engineering) software, and a parametric model of the processes is created by means of the internal ANSYS APDL programming language. Different types of tool tip geometry are analyzed to obtain internal stresses, such as a four-sided pyramid with an included angle of 120° and a tool inclination angle to the normal axis of 15°. The quality of the workpieces after cutting is studied by optical microscopy to verify the FE (finite-element) model. The disruption of the material structure during scribing occurs near the scratch and propagates into the wafer or over its surface at a short range. The deformation area along the scratch looks like a ragged band, but the stress width is rather low. The theory of cutting brittle semiconductor and optical materials is developed on the basis of the advanced theory of metal turning. The fall of stress intensity along the normal on the way from the tip point to the scribe line can be predicted using the developed theory and with the verified FE model. The crystal quality and dimensions of defects are determined by the mechanics of scratching, which depends on the shape of the diamond tip, the scratching direction, the velocity of the cutting tool and applied force loads. The disruption is a rate-sensitive process, and it depends on the cutting thickness. The application of numerical techniques, such as FE analysis, to cutting problems enhances understanding and promotes the further development of existing machining technologies.

  17. Compendium of computer codes for the safety analysis of LMFBR's

    International Nuclear Information System (INIS)

    1975-06-01

    A high level of mathematical sophistication is required in the safety analysis of LMFBR's to adequately meet the demands for realism and confidence in all areas of accident consequence evaluation. The numerical solution procedures associated with these analyses are generally so complex and time consuming as to necessitate their programming into computer codes. These computer codes have become extremely powerful tools for safety analysis, combining unique advantages in accuracy, speed and cost. The number, diversity and complexity of LMFBR safety codes in the U. S. has grown rapidly in recent years. It is estimated that over 100 such codes exist in various stages of development throughout the country. It is inevitable that such a large assortment of codes will require rigorous cataloguing and abstracting to aid individuals in identifying what is available. It is the purpose of this compendium to provide such a service through the compilation of code summaries which describe and clarify the status of domestic LMFBR safety codes. (U.S.)

  18. Computational design analysis for deployment of cardiovascular stents

    International Nuclear Information System (INIS)

    Tammareddi, Sriram; Sun Guangyong; Li Qing

    2010-01-01

    Cardiovascular disease has become a major global healthcare problem. As one of the relatively new medical devices, stents offer a minimally-invasive surgical strategy to improve the quality of life for numerous cardiovascular disease patients. One of the key associative issues has been to understand the effect of stent structures on its deployment behaviour. This paper aims to develop a computational model for exploring the biomechanical responses to the change in stent geometrical parameters, namely the strut thickness and cross-link width of the Palmaz-Schatz stent. Explicit 3D dynamic finite element analysis was carried out to explore the sensitivity of these geometrical parameters on deployment performance, such as dog-boning, fore-shortening, and stent deformation over the load cycle. It has been found that an increase in stent thickness causes a sizeable rise in the load required to deform the stent to its target diameter, whilst reducing maximum dog-boning in the stent. An increase in the cross-link width showed that no change in the load is required to deform the stent to its target diameter, and there is no apparent correlation with dog-boning but an increased fore-shortening with increasing cross-link width. The computational modelling and analysis presented herein proves an effective way to refine or optimise the design of stent structures.
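
    For reference, the deployment metrics named above are commonly defined as relative differences of diameters and lengths. The sketch below shows these common definitions with hypothetical values; it is not code from the paper.

    ```python
    # Common definitions of stent deployment metrics (illustrative values only).

    def dog_boning(d_end: float, d_central: float) -> float:
        """Relative over-expansion of the stent ends versus its centre."""
        return (d_end - d_central) / d_central

    def fore_shortening(length_initial: float, length_deployed: float) -> float:
        """Relative reduction in stent length after deployment."""
        return (length_initial - length_deployed) / length_initial

    # hypothetical dimensions in millimetres
    print(dog_boning(d_end=3.2, d_central=3.0))                         # ~0.067
    print(fore_shortening(length_initial=16.0, length_deployed=15.4))   # ~0.037
    ```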

  19. Computer enhanced release scenario analysis for a nuclear waste repository

    International Nuclear Information System (INIS)

    Stottlemyre, J.A.; Petrie, G.M.; Mullen, M.F.

    1979-01-01

    An interactive (user-oriented) computer tool is being developed at PNL to assist in the analysis of release scenarios for long-term safety assessment of a continental geologic nuclear waste repository. Emphasis is on characterizing the various ways the geologic and hydrologic system surrounding a repository might vary over the 10^6 to 10^7 years subsequent to final closure of the cavern. The potential disruptive phenomena are categorized as natural geologic and man-caused and tend to be synergistic in nature. The computer tool is designed to permit simulation of the system response as a function of the ongoing disruptive phenomena and time. It is designed to be operated in a deterministic manner, i.e., user selection of the desired scenarios and associated rate, magnitude, and lag time data; or in a stochastic mode. The stochastic mode involves establishing distributions for individual phenomena occurrence probabilities, rates, magnitudes, and phase relationships. A Monte-Carlo technique is then employed to generate a multitude of disruptive event scenarios, scan for breaches of the repository isolation, and develop input to the release consequence analysis task. To date, only a simplified one-dimensional version of the code has been completed. Significant modification and development is required to expand its dimensionality and apply the tool to any specific site
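
    The stochastic mode described above lends itself to a very small illustration. The sketch below is an assumption about the general approach, not the PNL code; phenomenon names, occurrence probabilities and magnitude ranges are hypothetical. It samples which disruptive phenomena occur over the assessment period and with what magnitude:

    ```python
    # Hypothetical stochastic-mode sketch: sample occurrence and magnitude of
    # disruptive phenomena over the assessment period (all rates are placeholders).
    import random

    PHENOMENA = {
        # name: (annual occurrence probability, (min, max) magnitude), all assumed
        "faulting":        (1e-7, (0.1, 10.0)),
        "glacial_erosion": (1e-6, (0.01, 1.0)),
        "human_intrusion": (1e-8, (1.0, 1.0)),
    }

    def sample_scenario(years: float = 1e6) -> dict:
        """Draw one scenario: which phenomena occur and how strongly."""
        scenario = {}
        for name, (p_annual, (lo, hi)) in PHENOMENA.items():
            p_period = 1.0 - (1.0 - p_annual) ** years  # probability over the whole period
            if random.random() < p_period:
                scenario[name] = random.uniform(lo, hi)
        return scenario

    hits = sum(bool(sample_scenario()) for _ in range(10_000))
    print(f"{hits} of 10000 sampled scenarios contain at least one disruptive event")
    ```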

  20. A compendium of computer codes in fault tree analysis

    International Nuclear Information System (INIS)

    Lydell, B.

    1981-03-01

    In the past ten years principles and methods for a unified system reliability and safety analysis have been developed. Fault tree techniques serve as a central feature of unified system analysis, and there exists a specific discipline within system reliability concerned with the theoretical aspects of fault tree evaluation. Ever since the fault tree concept was established, computer codes have been developed for qualitative and quantitative analyses. In particular the presentation of the kinetic tree theory and the PREP-KITT code package has influenced the present use of fault trees and the development of new computer codes. This report is a compilation of some of the better known fault tree codes in use in system reliability. Numerous codes are available and new codes are continuously being developed. The report is designed to address the specific characteristics of each code listed. A review of the theoretical aspects of fault tree evaluation is presented in an introductory chapter, the purpose of which is to give a framework for the validity of the different codes. (Auth.)
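
    As a toy illustration of the quantitative evaluation such codes perform (not taken from any specific code in the compendium; basic-event probabilities and cut sets are hypothetical), the top-event probability can be approximated by summing the probabilities of the minimal cut sets:

    ```python
    # Rare-event (first-order) approximation of the top-event probability
    # from minimal cut sets; all numbers are hypothetical.
    basic_event_prob = {"A": 1e-3, "B": 2e-3, "C": 5e-4}   # basic-event failure probabilities
    minimal_cut_sets = [{"A", "B"}, {"C"}]                  # assumed minimal cut sets

    def cut_set_probability(cut_set, probs):
        p = 1.0
        for event in cut_set:
            p *= probs[event]
        return p

    p_top = sum(cut_set_probability(cs, basic_event_prob) for cs in minimal_cut_sets)
    print(f"Top event probability ~ {p_top:.2e}")
    ```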

  1. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), which is given by the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of 99mTc-Sn-colloid. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis for liver functions. Information was obtained every quarter minute during a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow from the disappearance rate multiplied by the percentage of hepatic uptake, defined as (liver counts)/(total counts of the field). Our method of analysis automatically recorded the graphs of the disappearance curve and the uptake curve on the basis of the heart and the whole liver, respectively, and the computations were done in BASIC. This method makes it possible to obtain the image of the initial uptake of 99mTc-Sn-colloid into the liver with a small dose. (author)
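
    As a hedged illustration of the quantities defined above (synthetic curves, not the original BASIC program), the following sketch estimates the uptake fraction and the blood disappearance rate coefficient from heart and liver time-activity curves and multiplies them to obtain an effective hepatic blood flow index:

    ```python
    # Synthetic time-activity curves standing in for the hepatogram frames.
    import numpy as np

    t = np.arange(0.25, 15.25, 0.25)                # minutes, one frame per quarter minute
    heart = 1000.0 * np.exp(-0.35 * t)              # blood (heart) counts, assumed mono-exponential
    liver = 1500.0 * (1.0 - np.exp(-0.40 * t))      # liver uptake counts
    total_field = heart + liver + 500.0             # whole field of view, with assumed background

    uptake_fraction = liver[-1] / total_field[-1]   # (liver counts)/(total counts of the field)
    # disappearance rate coefficient from the initial slope of log(heart counts)
    k_disappearance = -np.polyfit(t[:20], np.log(heart[:20]), 1)[0]
    effective_hepatic_blood_flow = k_disappearance * uptake_fraction
    print(f"uptake fraction {uptake_fraction:.2f}, k {k_disappearance:.2f} /min, "
          f"index {effective_hepatic_blood_flow:.3f} /min")
    ```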

  2. Nuclear power reactor analysis, methods, algorithms and computer programs

    International Nuclear Information System (INIS)

    Matausek, M.V

    1981-01-01

    Full text: For a developing country buying its first nuclear power plants from a foreign supplier, disregarding the type and scope of the contract, there are a certain number of activities which have to be performed by local staff and domestic organizations. This particularly applies to the choice of the nuclear fuel cycle strategy and the choice of the type and size of the reactors, to bid parameter specification, bid evaluation and final safety analysis report evaluation, as well as to in-core fuel management activities. In the Nuclear Engineering Department of the Boris Kidric Institute of Nuclear Sciences (NET IBK) continual work is ongoing on the following topics: cross section and resonance integral calculations, spectrum calculations, generation of group constants, lattice and cell problems, criticality and global power distribution search, fuel burnup analysis, in-core fuel management procedures, cost analysis and power plant economics, safety and accident analysis, shielding problems and environmental impact studies, etc. The present paper gives the details of the methods developed and the results achieved, with particular emphasis on the NET IBK computer program package for the needs of planning, construction and operation of nuclear power plants. The main problems encountered so far were related to the small working team, the lack of large and powerful computers, the absence of reliable basic nuclear data and the shortage of experimental and empirical results for testing theoretical models. Some of these difficulties have been overcome thanks to bilateral and multilateral cooperation with developed countries, mostly through the IAEA. It is the authors' opinion, however, that mutual cooperation of developing countries, having similar problems and similar goals, could lead to significant results. Some activities of this kind are suggested and discussed. (author)

  3. Analysis of parallel computing performance of the code MCNP

    International Nuclear Information System (INIS)

    Wang Lei; Wang Kan; Yu Ganglin

    2006-01-01

    Parallel computing can reduce the running time of the code MCNP effectively. With the MPI message-passing software, MCNP5 can perform parallel computing on a PC cluster running the Windows operating system. The parallel computing performance of MCNP is influenced by factors such as the type, the complexity level and the parameter configuration of the computing problem. This paper analyzes the parallel computing performance of MCNP with regard to these factors and gives measures to improve the MCNP parallel computing performance. (authors)
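
    One simple way to reason about the factors mentioned above is Amdahl's law, which bounds the speedup by the fraction of work that cannot be parallelized. The sketch below is illustrative only and is not part of MCNP; the 5% serial fraction is an assumed value.

    ```python
    # Amdahl's law: speedup and parallel efficiency versus processor count.
    def amdahl_speedup(serial_fraction: float, n_processors: int) -> float:
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

    for n in (2, 4, 8, 16):
        s = amdahl_speedup(serial_fraction=0.05, n_processors=n)  # 5% serial part, assumed
        print(f"{n:2d} processors: speedup {s:4.1f}, efficiency {s / n:.2f}")
    ```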

  4. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow; Michael Schatz; William Kalies; Thomas Wanner

    2010-05-24

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields from complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure
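
    As a rough illustration of the kind of topological summary discussed above (an assumption about the general technique, not the project's code), the sketch below computes the Betti numbers of a small 2D binary pattern: beta0 counts connected components, and beta1 counts enclosed holes via the background components.

    ```python
    # Betti numbers of a 2D binary pattern via connected-component labelling.
    # beta0 = components of the pattern; beta1 = enclosed holes, counted as
    # background components minus the unbounded outside region. (For general
    # images the foreground/background connectivities should differ; this
    # simplification is adequate for the simple pattern used here.)
    import numpy as np
    from scipy import ndimage

    pattern = np.array([[0, 1, 1, 1, 0],
                        [0, 1, 0, 1, 0],
                        [0, 1, 1, 1, 0],
                        [0, 0, 0, 0, 0]], dtype=bool)

    beta0 = ndimage.label(pattern)[1]
    beta1 = ndimage.label(~pattern)[1] - 1
    print(f"beta0 = {beta0}, beta1 = {beta1}")   # expected: 1 component, 1 hole
    ```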

  5. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow, Rutgers University/Georgia Institute of Technology, Michael Schatz, Georgia Institute of Technology, William Kalies, Florida Atlantic University, Thomas Wanner,George Mason University

    2010-05-19

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields from complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure

  6. Markov analysis of different standby computer based systems

    International Nuclear Information System (INIS)

    Srinivas, G.; Guptan, Rajee; Mohan, Nalini; Ghadge, S.G.; Bajaj, S.S.

    2006-01-01

    As against the conventional triplicated systems of hardware and the generation of control signals for the actuator elements by means of redundant hardwired median circuits, employed in the early Indian PHWR's, a new approach of generating control signals based on software by a redundant system of computers is introduced in the advanced/current generation of Indian PHWR's. Reliability is increased by fault diagnostics and automatic switch over of all the loads to one computer in case of total failure of the other computer. Independent processing by a redundant CPU in each system enables inter-comparison to quickly identify system failure, in addition to the other self-diagnostic features provided. Combinatorial models such as reliability block diagrams and fault trees are frequently used to predict the reliability, maintainability and safety of complex systems. Unfortunately, these methods cannot accurately model dynamic system behavior; Because of its unique ability to handle dynamic cases, Markov analysis can be a powerful tool in the reliability maintainability and safety (RMS) analyses of dynamic systems. A Markov model breaks the system configuration into a number of states. Each of these states is connected to all other states by transition rates. It then utilizes transition matrices to evaluate the reliability and safety of the systems, either through matrix manipulation or other analytical solution methods, such as Laplace transforms. Thus, Markov analysis is a powerful reliability, maintainability and safety analysis tool. It allows the analyst to model complex, dynamic, highly distributed, fault tolerant systems that would otherwise be very difficult to model using classical techniques like the Fault tree method. The Dual Processor Hot Standby Process Control System (DPHS-PCS) and the Computerized Channel Temperature Monitoring System (CCTM) are typical examples of hot standby systems in the Indian PHWR's. While such systems currently in use in Indian PHWR
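
    A minimal sketch of the Markov approach described above, assuming a generic hot-standby pair rather than the actual DPHS-PCS or CCTM models, and with purely hypothetical failure and repair rates: the state probabilities are propagated with a matrix exponential of the transition-rate matrix.

    ```python
    # Three-state Markov model of a hot-standby computer pair:
    # state 0 = both up, 1 = one failed, 2 = both failed (system unavailable).
    import numpy as np
    from scipy.linalg import expm

    lam = 1e-4   # failure rate per hour of one computer (assumed)
    mu = 1e-1    # repair rate per hour (assumed)

    Q = np.array([[-2 * lam,      2 * lam,   0.0],
                  [      mu, -(mu + lam),    lam],
                  [     0.0,          mu,   -mu]])

    P0 = np.array([1.0, 0.0, 0.0])           # start with both computers healthy
    for t in (10.0, 100.0, 1000.0):           # hours
        Pt = P0 @ expm(Q * t)                 # P(t) = P(0) exp(Q t)
        print(f"t={t:6.0f} h  unavailability ~ {Pt[2]:.3e}")
    ```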

  7. Novel computational approaches for the analysis of cosmic magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Saveliev, Andrey [Universitaet Hamburg, Hamburg (Germany); Keldysh Institut, Moskau (Russian Federation)

    2016-07-01

    In order to give a consistent picture of cosmic, i.e. galactic and extragalactic, magnetic fields, different approaches are possible and often even necessary. Here we present three of them: First, a semianalytic analysis of the time evolution of primordial magnetic fields from which their properties and, subsequently, the nature of present-day intergalactic magnetic fields may be deduced. Second, the use of high-performance computing infrastructure by developing powerful algorithms for (magneto-)hydrodynamic simulations and applying them to astrophysical problems. We are currently developing a code which applies kinetic schemes in massive parallel computing on high performance multiprocessor systems in a new way to calculate both hydro- and electrodynamic quantities. Finally, as a third approach, astroparticle physics might be used as magnetic fields leave imprints of their properties on charged particles traversing them. Here we focus on electromagnetic cascades by developing software based on CRPropa which simulates the propagation of particles from such cascades through the intergalactic medium in three dimensions. This may in particular be used to obtain information about the helicity of extragalactic magnetic fields.

  8. Shell stability analysis in a computer aided engineering (CAE) environment

    Science.gov (United States)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde is described. The purpose of this project is to make the accumulated theoretical, numerical and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling sensitive structures. With this open ended, hierarchical, interactive computer code the user can access from his workstation successively programs of increasing complexity. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from behind the workstation with one of the current generation 2-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.

  9. Computational Analysis of the G-III Laminar Flow Glove

    Science.gov (United States)

    Malik, Mujeeb R.; Liao, Wei; Lee-Rausch, Elizabeth M.; Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan

    2011-01-01

    Under NASA's Environmentally Responsible Aviation Project, flight experiments are planned with the primary objective of demonstrating the Discrete Roughness Elements (DRE) technology for passive laminar flow control at chord Reynolds numbers relevant to transport aircraft. In this paper, we present a preliminary computational assessment of the Gulfstream-III (G-III) aircraft wing-glove designed to attain natural laminar flow for the leading-edge sweep angle of 34.6deg. Analysis for a flight Mach number of 0.75 shows that it should be possible to achieve natural laminar flow for twice the transition Reynolds number ever achieved at this sweep angle. However, the wing-glove needs to be redesigned to effectively demonstrate passive laminar flow control using DREs. As a by-product of the computational assessment, effect of surface curvature on stationary crossflow disturbances is found to be strongly stabilizing for the current design, and it is suggested that convex surface curvature could be used as a control parameter for natural laminar flow design, provided transition occurs via stationary crossflow disturbances.

  10. National survey on dose data analysis in computed tomography.

    Science.gov (United States)

    Heilmaier, Christina; Treier, Reto; Merkle, Elmar Max; Alkhadi, Hatem; Weishaupt, Dominik; Schindera, Sebastian

    2018-05-28

    A nationwide survey was performed assessing current practice of dose data analysis in computed tomography (CT). All radiological departments in Switzerland were asked to participate in the on-line survey composed of 19 questions (16 multiple choice, 3 free text). It consisted of four sections: (1) general information on the department, (2) dose data analysis, (3) use of a dose management software (DMS) and (4) radiation protection activities. In total, 152 out of 241 Swiss radiological departments filled in the whole questionnaire (return rate, 63%). Seventy-nine per cent of the departments (n = 120/152) analyse dose data on a regular basis with considerable heterogeneity in the frequency (1-2 times per year, 45%, n = 54/120; every month, 35%, n = 42/120) and method of analysis. Manual analysis is carried out by 58% (n = 70/120) compared with 42% (n = 50/120) of departments using a DMS. Purchase of a DMS is planned by 43% (n = 30/70) of the departments with manual analysis. Real-time analysis of dose data is performed by 42% (n = 21/50) of the departments with a DMS; however, residents can access the DMS in clinical routine only in 20% (n = 10/50) of the departments. An interdisciplinary dose team, which among other things communicates dose data internally (63%, n = 76/120) and externally, is already implemented in 57% (n = 68/120) of departments. Swiss radiological departments are committed to radiation safety. However, there is high heterogeneity among them regarding the frequency and method of dose data analysis as well as the use of DMS and radiation protection activities. • Swiss radiological departments are committed to and interested in radiation safety as proven by a 63% return rate of the survey. • Seventy-nine per cent of departments analyse dose data on a regular basis with differences in the frequency and method of analysis: 42% use a dose management software, while 58% currently perform manual dose data analysis. Of the latter, 43% plan to buy a dose

  11. Computer codes for the analysis of flask impact problems

    International Nuclear Information System (INIS)

    Neilson, A.J.

    1984-09-01

    This review identifies typical features of the design of transportation flasks and considers some of the analytical tools required for the analysis of impact events. Because of the complexity of the physical problem, it is unlikely that a single code will adequately deal with all the aspects of the impact incident. Candidate codes are identified on the basis of current understanding of their strengths and limitations. It is concluded that the HONDO-II, DYNA3D and ABAQUS codes, which are already mounted on UKAEA computers, will be suitable tools for use in the analysis of experiments conducted in the proposed AEEW programme and of general flask impact problems. Initial attention should be directed at the DYNA3D and ABAQUS codes with HONDO-II being reserved for situations where the three-dimensional elements of DYNA3D may provide uneconomic simulations in planar or axisymmetric geometries. Attention is drawn to the importance of access to suitable mesh generators to create the nodal coordinate and element topology data required by these structural analysis codes. (author)

  12. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    Science.gov (United States)

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high-performance and moderate environmental persistence, hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect of acetylcholinesterase (AChE) induced by inhibitors, including OPs and carbamates, a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that there was a gradually weakened trend of yellow intensity with the increase of the concentration of dichlorvos. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had a good predictive ability between training sets and predictive sets. Real cabbage samples containing dichlorvos were detected by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). The experiments of accuracy, precision and repeatability revealed good performance for detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications in real samples for OPs and carbamates because of high selectivity and sensitivity.
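
    A hedged sketch of the general pipeline described above (not the authors' code; the RGB-to-yellow conversion is a naive formula and the calibration data are synthetic): extract a CMYK-like yellow density from a colour patch and fit a small neural network mapping density to concentration.

    ```python
    # Naive colour-density extraction plus a small regression network; all data synthetic.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def yellow_density(rgb):
        """Naive RGB -> CMYK yellow channel (no colour profile, illustration only)."""
        r, g, b = (c / 255.0 for c in rgb)
        k = 1.0 - max(r, g, b)
        if k >= 1.0:
            return 0.0
        return (1.0 - b - k) / (1.0 - k)

    np.random.seed(0)
    # synthetic calibration set: yellow density weakens as concentration rises
    concentrations = np.linspace(0.0, 10.0, 20)
    densities = 0.8 * np.exp(-0.2 * concentrations) + np.random.normal(0, 0.01, 20)

    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(densities.reshape(-1, 1), concentrations)
    print(model.predict([[yellow_density((230, 215, 60))]]))  # estimate for one patch
    ```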

  13. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whitman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-01-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of ''twicing'' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of sixteen plasma traces has been processed using this technique
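
    Tukey's "twicing" mentioned above has a compact form: smooth the signal, then smooth the residuals and add them back. The sketch below applies it with a simple moving average to a synthetic centroid trace; it is an illustration of the idea, not the Los Alamos analysis code.

    ```python
    # Twicing: smooth(x) + smooth(x - smooth(x)), here with a moving average.
    import numpy as np

    def moving_average(x, window=5):
        kernel = np.ones(window) / window
        return np.convolve(x, kernel, mode="same")

    def twice_smooth(x, window=5):
        first = moving_average(x, window)
        residual = x - first
        return first + moving_average(residual, window)

    t = np.linspace(0, 1, 200)
    centroid = 10 * np.sin(2 * np.pi * t) + np.random.normal(0, 0.5, t.size)  # synthetic trace
    smoothed = twice_smooth(centroid)
    print(float(np.std(centroid - smoothed)))   # residual scatter after despiking
    ```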

  14. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whiteman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-11-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of ''twicing'' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of the plasma traces has been processed with this technique

  15. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    2017-07-01

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool to help power engineers improve their work efficiency by fast information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.

  16. Computer-aided target tracking in motion analysis studies

    Science.gov (United States)

    Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.

    1990-08-01

    Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame by frame basis to provide necessary data for motion reconstruction. Tracking is usually done using manual methods which are slow and prone to error. A computer based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven to be successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions which could be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.

  17. Data analysis using the Gnu R system for statistical computation

    Energy Technology Data Exchange (ETDEWEB)

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  18. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  19. Computer and Internet Addiction: Analysis and Classification of Approaches

    Directory of Open Access Journals (Sweden)

    Zaretskaya O.V.

    2017-08-01

    Full Text Available The theoretical analysis of modern research works on the problem of computer and Internet addiction is carried out. The main features of different approaches are outlined. The attempt is made to systematize researches conducted and to classify scientific approaches to the problem of Internet addiction. The author distinguishes nosological, cognitive-behavioral, socio-psychological and dialectical approaches. She justifies the need to use an approach that corresponds to the essence, goals and tasks of social psychology in the field of research as the problem of Internet addiction, and the dependent behavior in general. In the opinion of the author, this dialectical approach integrates the experience of research within the framework of the socio-psychological approach and focuses on the observed inconsistencies in the phenomenon of Internet addiction – the compensatory nature of Internet activity, when people who are interested in the Internet are in a dysfunctional life situation.

  20. Computational Fluid Dynamics Analysis of an Evaporative Cooling System

    Directory of Open Access Journals (Sweden)

    Kapilan N.

    2016-11-01

    Full Text Available The use of chlorofluorocarbon-based refrigerants in air-conditioning systems contributes to global warming and climate change. Climate change is expected to present a number of challenges for the built environment, and an evaporative cooling system is one of the simplest and most environmentally friendly cooling systems. The evaporative cooling system is most widely used in summer and in rural and urban areas of India for human comfort. In an evaporative cooling system, the addition of water into air reduces the temperature of the air, as the energy needed to evaporate the water is taken from the air. Computational fluid dynamics, a numerical analysis technique, was used to analyse the evaporative cooling system. The CFD results match the experimental results.
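
    The temperature drop described above is often summarized by the saturation effectiveness of the cooler. The relation below is a textbook approximation, not the paper's CFD model; the 80% effectiveness and the inlet conditions are assumed values.

    ```python
    # Direct evaporative cooler outlet temperature: T_out = T_in - eff * (T_in - T_wb).
    def evaporative_outlet_temp(t_dry_bulb: float, t_wet_bulb: float,
                                effectiveness: float = 0.8) -> float:
        return t_dry_bulb - effectiveness * (t_dry_bulb - t_wet_bulb)

    print(evaporative_outlet_temp(t_dry_bulb=38.0, t_wet_bulb=24.0))  # ~26.8 degC for an 80% effective pad
    ```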

  1. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations like simplified query languages, out-of-date information or arbitrary results sorting. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web pages content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of the Web pages content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described and test results are provided.

  2. TEABAGS: computer programs for instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lindstrom, D J [Washington Univ., St. Louis, MO (USA); Korotev, R L [Washington Univ., St. Louis, MO (USA). McDonnell Center for the Space Sciences

    1982-01-01

    Described is a series of INAA data reduction programs collectively known as TEABAGS (Trace Element Analysis By Automated Gamma-ray Spectrometry). The programs are written in FORTRAN and run on a Nuclear Data ND-6620 computer system, but should be adaptable to any medium-sized minicomputer. They are designed to monitor the status of all spectra obtained from samples and comparison standards irradiated together and to do all pending calculations without operator intervention. Major emphasis is placed on finding all peaks in the spectrum, properly identifying all nuclides present and all contributors to each peak, determining accurate estimates of the background continua under peaks, and producing realistic uncertainties on peak areas and final abundances.
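
    TEABAGS itself is FORTRAN, so the following Python sketch is only an illustration of the peak-search and net-area step described above, using a synthetic spectrum and a simple linear background estimate:

    ```python
    # Locate photopeaks in a synthetic gamma-ray spectrum and estimate net areas
    # above a linear background (illustration only, not the TEABAGS algorithm).
    import numpy as np
    from scipy.signal import find_peaks

    np.random.seed(1)
    channels = np.arange(2048)
    spectrum = 50 * np.exp(-channels / 800.0)                         # smooth continuum
    for centroid, area in ((312, 4000), (846, 2500), (1332, 1800)):   # hypothetical peaks
        spectrum += area / (6 * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((channels - centroid) / 6) ** 2)
    spectrum = np.random.poisson(spectrum)

    peaks, _ = find_peaks(spectrum, prominence=30, width=3)
    for p in peaks:
        lo, hi = p - 15, p + 16
        background = np.linspace(spectrum[lo], spectrum[hi - 1], hi - lo)  # linear background
        net_area = float((spectrum[lo:hi] - background).sum())
        print(f"peak at channel {p}: net area ~ {net_area:.0f} counts")
    ```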

  3. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Childs, R.L.; Rearden, B.T.

    1999-01-01

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community

  4. Analysis of 3D crack propagation by microfocus computed tomography

    International Nuclear Information System (INIS)

    Ao Bo; Chen Fuxing; Deng Cuizhen; Zeng Yabin

    2014-01-01

    The three-point bending test of notched specimens of 2A50 forging aluminum was performed with a high-frequency fatigue tester, and the surface cracks at different stages were analyzed and contrasted by SEM. The crack was reconstructed by microfocus computed tomography, and its size, position and distribution were visually displayed through 3D visualization. The crack propagation behavior was studied through the gray value and position of the crack front in 2D CT images at two adjacent stages, and the results show that crack propagation is irregular. Projection images were obtained by projecting the cracks of two successive stages onto a reference plane; compared with the earlier projection, a significant increase in new crack propagation was observed, and the distribution curves of the crack front at the two stages were displayed. The 3D increment distribution of the crack front propagation was obtained through the 3D crack analysis of the two stages. (authors)

  5. Satellite interference analysis and simulation using personal computers

    Science.gov (United States)

    Kantak, Anil

    1988-03-01

    This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering satellite, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles generally related to the land mask of the receiving station site for both satellites. Formulas for considering Doppler effect due to the satellite motion as well as the Earth's rotation are developed. The effect of the interfering-satellite signal modulation and the Doppler effect on the power received are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference to the desired signal power ratio. Finally, a computer program suitable for microcomputers such as IBM AT is provided with the flowchart, a sample run, results of the run, and the program code.
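
    As a toy example of one ingredient of the analysis above (not the report's full formulation), the first-order Doppler shift of a downlink carrier follows directly from the range-rate between satellite and ground station; the carrier frequency and range-rate below are assumed values.

    ```python
    # First-order Doppler shift of a carrier from the satellite range-rate.
    C = 299_792_458.0  # speed of light, m/s

    def doppler_shift(carrier_hz: float, range_rate_mps: float) -> float:
        """Positive range_rate (satellite moving away) gives a negative shift."""
        return -carrier_hz * range_rate_mps / C

    print(doppler_shift(carrier_hz=12e9, range_rate_mps=3000.0))  # ~ -120 kHz
    ```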

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  7. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Science.gov (United States)

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reduction in size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and in information for ≥ 30 frames and provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible period of time

  8. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    Science.gov (United States)

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  9. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Florea, S.; Pavelescu, M.

    2004-01-01

    Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subject to hard conditions in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in hard conditions, in which, besides corrosive and thermal aggression, long-term irradiation interferes, with implicit consequences for the evolution of material properties. This inevitably leads to scatter in the time-dependent material properties, whose dynamic evolution is subject to a great degree of uncertainty. These are the reasons for developing, in association with deterministic evaluations with computer codes, probabilistic and statistical methods in order to predict the structural component response. This work opens the possibility of extending the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting with the deterministic analysis performed with the CANTUP computer code, which was developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. For this purpose the structure of the deterministic CANTUP computer code has been reviewed. The code has been adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station platform. In order to perform probabilistic evaluations, a module was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran Power Station platform, generates pseudo-random values of a specified quantity. A normal distribution around the deterministic value, with a 5% standard deviation, was simulated for the Young's modulus material property in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subject to probabilistic evaluation. All the values of these properties obtained for all the values for
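
    In outline, the probabilistic extension described above amounts to sampling the uncertain material property and re-running the deterministic response. The sketch below is an assumption about that outline, not the CANTUP source: it samples Young's modulus from a normal distribution with a 5% standard deviation and propagates it through a placeholder response function.

    ```python
    # Monte Carlo propagation of an uncertain Young's modulus through a simple
    # stand-in for the deterministic structural response (all values hypothetical).
    import numpy as np

    rng = np.random.default_rng(0)
    E_nominal = 200e9                            # Pa, assumed deterministic Young's modulus
    load, length, inertia = 1.0e3, 6.0, 5.0e-6   # assumed beam-like parameters

    def deflection(E):
        """Placeholder for the deterministic structural response."""
        return load * length**3 / (48.0 * E * inertia)

    samples = rng.normal(E_nominal, 0.05 * E_nominal, size=10_000)
    deflections = deflection(samples)
    print(f"mean deflection {deflections.mean():.3e} m, std {deflections.std():.3e} m")
    ```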

  10. CO-Bridged H-Cluster Intermediates in the Catalytic Mechanism of [FeFe]-Hydrogenase CaI.

    Science.gov (United States)

    Ratzloff, Michael W; Artz, Jacob H; Mulder, David W; Collins, Reuben T; Furtak, Thomas E; King, Paul W

    2018-06-20

    The [FeFe]-hydrogenases ([FeFe] H2ases) catalyze reversible H2 activation at the H-cluster, which is composed of a [4Fe-4S]H subsite linked by a cysteine thiolate to a bridged, organometallic [2Fe-2S] ([2Fe]H) subsite. Profoundly different geometric models of the H-cluster redox states that orchestrate the electron/proton transfer steps of H2 bond activation have been proposed. We have examined this question in the [FeFe] H2ase I from Clostridium acetobutylicum (CaI) by Fourier-transform infrared (FTIR) spectroscopy with temperature annealing and H/D isotope exchange to identify the relevant redox states and define catalytic transitions. One-electron reduction of Hox led to formation of HredH+ ([4Fe-4S]H2+-FeI-FeI) and Hred' ([4Fe-4S]H1+-FeII-FeI), with both states characterized by low-frequency μ-CO IR modes consistent with a fully bridged [2Fe]H. Similar μ-CO IR modes were also identified for HredH+ of the [FeFe] H2ase from Chlamydomonas reinhardtii (CrHydA1). The CaI proton-transfer variant C298S showed enrichment of an H/D isotope-sensitive μ-CO mode, a component of the hydride-bound H-cluster IR signal, Hhyd. Equilibrating CaI with increasing amounts of NaDT, probed at cryogenic temperatures, showed that HredH+ was converted to Hhyd. Over an increasing temperature range from 10 to 260 K, catalytic turnover led to loss of Hhyd and appearance of Hox, consistent with enzymatic turnover and H2 formation. The results show for CaI that the μ-CO of [2Fe]H remains bridging for all of the "Hred" states and that HredH+ is on the pathway to Hhyd and H2 evolution in the catalytic mechanism. These results provide a blueprint for designing small molecule catalytic analogs.

  11. Summary of research in applied mathematics, numerical analysis, and computer sciences

    Science.gov (United States)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  12. CaI and SrI molecules for iodine determination by high-resolution continuum source graphite furnace molecular absorption spectrometry: Greener molecules for practical application.

    Science.gov (United States)

    Zanatta, Melina Borges Teixeira; Nakadi, Flávio Venâncio; da Veiga, Márcia Andreia Mesquita Silva

    2018-03-01

    A new method to determine iodine in drug samples by high-resolution continuum source graphite furnace molecular absorption spectrometry (HR-CS GF MAS) has been developed. The method measures the molecular absorption of a diatomic molecule, CaI or SrI (less toxic molecule-forming reagents), at 638.904 or 677.692 nm, respectively, and uses a mixture containing 5 μg of Pd and 0.5 μg of Mg as chemical modifier. The method employs pyrolysis temperatures of 1000 and 800°C and vaporization temperatures of 2300 and 2400°C for CaI and SrI, respectively. The optimized amounts of Ca and Sr as molecule-forming reagents are 100 and 150 µg, respectively. On the basis of interference studies, even small chlorine concentrations reduce CaI and SrI absorbance significantly. The developed method was used to analyze different commercial drug samples, namely thyroid hormone pills with three different iodine amounts (15.88, 31.77, and 47.66 µg) and one liquid drug with 1% m v⁻¹ active iodine in their compositions. The results agreed with the values reported by the manufacturers (95% confidence level) regardless of whether CaI or SrI was determined. Therefore, the developed method is useful for iodine determination on the basis of CaI or SrI molecular absorption. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Computer assisted analysis of medical x-ray images

    Science.gov (United States)

    Bengtsson, Ewert

    1996-01-01

    X-rays were originally used to expose film. The early computers did not have enough capacity to handle images with useful resolution. The rapid development of computer technology over the last few decades has, however, led to the introduction of computers into radiology. In this overview paper, the various possible roles of computers in radiology are examined. The state of the art is briefly presented, and some predictions about the future are made.

  14. Integrated severe accident containment analysis with the CONTAIN computer code

    International Nuclear Information System (INIS)

    Bergeron, K.D.; Williams, D.C.; Rexroth, P.E.; Tills, J.L.

    1985-12-01

    Analysis of physical and radiological conditions inside the containment building during a severe (core-melt) nuclear reactor accident requires quantitative evaluation of numerous highly disparate yet coupled phenomenologies. These include two-phase thermodynamics and thermal-hydraulics, aerosol physics, fission product phenomena, core-concrete interactions, the formation and combustion of flammable gases, and performance of engineered safety features. In the past, this complexity has meant that a complete containment analysis would require application of suites of separate computer codes, each of which would treat only a narrower subset of these phenomena, e.g., a thermal-hydraulics code, an aerosol code, a core-concrete interaction code, etc. In this paper, we describe the development and some recent applications of the CONTAIN code, which offers an integrated treatment of the dominant containment phenomena and the interactions among them. We describe the results of a series of containment phenomenology studies, based upon realistic accident sequence analyses in actual plants. These calculations highlight various phenomenological effects that have potentially important implications for source term and/or containment loading issues, and which are difficult or impossible to treat using a less integrated code suite

  15. Reliability of Computer Analysis of Electrocardiograms (ECG) of ...

    African Journals Online (AJOL)

    Background: Computer programmes have been introduced to electrocardiography (ECG) with most physicians in Africa depending on computer interpretation of ECG. This study was undertaken to evaluate the reliability of computer interpretation of the 12-Lead ECG in the Black race. Methodology: Using the SCHILLER ...

  16. RADTRAN 5 - A computer code for transportation risk analysis

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, F.L.

    1993-01-01

    The RADTRAN 5 computer code has been developed to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI standard FORTRAN 77; the code contains significant advances in the methodology first pioneered with the LINK option of RADTRAN 4. A major application of the LINK methodology is route-specific analysis. Another application is comparisons of attributes along the same route segments. Nonradiological risk factors have been incorporated to allow users to estimate nonradiological fatalities and injuries that might occur during the transportation event(s) being analyzed. These fatalities include prompt accidental fatalities from mechanical causes. Values of these risk factors for the United States have been made available in the code as optional defaults. Several new health effects models have been published in the wake of the Hiroshima-Nagasaki dosimetry reassessment, and this has emphasized the need for flexibility in the RADTRAN approach to health-effects calculations. Therefore, the basic set of health-effects conversion equations in RADTRAN have been made user-definable. All parameter values can be changed by the user, but a complete set of default values are available for both the new International Commission on Radiation Protection model (ICRP Publication 60) and the recent model of the U.S. National Research Council's Committee on the Biological Effects of Radiation (BEIR V). The meteorological input data tables have been modified to permit optional entry of maximum downwind distances for each dose isopleth. The expected dose to an individual in each isodose area is also calculated and printed automatically. Examples are given that illustrate the power and flexibility of the RADTRAN 5 computer code. (J.P.N.)

  17. Genome Assembly and Computational Analysis Pipelines for Bacterial Pathogens

    KAUST Repository

    Rangkuti, Farania Gama Ardhina

    2011-06-01

    Pathogens lie behind the deadliest pandemics in history. To date, the AIDS pandemic has resulted in more than 25 million fatal cases, while tuberculosis and malaria annually claim more than 2 million lives. Comparative genomic analyses are needed to gain insights into the molecular mechanisms of pathogens, but the abundance of biological data dictates that such studies cannot be performed without the assistance of computational approaches. This explains the significant need for computational pipelines for genome assembly and analyses. The aim of this research is to develop such pipelines. This work utilizes various bioinformatics approaches to analyze the high-throughput genomic sequence data that has been obtained from several strains of bacterial pathogens. A pipeline has been compiled for quality control for sequencing and assembly, and several protocols have been developed to detect contaminations. Visualizations of genomic data have been generated in various formats, in addition to alignment, homology detection and sequence variant detection. We have also implemented a metaheuristic algorithm that significantly improves bacterial genome assemblies compared to other known methods. Experiments on Mycobacterium tuberculosis H37Rv data showed that our method resulted in improvement of N50 value of up to 9697% while consistently maintaining high accuracy, covering around 98% of the published reference genome. Other improvement efforts were also implemented, consisting of iterative local assemblies and iterative correction of contiguated bases. Our result expedites the genomic analysis of virulent genes up to single base pair resolution. It is also applicable to virtually every pathogenic microorganism, propelling further research in the control of and protection from pathogen-associated diseases.

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  19. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  20. Basic design of parallel computational program for probabilistic structural analysis

    International Nuclear Information System (INIS)

    Kaji, Yoshiyuki; Arai, Taketoshi; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of the 'development of a damage evaluation method for structural brittle materials by microscopic fracture mechanics and probabilistic theory' project (nuclear computational science cross-over research), we examine computational methods for a super parallel computation system coupled with a material strength theory, based on microscopic fracture mechanics for latent cracks, and a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, the basic terms of the formulation, and the parallel programming methods that are the principal elements in the basic design of the computational mechanics program. (author)

  1. Basic design of parallel computational program for probabilistic structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kaji, Yoshiyuki; Arai, Taketoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of the 'development of a damage evaluation method for structural brittle materials by microscopic fracture mechanics and probabilistic theory' project (nuclear computational science cross-over research), we examine computational methods for a super parallel computation system coupled with a material strength theory, based on microscopic fracture mechanics for latent cracks, and a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, the basic terms of the formulation, and the parallel programming methods that are the principal elements in the basic design of the computational mechanics program. (author)

  2. COMTA - a computer code for fuel mechanical and thermal analysis

    International Nuclear Information System (INIS)

    Basu, S.; Sawhney, S.S.; Anand, A.K.; Anantharaman, K.; Mehta, S.K.

    1979-01-01

    COMTA is a generalized computer code for integrity analysis of free-standing fuel cladding with natural UO2 or mixed oxide fuel pellets. Thermal and mechanical analyses are done simultaneously for any power history of the fuel pin. For the analysis, the fuel cladding is assumed to be axisymmetric and is subjected to axisymmetric loads due to contact pressure, gas pressure, coolant pressure and thermal loads. Axial variation of load is neglected, and creep and plasticity are assumed to occur at constant volume. The pellet is assumed to be made of concentric annuli. The fission gas release integral is dependent on the temperature and the power produced in each annulus. To calculate the temperature distribution in the fuel pin, the variation of bulk coolant temperature is given as an input to the code. Gap conductance is calculated at every time step, considering fuel densification, fuel relocation and gap closure, filler gas dilution by released fission gas, gap closure by expansion, and irradiation swelling. Overall gap conductance comprises heat transfer by the three modes of conduction, convection and radiation, as per the modified Ross and Stoute model. Equilibrium equations, compatibility equations, and stress-strain relationships (including thermal strains and permanent strains due to creep and plasticity) are used to obtain triaxial stresses and strains. Thermal strain is assumed to be zero at hot zero power conditions. The boundary conditions for radial stresses at the outside and inside surfaces are obtained by setting them equal to the coolant pressure and internal pressure, respectively. A multi-mechanism creep model which accounts for thermal and irradiation creep is used to calculate the overall creep rate. Effective plastic strain is a function of effective stress and material constants. (orig.)

  3. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Science.gov (United States)

    Bonham, Kevin S; Stefan, Melanie I

    2017-10-01

    While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.

  4. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Directory of Open Access Journals (Sweden)

    Kevin S Bonham

    2017-10-01

    Full Text Available While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.

  5. Trident: scalable compute archives: workflows, visualization, and analysis

    Science.gov (United States)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Kotulla, Ralf; Henschel, Robert; Harbeck, Daniel

    2016-08-01

    The Astronomy scientific community has embraced Big Data processing challenges, e.g. associated with time-domain astronomy, and come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era requires new and equally efficient tools: modern user interfaces for searching, identifying and viewing data online without direct access to the data; tracking of data provenance; searching, plotting and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal user overhead or expertise even for novice computing users. The Trident project at Indiana University offers a comprehensive web and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic about handling/visualizing their domain's data products and about executing their pipelines and application workflows. Trident's microservices architecture is made up of light-weight services connected by a REST API and/or a message bus; web interface elements are built using the NodeJS, AngularJS, and HighCharts JavaScript libraries, among others, while backend services are written in NodeJS, PHP/Zend, and Python. The software suite currently consists of (1) a simple workflow execution framework to integrate, deploy, and execute pipelines and applications, (2) a progress service to monitor workflows and sub

  6. Computer analysis and comparison of chess players' game-playing styles

    OpenAIRE

    Krevs, Urša

    2015-01-01

    Today's computer chess programs are very good at evaluating chess positions. Research has shown that we can rank chess players by the quality of their game play, using a computer chess program. In the master's thesis Computer analysis and comparison of chess players' game-playing styles, we focus on the content analysis of chess games using a computer chess program's evaluation and attributes we determined for each individual position. We defined meaningful attributes that can be used for com...

  7. Performance analysis of cloud computing services for many-tasks scientific computing

    NARCIS (Netherlands)

    Iosup, A.; Ostermann, S.; Yigitbasi, M.N.; Prodan, R.; Fahringer, T.; Epema, D.H.J.

    2011-01-01

    Cloud computing is an emerging commercial infrastructure paradigm that promises to eliminate the need for maintaining expensive computing facilities by companies and institutes alike. Through the use of virtualization and resource time sharing, clouds serve with a single set of physical resources a

  8. A performance analysis of EC2 cloud computing services for scientific computing

    NARCIS (Netherlands)

    Ostermann, S.; Iosup, A.; Yigitbasi, M.N.; Prodan, R.; Fahringer, T.; Epema, D.H.J.; Avresky, D.; Diaz, M.; Bode, A.; Bruno, C.; Dekel, E.

    2010-01-01

    Cloud Computing is emerging today as a commercial infrastructure that eliminates the need for maintaining expensive computing hardware. Through the use of virtualization, clouds promise to address with the same shared set of physical resources a large user base with different needs. Thus, clouds

  9. Optimizing Computer Assisted Instruction By Applying Principles of Learning Theory.

    Science.gov (United States)

    Edwards, Thomas O.

    The development of learning theory and its application to computer-assisted instruction (CAI) are described. Among the early theoretical constructs thought to be important are E. L. Thorndike's concept of connectionism in learning, Neal Miller's theory of motivation, and B. F. Skinner's theory of operant conditioning. Early devices incorporating those…

  10. Gold-standard for computer-assisted morphological sperm analysis.

    Science.gov (United States)

    Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen

    2017-04-01

    Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, and thus no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments comparing common sperm head description and classification techniques. This classification baseline is intended as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, which is achieved by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' Kappa Coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in the morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is best suited for tackling the problem of sperm head classification. We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm
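
    As an illustration of the kind of classification baseline described above, the sketch below compares the four cited classifiers on a matrix of precomputed shape descriptors. The feature matrix, the labels, and the use of scikit-learn are assumptions made for demonstration; this does not reproduce the SCIAN-MorphoSpermGS experiments.

```python
# Sketch of a classification baseline: four standard classifiers compared on precomputed
# shape descriptors (e.g., Hu moments). X and y are placeholders standing in for features
# extracted from sperm-head images and their expert labels.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 7))          # placeholder: 7 Hu-moment features per head
y = rng.integers(0, 5, size=200)       # placeholder: normal/tapered/pyriform/small/amorphous

classifiers = {
    "1-NN": KNeighborsClassifier(n_neighbors=1),
    "naive Bayes": GaussianNB(),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "SVM": SVC(kernel="rbf"),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean accuracy = {acc:.3f}")
```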

  11. Computational analysis on plug-in hybrid electric motorcycle chassis

    Science.gov (United States)

    Teoh, S. J.; Bakar, R. A.; Gan, L. M.

    2013-12-01

    The plug-in hybrid electric motorcycle (PHEM) is an alternative that promotes sustainability and lower emissions. However, the PHEM overall system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis applying the concept of a Chopper is analysed for application in a PHEM. The chassis three-dimensional (3D) model is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. Besides that, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement and factor of safety (FOS). The data are used to identify the critical points, indicating whether the chassis design is acceptable or needs to be redesigned/modified to meet the required strength. Critical points are locations of highest stress that might cause the chassis to fail; for a motorcycle chassis these occur at the joints at the triple tree and the rear absorber bracket. In conclusion, the computational analysis predicts the stress distribution and provides a guideline for developing a safe prototype chassis.

  12. Automatic analysis of gamma spectra using a desk computer

    International Nuclear Information System (INIS)

    Rocca, H.C.

    1976-10-01

    A code for the analysis of gamma spectra obtained with a Ge(Li) detector was developed for use with a desk computer (Hewlett-Packard Model 9810 A). The process is performed in a totally automatic way: data are conveniently smoothed and the background is generated by a convolutive equation. A calibration of the equipment with well-known standard sources gives the data necessary for fitting a third-degree equation by least squares, relating the energy to the peak position. Criteria are given for determining whether certain groups of values constitute a peak and whether it is a double line. All peaks are fitted to a Gaussian curve and, if necessary, decomposed into their components. Data entry is by punched tape, ASCII code. An alphanumeric printer provides (a) the position of the peak and its energy, (b) its resolution if it is larger than expected, (c) the area of the peak with its statistical error determined by the method of Wasson. Optionally, the complete spectrum with the determined background can be plotted. (author) [es
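
    The two numerical steps described above, a least-squares energy calibration and a Gaussian peak fit, can be sketched with modern scientific Python as follows. The channel/energy pairs, the peak window, and the noise level are illustrative assumptions, not data from the original HP 9810 A implementation.

```python
# Hedged sketch of (1) a third-degree channel-to-energy calibration fitted by least squares and
# (2) a Gaussian fit to a background-subtracted peak. All numbers are illustrative.
import numpy as np
from scipy.optimize import curve_fit

# (1) energy calibration from standard sources: channel -> energy (keV); values are made up
channels = np.array([244.0, 661.0, 1173.0, 1332.0])
energies = np.array([121.8, 344.3, 604.7, 795.9])
calib = np.polyfit(channels, energies, deg=3)

def gaussian(x, area, centroid, sigma):
    return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x - centroid) / sigma) ** 2)

# (2) fit one peak in a small channel window (counts assumed background-subtracted)
x = np.arange(650, 675, dtype=float)
counts = gaussian(x, 5000.0, 661.0, 2.0) + np.random.default_rng(1).normal(0, 5, x.size)
(popt, pcov) = curve_fit(gaussian, x, counts, p0=(4000.0, 660.0, 3.0))
area, centroid, sigma = popt
print("peak energy (keV):", np.polyval(calib, centroid), " area:", area)
```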

  13. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
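
    As a minimal illustration of the probabilistic approach contrasted with safety factors above, the sketch below estimates the failure probability of a single limit state by Monte Carlo sampling. The capacity and demand distributions are illustrative assumptions, not results from the cited methods.

```python
# Estimate P(failure) for a single limit state g = capacity - demand by Monte Carlo sampling.
# Distribution shapes and parameter values are assumptions chosen only for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
capacity = rng.normal(loc=320.0, scale=25.0, size=n)                # e.g., yield strength (MPa)
demand = rng.lognormal(mean=np.log(220.0), sigma=0.15, size=n)      # e.g., applied stress (MPa)

failures = np.count_nonzero(capacity - demand < 0.0)
pf = failures / n
print(f"estimated failure probability: {pf:.2e}")
```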

  14. Methods and computer codes for probabilistic sensitivity and uncertainty analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1985-01-01

    This paper describes the methods and applications experience with two computer codes that are now available from the National Energy Software Center at Argonne National Laboratory. The purpose of the SCREEN code is to identify a group of most important input variables of a code that has many (tens, hundreds) input variables with uncertainties, and do this without relying on judgment or exhaustive sensitivity studies. Purpose of the PROSA-2 code is to propagate uncertainties and calculate the distributions of interesting output variable(s) of a safety analysis code using response surface techniques, based on the same runs used for screening. Several applications are discussed, but the codes are generic, not tailored to any specific safety application code. They are compatible in terms of input/output requirements but also independent of each other, e.g., PROSA-2 can be used without first using SCREEN if a set of important input variables has first been selected by other methods. Also, although SCREEN can select cases to be run (by random sampling), a user can select cases by other methods if he so prefers, and still use the rest of SCREEN for identifying important input variables

  15. COMPUTATIONAL ANALYSIS OF BACKWARD-FACING STEP FLOW

    Directory of Open Access Journals (Sweden)

    Erhan PULAT

    2001-01-01

    Full Text Available In this study, the backward-facing step flow encountered in electronic systems cooling, heat exchanger design, and gas turbine cooling is investigated computationally. Steady, incompressible, two-dimensional air flow is analyzed. The inlet velocity is assumed uniform and is obtained from a parabolic profile by using the maximum velocity. In the analysis, the effects of channel expansion ratio and Reynolds number on reattachment length are investigated. In addition, the pressure distribution throughout the channel length is also obtained, and the flow is analyzed for Reynolds number values of 50 and 150 and channel expansion ratios of 1.5 and 2. Governing equations are solved by using the Galerkin finite element method of the ANSYS-FLOTRAN code. The obtained results are compared with the solutions of the lattice BGK method, which is a relatively new method in fluid dynamics, and with other numerical and experimental results. It is concluded that reattachment length increases with increasing Reynolds number and, at the same Reynolds number, decreases with increasing channel expansion ratio.

  16. Design of airborne wind turbine and computational fluid dynamics analysis

    Science.gov (United States)

    Anbreen, Faiqa

    Wind energy is a promising alternative to the depleting non-renewable sources. The height of wind turbines becomes a constraint to their efficiency. An airborne wind turbine can reach much higher altitudes and produce higher power due to high wind velocity and energy density. The focus of this thesis is to design a shrouded airborne wind turbine, capable of generating 70 kW to propel a leisure boat with a capacity of 8-10 passengers. The idea of designing an airborne turbine is to take advantage of the higher velocities in the atmosphere. The Solidworks model has been analyzed numerically using the Computational Fluid Dynamics (CFD) software StarCCM+. The Unsteady Reynolds Averaged Navier Stokes Simulation (URANS) with the K-epsilon turbulence model has been selected to study the physical properties of the flow, with emphasis on the performance of the turbine and the increase in air velocity at the throat. The analysis has been done using two ambient velocities of 12 m/s and 6 m/s. At 12 m/s inlet velocity, the velocity of air at the turbine has been recorded as 16 m/s, and the power generated by the turbine is 61 kW. At an inlet velocity of 6 m/s, the velocity of air at the turbine increased to 10 m/s, and the power generated by the turbine is 25 kW.
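
    The kind of power figures quoted above follow the standard wind-power relation P = 1/2 * rho * A * Cp * v^3, which makes the cubic dependence on wind speed explicit. The short sketch below evaluates that relation; the rotor diameter and power coefficient are assumptions chosen only for illustration, not the thesis design values.

```python
# Back-of-the-envelope wind-power estimate P = 0.5 * rho * A * Cp * v**3.
# Rotor diameter and Cp are illustrative assumptions, not the design values of the thesis.
import math

def rotor_power_kw(velocity_ms, rotor_diameter_m, cp=0.40, rho=1.225):
    area = math.pi * (rotor_diameter_m / 2.0) ** 2
    return 0.5 * rho * area * cp * velocity_ms ** 3 / 1000.0

for v in (6.0, 10.0, 12.0, 16.0):
    print(f"v = {v:4.1f} m/s -> P = {rotor_power_kw(v, rotor_diameter_m=6.0):6.1f} kW")
```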

  17. A computer language for reducing activation analysis data

    International Nuclear Information System (INIS)

    Friedman, M.H.; Tanner, J.T.

    1978-01-01

    A program, written in FORTRAN, which defines a language for reducing activation analysis data is described. An attempt was made to optimize the choice of commands and their definitions so as to concisely express what should be done, make the language natural to use and easy to learn, arranqe a system of checks to guard against communication errors and have the language be inclusive. Communications are effected through commands, and these can be given in almost any order. Consistency checks are done and diagnostic messages are printed automatically to guard against the incorrect use of commands. Default options on the commands allow instructions to be expressed concisely while providing a capability to specify details for the data reduction process. The program has been implemented on a UNIVAC 1108 computer. A complete description of the commands, the algorithms used, and the internal consistency checks used are given elsewhere. The applications of the program and the methods for obtaining data automatically have already been described. (T.G.)

  18. A fast reactor transient analysis methodology for personal computers

    International Nuclear Information System (INIS)

    Ott, K.O.

    1993-01-01

    A simplified model for liquid-metal-cooled reactor (LMR) transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All 30 differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied, as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes a new form, i.e., the quadratic dynamics equation. In this integral formulation, the initial value problem of typical LMR transients can be solved with large time steps (initially 1 s, later up to 256 s). This makes transient problems amenable to treatment on a personal computer. The resulting mathematical model forms the basis for the GW-BASIC LMR transient calculation (LTC) program. The LTC program has also been converted to QuickBASIC. The running time for a 10-h transient overpower event is then ∼40 down to 10 s, depending on the hardware version (286, 386, or 486 with math coprocessors)
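
    The prompt jump approximation mentioned above removes the prompt-neutron time scale by setting the prompt term of the point-kinetics equation to zero, so the power follows the precursors quasi-statically and large time steps become possible. The sketch below shows a one-delayed-group version of that idea; the kinetics data, the net reactivity history, and the step size are illustrative assumptions and do not reproduce the LTC program.

```python
# One-delayed-group point kinetics under the prompt jump approximation (PJA):
#   precursor balance: dC/dt = beta*n/Lambda - lam*C
#   PJA power:         n = Lambda*lam*C / (beta - rho)
# Kinetics constants and the reactivity ramp are illustrative assumptions.
beta, lam, Lambda = 0.0065, 0.08, 4.0e-7   # delayed fraction, decay constant (1/s), generation time (s)

def rho(t):
    # hypothetical net reactivity ramp; feedback keeps it well below prompt critical (rho < beta)
    return 0.3 * beta * min(t / 10.0, 1.0)

dt, t_end = 1.0, 60.0                      # large time steps are acceptable once the prompt term is removed
n = 1.0                                    # normalized power
C = beta * n / (lam * Lambda)              # equilibrium precursor concentration
t = 0.0
while t < t_end:
    t += dt
    C += dt * (beta * n / Lambda - lam * C)   # explicit precursor update
    n = Lambda * lam * C / (beta - rho(t))    # prompt jump approximation for the power
print(f"relative power after {t_end:.0f} s: {n:.2f}")
```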

  19. Recent Developments in Complex Analysis and Computer Algebra

    CERN Document Server

    Kajiwara, Joji; Xu, Yongzhi

    1999-01-01

    This volume consists of papers presented in the special sessions on "Complex and Numerical Analysis", "Value Distribution Theory and Complex Domains", and "Use of Symbolic Computation in Mathematics Education" of the ISAAC'97 Congress held at the University of Delaware, during June 2-7, 1997. The ISAAC Congress coincided with a U.S.-Japan Seminar also held at the University of Delaware. The latter was supported by the National Science Foundation through Grant INT-9603029 and the Japan Society for the Promotion of Science through Grant MTCS-134. It was natural that the participants of both meetings should interact and consequently several persons attending the Congress also presented papers in the Seminar. The success of the ISAAC Congress and the U.S.-Japan Seminar has led to the ISAAC'99 Congress being held in Fukuoka, Japan during August 1999. Many of the same participants will return to this Seminar. Indeed, it appears that the spirit of the U.S.-Japan Seminar will be continued every second year as part of...

  20. Comparison of two three-dimensional cephalometric analysis computer software.

    Science.gov (United States)

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software programs for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Data from before and after orthodontic treatment were analyzed using a t-test. Reliability testing using the interclass correlation coefficient was stronger for InVivoDental5.0 (0.83-0.98) compared with 3DCeph™ (0.51-0.90). A paired t-test comparison of the two programs shows no statistically significant difference in the measurements made with the two programs. InVivoDental5.0 measurements are more reproducible, and the software is more user-friendly, compared with 3DCeph™. There is no statistical difference between the two programs in linear or angular measurements. 3DCeph™ is more time-consuming in performing three-dimensional analysis compared with InVivoDental5.0.

  1. Analisis cualitativo asistido por computadora / Computer-assisted qualitative analysis

    Directory of Open Access Journals (Sweden)

    César A. Cisneros Puebla

    2003-01-01

    Full Text Available The aims of this article are: on the one hand, to present an approximation to the Hispano-American experience on Computer-Assisted Qualitative Data Analysis (CAQDAS), grouping as a systematization exercise the works carried out by several colleagues from related disciplines. Although attempting to be exhaustive and thorough - as in any attempt at systematizing experiences - this exercise presents clear lacks and omissions. On the other hand, to introduce some theoretical reflections about the role played by CAQDAS in the development of qualitative investigation after that systematization, with a specific focus on data generation.

  2. Applied and computational harmonic analysis on graphs and networks

    Science.gov (United States)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
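
    The graph Fourier picture reviewed above can be reproduced in a few lines: the eigenvectors of the combinatorial Laplacian serve as basis vectors and the eigenvalues as frequencies. The toy graph below is an unweighted cycle, the case in which, per the article, the frequency interpretation is actually justified; the signal values are arbitrary.

```python
# Graph Fourier transform on a toy graph: eigendecomposition of the combinatorial Laplacian
# L = D - W, with a signal expanded in the resulting eigenbasis. The 5-node cycle is arbitrary.
import numpy as np

W = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)   # adjacency of an unweighted 5-node cycle
L = np.diag(W.sum(axis=1)) - W                  # combinatorial graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)            # "frequencies" and "Fourier basis vectors"

signal = np.array([1.0, 2.0, 1.5, 0.5, 1.0])
coeffs = eigvecs.T @ signal                     # graph Fourier transform
reconstructed = eigvecs @ coeffs                # inverse transform
print("graph frequencies:", np.round(eigvals, 3))
print("max reconstruction error:", np.abs(signal - reconstructed).max())
```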

  3. Reliability analysis framework for computer-assisted medical decision systems

    International Nuclear Information System (INIS)

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-01-01

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
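
    A minimal version of the query-specific reliability measure described above is the accuracy of the CAD system over the k nearest known cases in feature space. In the sketch below the feature data, the correctness flags, and the choice of k are placeholders, not the mammographic data or decision models of the study.

```python
# Local (query-specific) accuracy of a CAD system estimated from its k nearest known cases.
# Feature values, correctness flags, and k are illustrative placeholders.
import numpy as np

def local_reliability(query_features, known_features, known_correct, k=25):
    """known_correct[i] is 1 if the CAD decision on known case i was correct, else 0."""
    dists = np.linalg.norm(known_features - query_features, axis=1)
    nearest = np.argsort(dists)[:k]
    return known_correct[nearest].mean()        # local accuracy used as the reliability measure

rng = np.random.default_rng(3)
known_features = rng.normal(size=(1337, 8))     # e.g., 8 morphological features per ROI
known_correct = rng.integers(0, 2, size=1337)   # placeholder correctness flags
query = rng.normal(size=8)
print("reliability of CAD decision on this query:", local_reliability(query, known_features, known_correct))
```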

  4. Computer-aided pulmonary image analysis in small animal models

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J. [Center for Infectious Disease Imaging (CIDI), Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Bagci, Ulas, E-mail: ulasbagci@gmail.com [Center for Research in Computer Vision (CRCV), University of Central Florida (UCF), Orlando, Florida 32816 (United States); Kramer-Marek, Gabriela [The Institute of Cancer Research, London SW7 3RP (United Kingdom); Luna, Brian [Microfluidic Laboratory Automation, University of California-Irvine, Irvine, California 92697-2715 (United States); Kubler, Andre [Department of Medicine, Imperial College London, London SW7 2AZ (United Kingdom); Dey, Bappaditya; Jain, Sanjay [Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Foster, Brent [Department of Biomedical Engineering, University of California-Davis, Davis, California 95817 (United States); Papadakis, Georgios Z. [Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Camp, Jeremy V. [Department of Microbiology and Immunology, University of Louisville, Louisville, Kentucky 40202 (United States); Jonsson, Colleen B. [National Institute for Mathematical and Biological Synthesis, University of Tennessee, Knoxville, Tennessee 37996 (United States); Bishai, William R. [Howard Hughes Medical Institute, Chevy Chase, Maryland 20815 and Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Udupa, Jayaram K. [Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

    2015-07-15

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.
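
    The first stage described above, an expected lung volume obtained by regressing total lung capacity on approximated rib-cage volume and compared against the initial segmentation, can be sketched as follows. The volumes and the deviation threshold are illustrative assumptions, not values from the 133-scan study.

```python
# Flag a scan whose segmented lung volume falls far below the volume expected from a simple
# regression on rib-cage volume. Training volumes and the threshold are made up.
import numpy as np

rib_cage_vol = np.array([38.0, 42.5, 45.0, 50.2, 55.1, 60.3])   # hypothetical training scans
lung_vol = np.array([14.1, 16.0, 17.2, 19.5, 21.4, 23.8])       # matching lung volumes
slope, intercept = np.polyfit(rib_cage_vol, lung_vol, deg=1)     # regression function

def severe_pathology_suspected(rib_vol, segmented_lung_vol, rel_threshold=0.25):
    expected = slope * rib_vol + intercept
    return (expected - segmented_lung_vol) / expected > rel_threshold

print(severe_pathology_suspected(rib_vol=48.0, segmented_lung_vol=12.0))  # consolidation-like case
```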

  5. Clinical diagnosis and computer analysis of headache symptoms.

    OpenAIRE

    Drummond, P D; Lance, J W

    1984-01-01

    The headache histories obtained from clinical interviews of 600 patients were analysed by computer to see whether patients could be separated systematically into clinical categories and to see whether sets of symptoms commonly reported together differed in distribution among the categories. The computer classification procedure assigned 537 patients to the same category as their clinical diagnosis, the majority of discrepancies between clinical and computer classifications involving common mi...

  6. Analysis of school furniture used in computer classrooms

    OpenAIRE

    Jiří Tauber

    2011-01-01

    With respect to the fast development of new computer technologies, it is unconditionally necessary that school furniture reflect this trend and adapt to it. Our use of computer technologies and utilities in teaching is increasing. Therefore, it is necessary to improve school desks so that they are fit for new computer technology. Creation of a compact set of information relative to the issue concerned, which would comprise the needs and requirements for individual pieces of furnit...

  7. Formal Specification and Analysis of Cloud Computing Management

    Science.gov (United States)

    2012-01-24

    Only fragments of this report are available in the record. The introductory chapter ("Cloud Computing in a Nutshell") opens with a quote attributed to Larry Ellison, Oracle CEO, on the term "Cloud Computing" and then summarizes the essential aspects of Cloud Computing. The fragment also carries one bibliography entry: M. Abadi, M. Burrows, M. Manasse, and T. Wobber, "Moderately hard, memory-bound functions," ACM Transactions on Internet Technology, 5(2):299-327.

  8. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache - CEA, France)

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  9. Computer Models for IRIS Control System Transient Analysis

    International Nuclear Information System (INIS)

    Gary D Storrick; Bojan Petrovic; Luca Oriani

    2007-01-01

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled 'Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor' focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design--such as the lack of a detailed secondary system or I and C system designs--makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I and C development process. Section

  10. Integrating aerodynamic surface modeling for computational fluid dynamics with computer aided structural analysis, design, and manufacturing

    Science.gov (United States)

    Thorp, Scott A.

    1992-01-01

    This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.

  11. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well-established classical... into the projection pursuit is presented. Examples from remote sensing are given. The ACE algorithm for computing non-linear transformations for maximizing correlation is extended and applied to obtain a non-linear transformation that maximizes autocorrelation or 'signal' in a multivariate image... This is a generalization of the minimum/maximum autocorrelation factors (MAF's), which is a linear method. The non-linear method is compared to the linear method when analyzing a multivariate TM image from Greenland. The ACE method is shown to give a more detailed decomposition of the image than the MAF-transformation...

  12. If You Meet the Computer Guru on the Road, Kill Him (or Her).

    Science.gov (United States)

    Gore, Kay

    1989-01-01

    Discusses problems and misconceptions concerning the appropriate use of computers in K-12 classrooms. The use of software to support computer-assisted instruction (CAI) is described, teacher-written software is discussed, telecommunications issues are considered, and the role of administrators and teachers is examined. (two references) (LRW)

  13. Effect of Computer-Based Video Games on Children: An Experimental Study

    Science.gov (United States)

    Chuang, Tsung-Yen; Chen, Wei-Fan

    2009-01-01

    This experimental study investigated whether computer-based video games facilitate children's cognitive learning. In comparison to traditional computer-assisted instruction (CAI), this study explored the impact of the varied types of instructional delivery strategies on children's learning achievement. One major research null hypothesis was…

  14. Effects of Using Simultaneous Prompting and Computer-Assisted Instruction during Small Group Instruction

    Science.gov (United States)

    Ozen, Arzu; Ergenekon, Yasemin; Ulke-Kurkcuoglu, Burcu

    2017-01-01

    The current study investigated the relation between simultaneous prompting (SP), computer-assisted instruction (CAI), and the receptive identification of target pictures (presented on laptop computer) for four preschool students with developmental disabilities. The students' acquisition of nontarget information through observational learning also…

  15. Comparative Analysis on the Utilization of Computers | Nkata ...

    African Journals Online (AJOL)

    The findings reveal, among others, that the extent of usability of computers in the two universities differed significantly. It was concluded that the level of computer utilization in UNIPORT is higher than in RUST. It was recommended that periodic pre- and post-qualification seminars be organized for the 2 university ...

  16. From handwriting analysis to pen-computer applications

    NARCIS (Netherlands)

    Schomaker, L

    1998-01-01

    In this paper, pen computing, i.e. the use of computers and applications in which the pen is the main input device, will be described from four different viewpoints. Firstly a brief overview of the hardware developments in pen systems is given, leading to the conclusion that the technological

  17. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  18. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, the super-large scale, discrete, and (semi-)unstructured nature of big data has gone far beyond what traditional data management approaches can handle. With the arrival of the cloud computing era, cloud computing provides a new technical way to analyze massive data by mining, which can effectively solve the problem that traditional data mining methods cannot adapt to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology to realize data mining, designs a mining algorithm for association rules based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
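
    The MapReduce-based association-rule mining mentioned above rests on a simple pattern: the map step emits candidate itemsets from each transaction and the reduce step sums their support counts. The toy sketch below shows that pattern in plain Python; the transactions, itemset size, and support threshold are made up, and no actual cloud framework is used.

```python
# Toy MapReduce-style support counting for association-rule mining:
# map emits (itemset, 1) pairs per transaction, reduce sums the counts.
from itertools import combinations
from collections import Counter

transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def map_phase(transaction, size=2):
    # emit (itemset, 1) for every candidate itemset of the given size in one transaction
    return [(frozenset(c), 1) for c in combinations(sorted(transaction), size)]

def reduce_phase(emitted_pairs):
    counts = Counter()
    for itemset, one in emitted_pairs:
        counts[itemset] += one
    return counts

emitted = [pair for t in transactions for pair in map_phase(t)]
support_counts = reduce_phase(emitted)
min_support = 2
print({tuple(sorted(k)): v for k, v in support_counts.items() if v >= min_support})
```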

  19. High throughput computing: a solution for scientific analysis

    Science.gov (United States)

    O'Donnell, M.

    2011-01-01

    Public land management agencies continually face resource management problems that are exacerbated by climate warming, land-use change, and other human activities. As the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) works with managers in U.S. Department of the Interior (DOI) agencies and other federal, state, and private entities, researchers are finding that the science needed to address these complex ecological questions across time and space produces substantial amounts of data. The additional data and the volume of computations needed to analyze it require expanded computing resources well beyond single- or even multiple-computer workstations. To meet this need for greater computational capacity, FORT investigated how to resolve the many computational shortfalls previously encountered when analyzing data for such projects. Our objectives included finding a solution that would:

  20. Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences

    Directory of Open Access Journals (Sweden)

    Gregor Wiedemann

    2013-05-01

    Full Text Available Two developments in computational text analysis may change the way qualitative data analysis in the social sciences is performed: 1. the availability of digital text worth investigating is growing rapidly, and 2. the improvement of algorithmic information extraction approaches, also called text mining, allows for further bridging the gap between qualitative and quantitative text analysis. The key factor here is the inclusion of context in computational linguistic models, which extends conventional computational content analysis towards the extraction of meaning. To clarify the methodological differences of various computer-assisted text analysis approaches, the article suggests a typology from the perspective of a qualitative researcher. This typology shows compatibilities between manual qualitative data analysis methods and computational, rather quantitative approaches for large-scale mixed-method text analysis designs. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1302231
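
    As a small, concrete example of the algorithmic text analysis discussed above, the sketch below builds a document-term matrix from a toy corpus and lists the most frequent terms; topic models or co-occurrence analyses would start from the same matrix. The corpus and the use of scikit-learn are illustrative assumptions.

```python
# Build a document-term matrix and list the most frequent terms of a toy corpus.
# The corpus is made up; sklearn is used only as one convenient text-mining tool.
from sklearn.feature_extraction.text import CountVectorizer

corpus = [
    "qualitative data analysis of interview transcripts",
    "quantitative text analysis and text mining of large corpora",
    "mixed method designs combine qualitative and quantitative analysis",
]
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(corpus)                      # documents x terms matrix
freq = dict(zip(vectorizer.get_feature_names_out(), dtm.sum(axis=0).A1))
print(sorted(freq.items(), key=lambda kv: -kv[1])[:5])      # most frequent terms
```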

  1. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read

  2. MMA, A Computer Code for Multi-Model Analysis

    Science.gov (United States)

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will
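
    The posterior model probabilities that MMA reports can be illustrated with the usual information-criterion weighting: each model's criterion value is differenced against the best model and converted to a normalized weight, w_i = exp(-delta_i/2) / sum_j exp(-delta_j/2). The AICc values below are made up, and the snippet is a sketch of the weighting formula rather than of the MMA code itself.

```python
# Turn information-criterion values into normalized model weights (posterior model probabilities).
# The model names and AICc values are hypothetical.
import numpy as np

model_names = ["model A", "model B", "model C"]
aicc = np.array([212.4, 214.1, 220.9])          # hypothetical criterion values from calibrated models

delta = aicc - aicc.min()                       # difference from the best (smallest) criterion value
weights = np.exp(-0.5 * delta)
weights /= weights.sum()
for name, w in zip(model_names, weights):
    print(f"{name}: posterior model probability = {w:.3f}")
```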

  3. Use of electronic computers for processing of spectrometric data in instrument neutron activation analysis

    International Nuclear Information System (INIS)

    Vyropaev, V.Ya.; Zlokazov, V.B.; Kul'kina, L.I.; Maslov, O.D.; Fefilov, B.V.

    1977-01-01

    A computer program is described for processing gamma spectra in the instrumental activation analysis of multicomponent objects. Structural diagrams of various variants of connection with the computer are presented. The possibility of using a mini-computer as an analyser and for preliminary processing of gamma spectra is considered

  4. Report--COMOLA: A Computer System for the Analysis of Interlanguage Data.

    Science.gov (United States)

    Jagtman, Margriet; Bongaerts, Theo

    1994-01-01

    Discusses the design and use of the Computer Model for Language Acquisition (COMOLA), a computer program designed to analyze syntactic development in second-language learners by examining their oral utterances. Also compares COMOLA to the recently developed Computer-Aided Linguistic Analysis (COALA) program. (MDM)

  5. Interface design of VSOP'94 computer code for safety analysis

    International Nuclear Information System (INIS)

    Natsir, Khairina; Andiwijayakusuma, D.; Wahanani, Nursinta Adi; Yazid, Putranto Ilham

    2014-01-01

    Today, most software applications, also in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system for simulating the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate the neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integral, estimation of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and simulation of reactor safety. However, the existing VSOP is a conventional program, developed in Fortran 65, and it has several usability problems: for example, it runs only on DEC Alpha mainframe platforms, provides only text-based output, and is difficult to use, especially in data preparation and interpretation of results. We developed GUI-VSOP, an interface program that facilitates the preparation of data, runs the VSOP code, and reads the results in a more user-friendly way, usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based preprocessing interface aims to provide a convenient way of preparing data. The processing interface is intended to provide convenience in configuring input files and libraries and in compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in table and graphic forms. GUI-VSOP is expected to simplify and speed up the process and the analysis of safety aspects.

  6. Interface design of VSOP'94 computer code for safety analysis

    Science.gov (United States)

    Natsir, Khairina; Yazid, Putranto Ilham; Andiwijayakusuma, D.; Wahanani, Nursinta Adi

    2014-09-01

    Today, most software applications, also in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system for simulating the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate the neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integral, estimation of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and simulation of reactor safety. However, the existing VSOP is a conventional program, developed in Fortran 65, and it has several usability problems: for example, it runs only on DEC Alpha mainframe platforms, provides only text-based output, and is difficult to use, especially in data preparation and interpretation of results. We developed GUI-VSOP, an interface program that facilitates the preparation of data, runs the VSOP code, and reads the results in a more user-friendly way, usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based preprocessing interface aims to provide a convenient way of preparing data. The processing interface is intended to provide convenience in configuring input files and libraries and in compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in table and graphic forms. GUI-VSOP is expected to simplify and speed up the process and the analysis of safety aspects.
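    As an illustration of the preprocessing/processing/postprocessing split described above, a thin wrapper around a legacy batch code typically writes an input deck, launches the executable, and parses the text output back into data structures a GUI can display. Everything below is hypothetical (executable name, input keywords, output format); it is a sketch of the pattern, not the actual GUI-VSOP implementation.

```python
# Hypothetical sketch of wrapping a legacy Fortran code in the
# preprocessing / processing / postprocessing pattern described above.
# The executable name, input keywords and output format are assumptions.
import subprocess
from pathlib import Path

def preprocess(params: dict, deck: Path) -> None:
    # Write a simple keyword/value input deck for the solver.
    deck.write_text("\n".join(f"{key} = {value}" for key, value in params.items()))

def process(deck: Path, executable: str = "./vsop.exe") -> Path:
    # Run the solver with the deck on stdin and capture its text output.
    out = deck.with_suffix(".out")
    with deck.open() as stdin, out.open("w") as stdout:
        subprocess.run([executable], stdin=stdin, stdout=stdout, check=True)
    return out

def postprocess(out: Path) -> dict:
    # Pull "name : value" result lines back into a dictionary for tables/plots.
    results = {}
    for line in out.read_text().splitlines():
        if ":" in line:
            name, _, value = line.partition(":")
            try:
                results[name.strip()] = float(value)
            except ValueError:
                pass
    return results
```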

  7. Computational identification and analysis of novel sugarcane microRNAs

    Directory of Open Access Journals (Sweden)

    Thiebaut Flávia

    2012-07-01

    Full Text Available. Abstract. Background: MicroRNA regulation of gene expression plays a key role in development and in the response to biotic and abiotic stresses. Deep sequencing analyses accelerate the process of small RNA discovery in many plants and expand our understanding of miRNA-regulated processes. We therefore undertook small RNA sequencing of sugarcane miRNAs in order to understand their complexity and to explore their role in sugarcane biology. Results: A bioinformatics search was carried out to discover novel miRNAs that can be regulated in sugarcane plants submitted to drought and salt stresses, and under pathogen infection. By means of the presence of miRNA precursors in the related sorghum genome, we identified 623 candidates of new mature miRNAs in sugarcane. Of these, 44 were classified as high-confidence miRNAs. The biological function of the new miRNA candidates was assessed by analyzing their putative targets. The set of bona fide sugarcane miRNAs includes those likely targeting serine/threonine kinases, Myb and zinc finger proteins. Additionally, a MADS-box transcription factor and an RPP2B protein, which act in development and disease resistance processes, could be regulated by cleavage (21-nt species) and DNA methylation (24-nt species), respectively. Conclusions: A large-scale investigation of sRNA in sugarcane using a computational approach has identified a substantial number of new miRNAs and provides detailed genotype-tissue-culture miRNA expression profiles. Comparative analysis between monocots was valuable to clarify aspects about conservation of miRNAs and their targets in a plant whose genome has not yet been sequenced. Our findings contribute to knowledge of miRNA roles in regulatory pathways in the complex, polyploid sugarcane genome.

  8. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  9. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative manners, has gradually increased because of the effects of human errors on system safety. HRA needs a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analysts. This problem makes the results of the task analysis inconsistent and unreliable. To address this problem, KAERI developed the structural information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the CASIA database. CASIA is expected to help HRA analysts perform the analysis more easily and consistently. As more analyses are performed and more data are accumulated in the CASIA database, HRA analysts will be able to share their analysis experience freely, and thereby the quality of HRA will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  10. Multimedia Image Technology and Computer Aided Manufacturing Engineering Analysis

    Science.gov (United States)

    Nan, Song

    2018-03-01

    Since China's reform and opening up, science and technology have developed continuously, and increasingly advanced technologies have emerged under a trend of diversification. Multimedia image technology, for example, has had a significant and positive impact on computer-aided manufacturing engineering in China, both in the functionality it offers and in how that functionality is applied. Therefore, this paper starts from the concept of multimedia image technology and analyzes its application in computer-aided manufacturing engineering.

  11. Calcium-aluminum-rich inclusions with fractionation and unidentified nuclear effects (FUN CAIs): II. Heterogeneities of magnesium isotopes and 26Al in the early Solar System inferred from in situ high-precision magnesium-isotope measurements

    Science.gov (United States)

    Park, Changkun; Nagashima, Kazuhide; Krot, Alexander N.; Huss, Gary R.; Davis, Andrew M.; Bizzarro, Martin

    2017-03-01

    Calcium-aluminum-rich inclusions with isotopic mass fractionation effects and unidentified nuclear isotopic anomalies (FUN CAIs) have been studied for more than 40 years, but their origins remain enigmatic. Here we report in situ high precision measurements of aluminum-magnesium isotope systematics of FUN CAIs by secondary ion mass spectrometry (SIMS). Individual minerals were analyzed in six FUN CAIs from the oxidized CV3 carbonaceous chondrites Axtell (compact Type A CAI Axtell 2271) and Allende (Type B CAIs C1 and EK1-4-1, and forsterite-bearing Type B CAIs BG82DH8, CG-14, and TE). Most of these CAIs show evidence for excess 26Mg due to the decay of 26Al. The inferred initial 26Al/27Al ratios [(26Al/27Al)0] and the initial magnesium isotopic compositions (δ26Mg0) calculated using an exponential law with an exponent β of 0.5128 are (3.1 ± 1.6) × 10-6 and 0.60 ± 0.10‰ (Axtell 2271), (3.7 ± 1.5) × 10-6 and -0.20 ± 0.05‰ (BG82DH8), (2.2 ± 1.1) × 10-6 and -0.18 ± 0.05‰ (C1), (2.3 ± 2.4) × 10-5 and -2.23 ± 0.37‰ (EK1-4-1), (1.5 ± 1.1) × 10-5 and -0.42 ± 0.08‰ (CG-14), and (5.3 ± 0.9) × 10-5 and -0.05 ± 0.08‰ (TE) with 2σ uncertainties. We infer that FUN CAIs recorded heterogeneities of magnesium isotopes and 26Al in the CAI-forming region(s). Comparison of 26Al-26Mg systematics, stable isotope (oxygen, magnesium, calcium, and titanium) and trace element studies of FUN and non-FUN igneous CAIs indicates that there is a continuum among these CAI types. Based on these observations and evaporation experiments on CAI-like melts, we propose a generic scenario for the origin of igneous (FUN and non-FUN) CAIs: (i) condensation of isotopically normal solids in an 16O-rich gas of approximately solar composition; (ii) formation of CAI precursors by aggregation of these solids together with variable abundances of isotopically anomalous grains-possible carriers of unidentified nuclear (UN) effects; and (iii) melt evaporation of these precursors
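    For reference, the δ26Mg0 and (26Al/27Al)0 values quoted above are conventionally obtained by first correcting the measured 26Mg/24Mg for mass-dependent fractionation with an exponential law and then fitting an internal isochron; a standard textbook form (assumed here, since the paper's exact notation is not reproduced) is
    \[
    \delta^{26}\mathrm{Mg}^{*} \;=\; \delta^{26}\mathrm{Mg} \;-\; \left[\left(1+\frac{\delta^{25}\mathrm{Mg}}{1000}\right)^{1/\beta}-1\right]\times 1000, \qquad \beta = 0.5128,
    \]
    \[
    \delta^{26}\mathrm{Mg}^{*} \;=\; \delta^{26}\mathrm{Mg}_{0} \;+\; \frac{\left({}^{26}\mathrm{Al}/{}^{27}\mathrm{Al}\right)_{0}\,\left({}^{27}\mathrm{Al}/{}^{24}\mathrm{Mg}\right)}{\left({}^{26}\mathrm{Mg}/{}^{24}\mathrm{Mg}\right)_{\mathrm{std}}}\times 1000,
    \]
    so that the slope of the isochron constrains (26Al/27Al)0 and the intercept gives δ26Mg0.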

  12. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. The new cloud technologies also come with new challenges, and one such challenge is the contextualization of computing resources with regard to the requirements of the user and their experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), upload of users' virtual machine images is not possible. This precludes application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  13. Sensitivity Analysis and Error Control for Computational Aeroelasticity, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposal is the development of a next-generation computational aeroelasticity code, suitable for real-world complex geometries, and...

  14. Cloud computing for genomic data analysis and collaboration.

    Science.gov (United States)

    Langmead, Ben; Nellore, Abhinav

    2018-04-01

    Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.

  15. Cost/Benefit Analysis of Leasing Versus Purchasing Computers

    National Research Council Canada - National Science Library

    Arceneaux, Alan

    1997-01-01

    .... In constructing this model, several factors were considered, including: The purchase cost of computer equipment, annual lease payments, depreciation costs, the opportunity cost of purchasing, tax revenue implications and various leasing terms...

  16. Discrete calculus applied analysis on graphs for computational science

    CERN Document Server

    Grady, Leo J

    2010-01-01

    This unique text brings together into a single framework current research in the three areas of discrete calculus, complex networks, and algorithmic content extraction. Many example applications from several fields of computational science are provided.

  17. Computational Analysis of Flow Through a Transonic Compressor Rotor

    National Research Council Canada - National Science Library

    Bochette, Nikolaus J

    2005-01-01

    .... In examining this problem two Computational Fluid Dynamic (CFD) codes have been used by the Naval Postgraduate School to predict the performance of a transonic compressor rotor that is being tested with steam ingestion...

  18. Automated computation of autonomous spectral submanifolds for nonlinear modal analysis

    Science.gov (United States)

    Ponsioen, Sten; Pedergnana, Tiemo; Haller, George

    2018-04-01

    We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.

  19. Hierarchical nanoreinforced composites: Computational analysis of damage mechanisms

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Pontefisso, Alessandro; Dai, Gaoming

    2016-01-01

    of distribution, shape, orientation of nanoparticles (carbon nanotube, graphene) in unidirectional polymer matrix composites on the strength and damage resistance of the composites is studied in computational studies. The possible directions of the improvement of nanoreinforced composites by controlling shapes...

  20. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS TM , to demonstrate compliance with Section III, ``Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  1. Computational analysis of difenoconazole interaction with soil chitinases

    International Nuclear Information System (INIS)

    Vlǎdoiu, D L; Filimon, M N; Ostafe, V; Isvoran, A

    2015-01-01

    This study focusses on the investigation of the potential binding of the fungicide difenoconazole to soil chitinases using a computational approach. Computational characterization of the substrate binding sites of Serratia marcescens and Bacillus cereus chitinases using Fpocket tool reflects the role of hydrophobic residues for the substrate binding and the high local hydrophobic density of both sites. Molecular docking study reveals that difenoconazole is able to bind to Serratia marcescens and Bacillus cereus chitinases active sites, the binding energies being comparable

  2. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science considers conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or a particular domain based on ISI published JCR (Journal Citation Report). Although this data covers most of important journals, it lacks computer science conference and ...

  3. Heat Transfer treatment in computer codes for safety analysis

    International Nuclear Information System (INIS)

    Jerele, A.; Gregoric, M.

    1984-01-01

    The increased number of operating nuclear power plants has stressed the importance of nuclear safety evaluation. For this reason, in accordance with regulatory commission requests, safety analyses with computer codes are performed. In this paper, the part of the thermohydraulic models dealing with wall-to-fluid heat transfer correlations in the computer codes TRAC-PF1, RELAP4/MOD5, RELAP5/MOD1 and COBRA-IV is discussed. (author)
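    As a concrete illustration of the kind of wall-to-fluid correlation such codes evaluate (an example only; the report's specific correlation set is not reproduced here), single-phase turbulent forced convection is often represented by the Dittus-Boelter relation
    \[
    \mathrm{Nu} = 0.023\,\mathrm{Re}^{0.8}\,\mathrm{Pr}^{\,n}, \qquad h = \frac{\mathrm{Nu}\,k}{D_h},
    \]
    with n of about 0.4 when the fluid is heated and about 0.3 when it is cooled, h the wall-to-fluid heat transfer coefficient, k the fluid thermal conductivity and D_h the hydraulic diameter; the safety codes switch among many such regime-dependent correlations as flow and boiling conditions change.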

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction: The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board): The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  5. Analysis of Sci-Hub downloads of computer science papers

    Directory of Open Access Journals (Sweden)

    Andročec Darko

    2017-07-01

    Full Text Available The scientific knowledge is disseminated by research papers. Most of the research literature is copyrighted by publishers and available only through paywalls. Recently, some websites offer most of the recent content for free. One of them is the controversial website Sci-Hub that enables access to more than 47 million pirated research papers. In April 2016, Science Magazine published an article on Sci-Hub activity over the period of six months and publicly released the Sci-Hub’s server log data. The mentioned paper aggregates the view that relies on all downloads and for all fields of study, but these findings might be hiding interesting patterns within computer science. The mentioned Sci-Hub log data was used in this paper to analyse downloads of computer science papers based on DBLP’s list of computer science publications. The top downloads of computer science papers were analysed, together with the geographical location of Sci-Hub users, the most downloaded publishers, types of papers downloaded, and downloads of computer science papers per publication year. The results of this research can be used to improve legal access to the most relevant scientific repositories or journals for the computer science field.
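    A minimal sketch of the kind of filtering described above: keep only log entries whose DOI appears in a DBLP-derived list of computer science papers, then aggregate downloads by country and publication year. The file names and column names ("doi", "country", "year") are assumptions for illustration; the released Sci-Hub log format may differ.

```python
# Hypothetical sketch: restrict Sci-Hub download logs to computer science
# papers (matched by DOI against a DBLP-derived list) and aggregate them.
import pandas as pd

logs = pd.read_csv("scihub_logs.csv")                       # one row per download (assumed)
cs_dois = set(pd.read_csv("dblp_dois.csv")["doi"].str.lower())

cs_downloads = logs[logs["doi"].str.lower().isin(cs_dois)]
print(cs_downloads.groupby("country").size().nlargest(10))  # top downloading countries
print(cs_downloads.groupby("year").size().sort_index())     # downloads per publication year
```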

  6. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.

  7. Perceptions on hospitality when visiting secluded communities of guaranis, caiçaras e quilombolas in Paraty region

    Directory of Open Access Journals (Sweden)

    Luis Alberto Beares

    2008-10-01

    Full Text Available Tourism in secluded communities puts different cultures in contact with each other and must be handled carefully so as not to cause environmental damage or cultural loss, which might jeopardize local development and create hostile relationships. The proposal of in situ tourism, considering local memory and patrimony as a hospitality potential, was observed during technical visits to three communities located in the Paraty region and surroundings: Guarani, Caiçara (fishermen) and Quilombola (descendants of African slaves). Through field work involving visits to the communities and interviews with locals, information regarding cultural differences and the importance of land occupation in the history of each community was gathered. The common link in the history of these peoples is the struggle for the right of land possession. During visits in which people shared their territory, various forms of hospitality were observed in each community, arising from different cultures and cultural values.

  8. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

    I employed bibliometric and historical methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  9. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  10. Narrative Ethics of Cai Wei (《采薇》的叙事伦理分析)

    Institute of Scientific and Technical Information of China (English)

    王海燕

    2014-01-01

    The controversial question of how to understand Boyi and Shuqi in Lu Xun's story Cai Wei is not a realistic ethical problem; it should be answered on the basis of the story's narrative ethics. Narrative ethics refers to the ethical dimension revealed by the various formal arrangements of a narrative. Through the choice of narrative angle and the control of narrative distance, the story expresses ethical sympathy for Boyi and Shuqi, while the structural counterpoint of characters and events and its ironic expression convey an ethical rejection of the various characters from King Wu of Zhou to A Jin. Compared with Lu Xun's other stories on the theme of dynastic change, the complexity of the narrative ethics of Cai Wei is both a projection of reality and a reflection of Lu Xun's further consciousness of narrative art.

  11. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values give the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
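    Step (4) can be illustrated with a small sketch (not the cited implementation): given paired simulated true masses and simulated NDA measurements, an ordinary least-squares fit of measured versus true estimates the bias, and the spread of the residuals estimates the precision. The simulated numbers below are placeholders.

```python
# Minimal sketch of step (4): estimate bias and precision from paired
# true vs. measured values produced by the waste and measurement models.
# An ordinary least-squares line gives the bias model; the residual standard
# deviation gives the precision. Illustration only, not the cited method's code.
import numpy as np

rng = np.random.default_rng(0)
true_mass = rng.uniform(1.0, 100.0, size=200)              # simulated "gold standard" grams
measured = 0.95 * true_mass + rng.normal(0.0, 2.0, 200)    # simulated NDA response

slope, intercept = np.polyfit(true_mass, measured, 1)
residuals = measured - (slope * true_mass + intercept)

print(f"bias model: measured ~ {slope:.3f} * true + {intercept:.2f}")
print(f"precision (residual std dev): {residuals.std(ddof=2):.2f} g")
```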

  12. Nondestructive analysis of urinary calculi using micro computed tomography

    Directory of Open Access Journals (Sweden)

    Lingeman James E

    2004-12-01

    Full Text Available Abstract. Background: Micro computed tomography (micro CT) has been shown to provide exceptionally high quality imaging of the fine structural detail within urinary calculi. We tested the idea that micro CT might also be used to identify the mineral composition of urinary stones non-destructively. Methods: Micro CT x-ray attenuation values were measured for mineral that was positively identified by infrared microspectroscopy (FT-IR). To do this, human urinary stones were sectioned with a diamond wire saw. The cut surface was explored by FT-IR and regions of pure mineral were evaluated by micro CT to correlate x-ray attenuation values with mineral content. Additionally, intact stones were imaged with micro CT to visualize internal morphology and map the distribution of specific mineral components in 3-D. Results: Micro CT images taken just beneath the cut surface of urinary stones showed excellent resolution of structural detail that could be correlated with structure visible in the optical image mode of FT-IR. Regions of pure mineral were not difficult to find by FT-IR for most stones and such regions could be localized on micro CT images of the cut surface. This was not true, however, for two brushite stones tested; in these, brushite was closely intermixed with calcium oxalate. Micro CT x-ray attenuation values were collected for six minerals that could be found in regions that appeared to be pure, including uric acid (3515–4995 micro CT attenuation units, AU), struvite (7242–7969 AU), cystine (8619–9921 AU), calcium oxalate dihydrate (13815–15797 AU), calcium oxalate monohydrate (16297–18449 AU), and hydroxyapatite (21144–23121 AU). These AU values did not overlap. Analysis of intact stones showed excellent resolution of structural detail and could discriminate multiple mineral types within heterogeneous stones. Conclusions: Micro CT gives excellent structural detail of urinary stones, and these results demonstrate the feasibility
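    Because the reported attenuation ranges do not overlap, a value measured in a region of pure mineral can be assigned to a composition with a simple lookup. The sketch below only encodes the ranges quoted above; it is an illustration, not software from the study.

```python
# Sketch: assign a mineral to a micro CT attenuation value using the
# non-overlapping ranges reported in the study (values in AU).
AU_RANGES = {
    "uric acid": (3515, 4995),
    "struvite": (7242, 7969),
    "cystine": (8619, 9921),
    "calcium oxalate dihydrate": (13815, 15797),
    "calcium oxalate monohydrate": (16297, 18449),
    "hydroxyapatite": (21144, 23121),
}

def classify(au: float) -> str:
    for mineral, (lo, hi) in AU_RANGES.items():
        if lo <= au <= hi:
            return mineral
    return "outside reported ranges (mixed or unlisted mineral)"

print(classify(17000))   # -> calcium oxalate monohydrate
```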

  13. Cluster Computing For Real Time Seismic Array Analysis.

    Science.gov (United States)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years arrays have been widely used in different fields of seismological research. In particular they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying the volcanic microtremor and long period events, which are critical for getting information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the processing techniques, which are quite time consuming, have limited their potential for this application. In order to favor a direct application of array techniques to continuous volcano monitoring, we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 dual-processor Intel Pentium III PCs working at 550 MHz, and has 4 Gigabytes of RAM memory. It runs under the Linux operating system. The developed analysis software package is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data via the internet and graphical applications for continuously displaying the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and the southeast flanks of this volcano. A real time continuous acquisition system has been simulated by
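    The MUSIC step itself is compact: form the sensor covariance matrix, separate signal and noise subspaces, and scan candidate slowness vectors for peaks of the pseudospectrum. The sketch below is a narrowband illustration in Python rather than the cited Fortran/MPI implementation; the array geometry, frequency and noise level are invented for the example.

```python
# Narrowband MUSIC pseudospectrum over a grid of horizontal slowness vectors.
import numpy as np

def music_spectrum(X, coords, freq, s_grid, n_sources=1):
    """X: (n_sensors, n_snapshots) complex data; coords: (n_sensors, 2) in km;
    freq in Hz; s_grid: iterable of (sx, sy) slowness vectors in s/km."""
    R = X @ X.conj().T / X.shape[1]               # sensor covariance matrix
    _, vecs = np.linalg.eigh(R)                   # eigenvalues in ascending order
    noise = vecs[:, : X.shape[0] - n_sources]     # noise-subspace eigenvectors
    power = []
    for sx, sy in s_grid:
        delays = coords @ np.array([sx, sy])      # plane-wave delays across the array (s)
        a = np.exp(-2j * np.pi * freq * delays)   # steering vector for this slowness
        power.append(1.0 / np.linalg.norm(noise.conj().T @ a) ** 2)
    return np.array(power)

# Tiny synthetic check: a 5 Hz plane wave crossing a 6-sensor array with
# slowness (0.3, 0.1) s/km should produce a peak near that grid point.
rng = np.random.default_rng(0)
coords = rng.uniform(-1, 1, (6, 2))
t = np.arange(200) / 100.0
true_delays = coords @ np.array([0.3, 0.1])
X = np.exp(2j * np.pi * 5.0 * (t[None, :] - true_delays[:, None]))
X += 0.1 * (rng.standard_normal(X.shape) + 1j * rng.standard_normal(X.shape))
grid = [(sx, sy) for sx in np.linspace(-0.5, 0.5, 21) for sy in np.linspace(-0.5, 0.5, 21)]
spec = music_spectrum(X, coords, 5.0, grid)
print(grid[int(np.argmax(spec))])
```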

  14. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
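    The calibration-versus-cost trade-off described above can be mimicked with a simple kernel (Parzen-window) regressor standing in for the PNN: the smoothing parameter is calibrated by minimizing a sum of squared errors, and the goodness-of-fit is tracked as the number of calibration points grows. The data set, parameter grid and sample sizes below are invented for illustration.

```python
# Sketch of the cost/benefit idea: calibrate a kernel regressor (a stand-in
# for the PNN) with increasing numbers of training points and watch how the
# validation error improves; the data and parameter grid are illustrative.
import numpy as np

rng = np.random.default_rng(1)
x_all = rng.uniform(0, 10, 2000)
y_all = np.sin(x_all) + rng.normal(0, 0.2, x_all.size)
x_pool, y_pool = x_all[:1500], y_all[:1500]       # candidate calibration points
x_val, y_val = x_all[1500:], y_all[1500:]         # held-out points for goodness-of-fit

def predict(x_train, y_train, x_query, sigma):
    # Parzen-window regression: Gaussian-weighted average of training targets.
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / sigma) ** 2)
    return (w @ y_train) / (w.sum(axis=1) + 1e-12)

def calibrate(x_train, y_train, sigmas=np.logspace(-1, 0.5, 16)):
    # Stand-in for the nonlinear optimizer: choose the smoothing parameter
    # that minimizes the sum of squared errors on the held-out data.
    sse = [np.sum((predict(x_train, y_train, x_val, s) - y_val) ** 2) for s in sigmas]
    best = int(np.argmin(sse))
    return sigmas[best], sse[best]

for n in (50, 200, 800):                          # cost: number of calibration points
    idx = rng.choice(x_pool.size, n, replace=False)
    sigma, sse = calibrate(x_pool[idx], y_pool[idx])
    print(f"n={n:4d}  sigma={sigma:.3f}  validation SSE={sse:.1f}")
```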

  15. High-performance computing in accelerating structure design and analysis

    International Nuclear Information System (INIS)

    Li Zenghai; Folwell, Nathan; Ge Lixin; Guetz, Adam; Ivanov, Valentin; Kowalski, Marc; Lee, Lie-Quan; Ng, Cho-Kuen; Schussman, Greg; Stingelin, Lukas; Uplenchwar, Ravindra; Wolf, Michael; Xiao, Liling; Ko, Kwok

    2006-01-01

    Future high-energy accelerators such as the Next Linear Collider (NLC) will accelerate multi-bunch beams of high current and low emittance to obtain high luminosity, which put stringent requirements on the accelerating structures for efficiency and beam stability. While numerical modeling has been quite standard in accelerator R and D, designing the NLC accelerating structure required a new simulation capability because of the geometric complexity and level of accuracy involved. Under the US DOE Advanced Computing initiatives (first the Grand Challenge and now SciDAC), SLAC has developed a suite of electromagnetic codes based on unstructured grids and utilizing high-performance computing to provide an advanced tool for modeling structures at accuracies and scales previously not possible. This paper will discuss the code development and computational science research (e.g. domain decomposition, scalable eigensolvers, adaptive mesh refinement) that have enabled the large-scale simulations needed for meeting the computational challenges posed by the NLC as well as projects such as the PEP-II and RIA. Numerical results will be presented to show how high-performance computing has made a qualitative improvement in accelerator structure modeling for these accelerators, either at the component level (single cell optimization), or on the scale of an entire structure (beam heating and long-range wakefields)

  16. The analysis of gastric function using computational techniques

    International Nuclear Information System (INIS)

    Young, Paul

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. The results showed that

  17. Heat exchanger performance analysis programs for the personal computer

    International Nuclear Information System (INIS)

    Putman, R.E.

    1992-01-01

    Numerous utility industry heat exchange calculations are repetitive and thus lend themselves to being performed on a Personal Computer. These programs may be regarded as engineering tools which, when put together, can form a Toolbox. However, the practicing Results Engineer in the utility industry desires programs that are not only robust and easy to use but can also be run on both desktop and laptop PCs. The latter also offer the opportunity to take the computer into the plant or control room, and use it there to process test or operating data right on the spot. Most programs evolve through the needs which arise in the course of day-to-day work. This paper describes several of the more useful programs of this type and outlines some of the guidelines to be followed when designing personal computer programs for use by the practicing Results Engineer

  18. Low-frequency computational electromagnetics for antenna analysis

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K. (Los Alamos National Lab., NM (USA)); Burke, G.J. (Lawrence Livermore National Lab., CA (USA))

    1991-01-01

    An overview of low-frequency computational methods for modeling the electromagnetic characteristics of antennas is presented here. The article presents a brief analytical background and summarizes the essential ingredients of the method of moments for numerically solving low-frequency antenna problems. Some extensions to the basic models of perfectly conducting objects in free space are also summarized, followed by a consideration of some of the computational issues that affect model accuracy, efficiency and utility. A variety of representative computations are then presented to illustrate various modeling aspects and capabilities that are currently available. A fairly extensive bibliography is included to suggest further reference material to the reader. 90 refs., 27 figs.
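    In generic operator notation (a textbook summary rather than a detail of the codes discussed in the article), the method of moments converts the governing integral equation L(f) = g for the unknown current distribution f into a dense linear system:
    \[
    f \approx \sum_{n=1}^{N} I_n\, f_n, \qquad Z_{mn} = \langle w_m,\, L(f_n)\rangle, \qquad V_m = \langle w_m,\, g\rangle, \qquad \mathbf{Z}\,\mathbf{I} = \mathbf{V},
    \]
    where the f_n are basis (expansion) functions, the w_m are weighting (testing) functions, and solving for the coefficients I_n gives the antenna currents from which impedance and radiation quantities follow.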

  19. Radiographic test phantom for computed tomographic lung nodule analysis

    International Nuclear Information System (INIS)

    Zerhouni, E.A.

    1987-01-01

    This patent describes a method for evaluating a computed tomographic scan of a nodule in a lung of a human or non-human animal. The method comprises generating a computed tomograph of a transverse section of the animal containing lung and nodule tissue, and generating a second computed tomograph of a test phantom comprising a device which simulates that transverse section. The tissue-simulating portions of the device are constructed of materials whose radiographic densities are substantially identical to those of the corresponding tissues in the simulated transverse section, and they contain voids which simulate, in size and shape, the lung cavities of the transverse section. The voids contain a test reference nodule, constructed of a material of predetermined radiographic density, which simulates in size, shape and position within a lung cavity void of the test phantom the nodule in the transverse section of the animal. The two tomographs are then compared.

  20. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction: The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions: Heavy Ions has been actively analysing data and preparing for conferences. Operations Office: (Figure 6: Transfers from all sites in the last 90 days.) For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...