Sample records for analysis: "cai computer"

  1. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    Handler, B.H. (Oak Ridge K-25 Site, TN (USA)); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. (Oak Ridge Associated Universities, Inc., TN (USA)); Hunnum, W.H. (North Carolina Univ., Chapel Hill, NC (USA)); Smith, D.L. (Memphis State Univ., TN (USA))


    Data Systems Engineering Organization (DSEO) personnel developed a prototype computer aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model is comprised of related papers encompassing research on computer aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevancy to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  2. Computer Assisted Instruction (CAI) in Language Teaching

    Xin; Jing


    There are many ways to use computers for English language teaching. First of all, teachers can use them to prepare for classes. They can use a word processing program to write teaching materials and tests. They can use dictionaries, encyclopedias, etc., available on the computer as resources to help them prepare ...

  3. Curriculum planning and computer-assisted instruction (CAI) within clinical nursing education.

    Perciful, E. G.


    Some experts in nursing and computers have stated that the integration of the computer within nursing education needs to be planned. It has also been declared that there is a need for a body of knowledge that describes the planning and implementing of CAI and the degree of success with the implementation of CAI within nursing education. There is a paucity of literature addressing the planning, implementing, and evaluation of CAI within clinical nursing education. The purpose of this paper is ...

  4. The Effect of the Computer Assisted Instruction (CAI) on Student Attitude in Mathematics Teaching of Primary School 8th Class and Views of Students towards CAI

    Tuğba Hangül


    Full Text Available The aim of this study is to research the effect of the subject of “Geometric Objects”, which is included in the mathematics curriculum at the eighth grade, on student attitude using computer assisted instruction (CAI), and to find out grade 8 primary school students’ views about computer-assisted instruction. In this study a pre-post attitude design with experimental and control groups was performed. The research was done with control and experiment groups consisting of fifty-three eighth grade students who were randomly identified in the 2009-2010 school year. The attitude scale was administered to both groups before and at the end of teaching. The method of constructivism was applied to the control group while CAI was applied to the experiment group. After teaching, fourteen students who were randomly selected from the experimental group were interviewed. Quantitative data were analyzed using Independent Samples t-tests and qualitative data were analyzed by descriptive analysis. At the end of the study, the data put forward that teaching through CAI improves students’ attitudes more positively than the method of constructivism, and that students have positive opinions on CAI.

  5. An investigative study into the effectiveness of using computer-aided instruction (CAI) as a laboratory component of college-level biology: A case study

    Barrett, Joan Beverly

    Community colleges serve the most diverse student populations in higher education. They consist of non-traditional, part-time, older, intermittent, and mobile students of different races, ethnic backgrounds, language preferences, physical and mental abilities, and learning style preferences. Students who are academically challenged may have diverse learning characteristics that are not compatible with the more traditional approaches to the delivery of instruction. With this need come new ways of solving the dilemma, such as Computer-aided Instruction (CAI). This case study investigated the use of CAI as a laboratory component of college-level biology in a small, rural community college setting. The intent was to begin to fill a void that seems to exist in the literature regarding the role of the faculty in the development and use of CAI. In particular, the investigator was seeking to understand the practice and its effectiveness, especially in helping the underprepared student. The case study approach was chosen to examine a specific phenomenon within a single institution. Ethnographic techniques, such as interviewing, documentary analysis, life experiences, and participant observation, were used to collect data about the phenomena being studied. Results showed that the faculty were primarily self-motivated and self-taught in their use of CAI as a teaching and learning tool. The importance of faculty leadership and collegiality was evident. Findings showed the faculty were confident that expectations of helping students who have difficulties with mathematical concepts had been met and that CAI is becoming a most valuable learning tool. In a traditional college classroom, or practice, time is the constant (semesters) and competence is the variable. In the CAI laboratory, time became the variable and competence the constant. The use of CAI also eliminated hazardous chemicals that were routinely used in the more traditional lab. Outcomes showed that annual savings ...

  6. The Effect of the Computer Assisted Instruction (CAI) on Student Attitude in Mathematics Teaching of Primary School 8th Class and Views of Students towards CAI

    Tuğba Hangül; Devrim Uzel


    The aim of this study is to research the effect of the subject of “Geometric Objects” which is included in mathematics curriculum at the eighth grade on the student attitude using computer assisted instruction (CAI) and find out grade 8 primary school students’ views about the computer-assisted instruction. In this study the pre-post attitude with experimental control group design was performed. The research was done under control and experiment groups consisting of fifty-three eighth grade s...




  7.

    Full Text Available Thermal patterns of an area which underwent a polyphase deformation history, such as the Carnic Alps, were analyzed using the Colour Alteration Index (CAI) of conodonts in order to constrain some aspects of the metamorphic history of this part of the Southern Alps. Hercynian and Alpine tectonothermal events were distinguished using CAI analysis. The Hercynian event developed temperatures up to low-grade metamorphic conditions. Alpine tectonogenesis did not produce thermal levels in excess of the diagenetic zone. Moreover, CAI patterns allow recognition and evaluation of a hydrothermal metamorphic overprint of Permo-Triassic or Oligocene age that was superimposed on the pre-existing regional metamorphic zonation.

  8. Personality preference influences medical student use of specific computer-aided instruction (CAI)

    Halsey Martha


    Full Text Available Abstract Background The objective of this study was to test the hypothesis that personality preference, which can be related to learning style, influences individual utilization of CAI applications developed specifically for the undergraduate medical curriculum. Methods Personality preferences of students were obtained using the Myers-Briggs Type Indicator (MBTI) test. CAI utilization for individual students was collected from entry logs for two different web-based applications (a discussion forum and a tutorial) used in the basic science course on human anatomy. Individual login data were sorted by personality preference and the data statistically analyzed by 2-way mixed ANOVA and correlation. Results There was a wide discrepancy in the level and pattern of student use of both CAI applications. Although individual use of both CAI applications was positively correlated irrespective of MBTI preference, students with a "Sensing" preference tended to use both CAI applications more than the "iNtuitives". Differences in the level of use of these CAI applications (i.e., higher use of the discussion forum vs. the tutorial) were also found for the "Perceiving/Judging" dimension. Conclusion We conclude that personality/learning preferences of individual students influence their use of CAI in the medical curriculum.

  9. The Vibrio cholerae quorum-sensing autoinducer CAI-1: analysis of the biosynthetic enzyme CqsA

    Kelly, R.; Bolitho, M; Higgins, D; Lu, W; Ng, W; Jeffrey, P; Rabinowitz, J; Semmelhack, M; Hughson, F; Bassler, B


    Vibrio cholerae, the bacterium that causes the disease cholera, controls virulence factor production and biofilm development in response to two extracellular quorum-sensing molecules, called autoinducers. The strongest autoinducer, called CAI-1 (for cholera autoinducer-1), was previously identified as (S)-3-hydroxytridecan-4-one. Biosynthesis of CAI-1 requires the enzyme CqsA. Here, we determine the CqsA reaction mechanism, identify the CqsA substrates as (S)-2-aminobutyrate and decanoyl coenzyme A, and demonstrate that the product of the reaction is 3-aminotridecan-4-one, dubbed amino-CAI-1. CqsA produces amino-CAI-1 by a pyridoxal phosphate-dependent acyl-CoA transferase reaction. Amino-CAI-1 is converted to CAI-1 in a subsequent step via a CqsA-independent mechanism. Consistent with this, we find cells release ≥100 times more CAI-1 than amino-CAI-1. Nonetheless, V. cholerae responds to amino-CAI-1 as well as CAI-1, whereas other CAI-1 variants do not elicit a quorum-sensing response. Thus, both CAI-1 and amino-CAI-1 have potential as lead molecules in the development of an anticholera treatment.

  10. CAI多媒體教學軟體之開發模式 Using an Instructional Design Model for Developing a Multimedia CAI Courseware

    Hsin-Yih Shyu


    This article outlined a systematic instructional design model for developing multimedia computer-aided instruction (CAI) courseware. The model illustrated roles and tasks as two dimensions necessary in CAI production teamwork. Four major components (Analysis, Design, Development, and Revise/Evaluation), comprising a total of 25 steps, are provided. Eight roles, each with its required competencies, were identified. The model will be useful in serving as a framework for developing a multimedia CAI cours...

  11. CAI多媒體教學軟體之開發模式 Using an Instructional Design Model for Developing a Multimedia CAI Courseware

    Hsin-Yih Shyu


    Full Text Available This article outlined a systematic instructional design model for developing multimedia computer-aided instruction (CAI) courseware. The model illustrated roles and tasks as two dimensions necessary in CAI production teamwork. Four major components (Analysis, Design, Development, and Revise/Evaluation), comprising a total of 25 steps, are provided. Eight roles, each with its required competencies, were identified. The model will be useful in serving as a framework for developing multimedia CAI courseware for educators, instructional designers and CAI industry developers.

  12. In Situ Trace Element Analysis of an Allende Type B1 CAI: EK-459-5-1

    Jeffcoat, C. R.; Kerekgyarto, A.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.


    Variations in refractory major and trace element composition of calcium, aluminum-rich inclusions (CAIs) provide constraints on physical and chemical conditions and processes in the earliest stages of the Solar System. Previous work indicates that CAIs have experienced complex histories involving, in many cases, multiple episodes of condensation, evaporation, and partial melting. We have analyzed major and trace element abundances in two core to rim transects of the melilite mantle as well as interior major phases of a Type B1 CAI (EK-459-5-1) from Allende by electron probe micro-analyzer (EPMA) and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) to investigate the behavior of key trace elements with a primary focus on the REEs Tm and Yb.

  13. 電腦輔助教學與個別教學結合: 電腦輔助教學課堂應用初探 Computer-Assisted Instruction Under the Management of Individualized Instruction: A Classroom Management Approach of CAI

    Sunny S. J. Lin


    Full Text Available This article first reviews the development of Computer-Assisted Instruction (CAI) in Taiwan. The study describes the training of teachers from different levels of schools to design CAI coursewares, and the planning of a CAI courseware bank possessing 2,000 supplemental coursewares. A CAI classroom application system should be carefully established to prevent the easy abuse of a CAI courseware as an instructional plan. The study also claims that steering CAI in our elementary and secondary education could rely on mastery learning as the instructional plan. In this case, CAI must limit its role to formative tests and remedial material only. In higher education, Keller's Personalized System of Instruction could be an effective classroom management system. Therefore, CAI will offer study guides and formative tests only. Using these two instructional systems may enhance students' achievement and speed up the learning rate at the same time. Combining individualized instruction and CAI will be one of the most workable approaches in current classrooms. The author sets up an experiment to verify their effectiveness and efficiency in the near future.

  14. Computer Series, 25.

    Moore, John W., Ed.


    Nine computer programs (available from the authors) are described including graphic display of molecular structures from crystallographic data, computer assisted instruction (CAI) with MATH subroutine, CAI preparation-for-chemistry course, calculation of statistical thermodynamic properties, qualitative analysis program, automated conductimetric…

  15. The Relevance of AI Research to CAI.

    Kearsley, Greg P.

    This article provides a tutorial introduction to Artificial Intelligence (AI) research for those involved in Computer Assisted Instruction (CAI). The general theme is that much of the current work in AI, particularly in the areas of natural language understanding systems, rule induction, programming languages, and Socratic systems, has important…

  16. Natural gas diffusion model and diffusion computation in well Cai25 Bashan Group oil and gas reservoir


    Natural gas diffusion through the cap rock is mainly by means of dissolving in water, so its concentration can be replaced by solubility, which varies with temperature, pressure and salinity in strata. Under certain geological conditions the maximal solubility is definite, so the diffusion computation can be handled approximately by a stable state equation. Furthermore, on the basis of the restoration of the paleo-burial history, the diffusion is calculated with the dynamic method, and the result is very close to the real diffusion value in the geological history.
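The stable-state treatment this abstract describes amounts to Fick's first law with gas solubility in formation water taken as the boundary concentration. A minimal sketch, with a hypothetical function name and illustrative parameter values (not taken from the paper):

```python
def steady_state_diffusive_loss(D, c_top, c_bottom, thickness, area, time):
    """Cumulative diffusive loss of gas through a cap rock at steady state.

    Fick's first law: flux J = D * (c_bottom - c_top) / thickness, where the
    concentrations are replaced by gas solubilities in formation water.
    """
    J = D * (c_bottom - c_top) / thickness  # amount per unit area per unit time
    return J * area * time
```

Under the dynamic method mentioned above, D and the solubilities would instead vary along the reconstructed burial history, and the loss would be integrated over time rather than computed from a single steady state.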

  17. Natural gas diffusion model and diffusion computation in well Cai25 Bashan Group oil and gas reservoir

    FANG, Dequan



  18. Maxi CAI with a Micro.

    Gerhold, George; And Others

    This paper describes an effective microprocessor-based CAI system which has been repeatedly tested by a large number of students and edited accordingly. Tasks not suitable for microprocessor based systems (authoring, testing, and debugging) were handled on larger multi-terminal systems. This approach requires that the CAI language used on the…

  19. A study on VR-based mutual adaptive CAI system for nuclear power plant

    A novel framework of human-computer interaction for a computer aided instruction (CAI) system is presented, which aims at introducing a new off-the-job training environment for mastering nuclear power plant monitoring skills in a more user-friendly manner than at present. The framework is based on the following two new ideas: one is the mutual adaptive interface (MADI) concept, and the other is virtual reality (VR). In order to realize a hardware mechanism for a mutual adaptive interface based on VR, a new head-mounted display (HMD) was developed which can not only provide the user with a virtual environment conventionally but also detect images of the user's eyes for in-situ analysis of various ocular information. This information is expected to be utilized for realizing advanced human-computer interaction in the CAI system.

  20. Computational movement analysis

    Laube, Patrick


    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  1. Computer Programming Job Analysis

    Debdulal Dutta Roy


    This study investigated relative uses of computer programming job characteristics across different organizations and the effects of different demographic variables on job analysis ratings. Data were collected from 201 computer programmers of 6 different organizations through a checklist. Principal component analysis identified four most-used job characteristics: program writing and testing, human relations, data analysis and user satisfaction. Of them only data analysis differed among different organ...

  2. Predicting low velocity impact damage and Compression-After-Impact (CAI) behaviour of composite laminates

    Tan, Wei; Falzon, Brian G.; Chiu, Louis N S; Price, Mark


    Low-velocity impact damage can drastically reduce the residual strength of a composite structure even when the damage is barely visible. The ability to computationally predict the extent of damage and compression-after-impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant time and cost penalties. A high-fidelity three-dimensional composite damage model, to predict both low-velocity impact damage and CAI st...

  3. Computational Music Analysis

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today in this intensely interdisciplinary field. ... well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering ... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.

  4. Computer aided safety analysis

    The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs

  5. CAIs in Semarkona (LL3.0)

    Mishra, R. K.; Simon, J. I.; Ross, D. K.; Marhas, K. K.


    Calcium, Aluminum-rich inclusions (CAIs) are the first forming solids of the Solar system. Their observed abundance, mean size, and mineralogy vary quite significantly between different groups of chondrites. These differences may reflect the dynamics and distinct cosmochemical conditions present in the region(s) of the protoplanetary disk from which each type likely accreted. Only about 11 such objects have been found in L and LL types, while another 57 have been found in H type ordinary chondrites, compared to thousands in carbonaceous chondrites. At issue is whether the rare CAIs contained in ordinary chondrites truly reflect a distinct population from the inclusions commonly found in other chondrite types. Semarkona (LL3.00) (fall, 691 g) is the most pristine chondrite available in our meteorite collection. Here we report the petrography and mineralogy of 3 CAIs from Semarkona.

  6. Computational Analysis of Behavior.

    Egnor, S E Roian; Branson, Kristin


    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with. PMID:27090952

  7. Computational Music Analysis


    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today in this intensely interdisciplinary field. A broad range of approaches are presented, employing techniques originating in disciplines such as linguistics, information theory, information retrieval, pattern r...

  8. A Unified Framework for Producing CAI Melting, Wark-Lovering Rims and Bowl-Shaped CAIs

    Liffman, Kurt; Paterson, David A


    Calcium Aluminium Inclusions (CAIs) formed in the Solar System some 4,567 million years ago. CAIs are almost always surrounded by Wark-Lovering Rims (WLRs), which are a sequence of thin, mono/bi-mineralic layers of refractory minerals, with a total thickness in the range of 1 to 100 microns. Recently, some CAIs have been found that have tektite-like bowl shapes. To form such shapes, the CAI must have travelled through a rarefied gas at hypersonic speeds. We show how CAIs may have been ejected from the inner solar accretion disc via the centrifugal interaction between the solar magnetosphere and the inner disc rim. They subsequently punched through the hot, inner disc rim wall at hypersonic speeds. This re-entry heating partially or completely evaporated the CAIs. Such evaporation could have significantly increased the metal abundances of the inner disc rim. High speed movement through the inner disc produced WLRs. To match the observed thickness of WLRs required metal abundances at the inner disc wall that a...

  9. Shielding Benchmark Computational Analysis

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.


    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC).

  10. A Study on Application of CAI Dynamic Image-Guided Method in College Physical Education Technical Course

    Baokui Wang


    In this study, we examine the application of the CAI dynamic image-guided method in a college physical education technical course. In college physical education teaching, the Computer-Assisted Instruction (CAI) dynamic image-guided method is employed to build the sport image diagnosis and implement a 2-way feedback mechanism. This helps the students to create or modify the sport image, and strengthens the concept of action so as to set up the correct technical dynamic stereotype. The practice o...

  11. A Pseudo-Language for Creating CAI Programs on APL Systems

    Gucker, Edward J.


    Encourages the use of APL as a language for computer assisted instruction (CAI) instead of such languages as BASIC or COURSEWRITER. Describes a set of APL functions that can simulate to some extent the features of COURSEWRITER, while permitting a more experienced course author to use the full mathematical power of APL. (Author/JF)

  12. Using CAI To Enhance the Peer Acceptance of Mainstreamed Students with Mild Disabilities.

    Culliver, Concetta; Obi, Sunday

    This study applied computer-assisted instruction (CAI) techniques to improve peer acceptance among 92 mainstreamed students with mild disabilities from 10 to 13 years of age. Participants in the treatment group received their generalized curriculum program (including mathematics, language arts, reading, health, social studies, and science)…

  13. Web Pages: An Effective Method of Providing CAI Resource Material in Histology.

    McLean, Michelle


    Presents research that introduces computer-aided instruction (CAI) resource material as an integral part of the second-year histology course at the University of Natal Medical School. Describes the ease with which this software can be developed, using limited resources and available skills, while providing students with valuable learning…

  14. Why igneous wollastonite is so rare in CAIs

    Beckett, J. R.; Thrane, K.; Krot, A. N.


    Primary wollastonite (wo) thought to have crystallized from a liquid is quite rare in CAIs, having been reported in only two igneous inclusions, White Angel and KT-1 [1, 2]. Both of these CAIs exhibit significant mass fractionations in multiple elements and KT-1 is a FUN inclusion, so it is highly desirable to place as many constraints as possible on their formation. Since phase diagrams previously developed for CAIs do not involve wo [3], we use literature data on wo-satura...

  15. E-CAI: a novel server to estimate an expected value of Codon Adaptation Index (eCAI)

    Garcia-Vallvé Santiago


    Full Text Available Abstract Background The Codon Adaptation Index (CAI) is a measure of the synonymous codon usage bias for a DNA or RNA sequence. It quantifies the similarity between the synonymous codon usage of a gene and the synonymous codon frequency of a reference set. Extreme values in the nucleotide or in the amino acid composition have a large impact on differential preference for synonymous codons. It is therefore essential to define the limits for the expected value of CAI on the basis of sequence composition in order to properly interpret the CAI and provide statistical support to CAI analyses. Though several freely available programs calculate the CAI for a given DNA sequence, none of them corrects for compositional biases or provides confidence intervals for CAI values. Results The E-CAI server, available at, is a web application that calculates an expected value of CAI for a set of query sequences by generating random sequences with G+C and amino acid content similar to those of the input. An executable file, a tutorial, a Frequently Asked Questions (FAQ) section and several examples are also available. To exemplify the use of the E-CAI server, we have analysed the codon adaptation of human mitochondrial genes that encode a subunit of the mitochondrial respiratory chain (excluding those genes that lack a prokaryotic orthologue and are encoded in the nuclear genome). It is assumed that these genes were transferred from the proto-mitochondrial to the nuclear genome and that their codon usage was then ameliorated. Conclusion The E-CAI server provides a direct threshold value for discerning whether the differences in CAI are statistically significant or whether they are merely artifacts that arise from internal biases in the G+C composition and/or amino acid composition of the query sequences.
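The CAI defined in this record is the geometric mean, over a gene's codons, of each codon's relative adaptiveness w (its frequency in a reference set divided by the frequency of the most-used synonymous codon). A minimal sketch of that computation; the codon families and reference counts below are toy values for illustration, not data from the E-CAI paper:

```python
import math

# Toy reference data: a real analysis derives counts from a reference set
# of highly expressed genes, over all synonymous families.
SYNONYMS = {
    "Phe": ["TTT", "TTC"],
    "Lys": ["AAA", "AAG"],
    "Asn": ["AAT", "AAC"],
}
REF_COUNTS = {"TTT": 80, "TTC": 20, "AAA": 70, "AAG": 30, "AAT": 40, "AAC": 60}

def relative_adaptiveness(ref_counts, synonyms):
    """w(codon) = count(codon) / max count within its synonymous family."""
    w = {}
    for family in synonyms.values():
        top = max(ref_counts[c] for c in family)
        for c in family:
            w[c] = ref_counts[c] / top
    return w

def cai(sequence, w):
    """CAI = geometric mean of w over the gene's codons (known codons only)."""
    codons = (sequence[i:i + 3] for i in range(0, len(sequence) - 2, 3))
    scores = [w[c] for c in codons if c in w]
    return math.exp(sum(math.log(s) for s in scores) / len(scores))
```

With this toy reference, cai("TTTAAAAAC", relative_adaptiveness(REF_COUNTS, SYNONYMS)) is 1.0, since every codon is its family's most frequent; the E-CAI server's contribution is estimating what value of this statistic to expect by chance given the sequence's G+C and amino acid composition.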

  16. Analysis of computer programming languages

    This research thesis aims at trying to identify some methods of syntax analysis which can be used for computer programming languages while putting aside computer devices which influence the choice of the programming language and methods of analysis and compilation. In a first part, the author proposes attempts of formalization of Chomsky grammar languages. In a second part, he studies analytical grammars, and then studies a compiler or analytic grammar for the Fortran language

  17. Effective Computer Aided Instruction in Biomedical Science

    Hause, Lawrence L.


    A menu-driven Computer Aided Instruction (CAI) package was integrated with word processing and effectively applied in five curricula at the Medical College of Wisconsin. Integration with word processing facilitates the ease of CAI development by instructors and was found to be the most important step in the development of CAI. CAI modules were developed and are currently used to reinforce lectures in medical pathology, laboratory quality control, computer programming and basic science reviews...

  18. Analysis of computer networks

    Gebali, Fayez


    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, the switching equipment connecting the links together, and the software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks, and switch architectures and buffering strategies. The book provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for ana...

  19. Affective Computing and Sentiment Analysis

    Ahmad, Khurshid


    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect, including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affective computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in the analysis of news.

  20. Development of an intelligent CAI system for a distributed processing environment

    In order to operate a nuclear power plant optimally in both normal and abnormal situations, the operators are trained using an operator training simulator in addition to classroom instruction. Individual instruction using a CAI (Computer-Assisted Instruction) system has become popular as a method of learning plant information, such as plant dynamics, operational procedures, plant systems, and plant facilities. An outline is given of a proposed network-based intelligent CAI system (ICAI) incorporating multimedia PWR plant dynamics simulation, teaching aids and educational record management, using the following environment: existing standard workstations and graphics workstations with a live video processing function, the TCP/IP protocol of Unix over Ethernet, and the X Window System. (Z.S.) 3 figs., 2 refs

  1. Research on the Use of Computer-Assisted Instruction.

    Craft, C. O.


    Reviews recent research studies related to computer assisted instruction (CAI). The studies concerned program effectiveness, teaching of psychomotor skills, tool availability, and factors affecting the adoption of CAI. (CT)

  2. Computer vision in microstructural analysis

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.


    The following is a laboratory experiment designed to be performed by advanced high-school and beginning college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by a description of the system developed at Texas A&M University. This in turn will be followed by a description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.

  3. The product mix of the Pekařství Cais bakery

    NOVÁKOVÁ, Iveta


    The aim of my thesis was to describe the product mix of a chosen company. I chose the bakery of Vladimír Cais in Vlachovo Březí for this work. Another aim was to analyze the product portfolio by means of the Boston Matrix and to propose possible modifications of the product portfolio based on the results. A SWOT analysis and a product life-cycle analysis were also compiled within the analytical part.

  4. Computer aided safety analysis 1989

    The meeting was conducted in a workshop style, to encourage involvement of all participants during the discussions. Forty-five (45) experts from 19 countries, plus 22 experts from the GDR participated in the meeting. A list of participants can be found at the end of this volume. Forty-two (42) papers were presented and discussed during the meeting. Additionally an open discussion was held on the possible directions of the IAEA programme on Computer Aided Safety Analysis. A summary of the conclusions of these discussions is presented in the publication. The remainder of this proceedings volume comprises the transcript of selected technical papers (22) presented in the meeting. It is the intention of the IAEA that the publication of these proceedings will extend the benefits of the discussions held during the meeting to a larger audience throughout the world. The Technical Committee/Workshop on Computer Aided Safety Analysis was organized by the IAEA in cooperation with the National Board for Safety and Radiological Protection (SAAS) of the German Democratic Republic in Berlin. The purpose of the meeting was to provide an opportunity for discussions on experiences in the use of computer codes used for safety analysis of nuclear power plants. In particular it was intended to provide a forum for exchange of information among experts using computer codes for safety analysis under the Technical Cooperation Programme on Safety of WWER Type Reactors (RER/9/004) and other experts throughout the world. A separate abstract was prepared for each of the 22 selected papers. Refs, figs tabs and pictures

  5. Computational analysis of cerebral cortex

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)


    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  6. Computational system for geostatistical analysis

    Vendrusculo Laurimar Gonçalves


    Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, average, cross- and directional semivariograms, simple kriging estimates, and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process, avoiding the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, functionality rarely available in similar programs. Given its characteristic of rapid prototyping and simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
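The semivariograms such a program computes are variations on the classical Matheron estimator, gamma(h) = sum over pairs of (z_i - z_j)^2 / (2 N(h)). A minimal omnidirectional sketch (function name and lag-tolerance binning are illustrative assumptions, not the cited program's interface):

```python
import math

def experimental_semivariogram(points, values, lag, tol):
    """Classical (Matheron) estimator over 2-D point pairs whose
    separation distance falls within lag +/- tol."""
    total, n = 0.0, 0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            h = math.hypot(points[i][0] - points[j][0],
                           points[i][1] - points[j][1])
            if abs(h - lag) <= tol:
                total += (values[i] - values[j]) ** 2
                n += 1
    # Undefined when no pair falls in the lag bin
    return total / (2 * n) if n else float("nan")
```

Evaluating this over a sequence of lags gives the experimental curve to which a model (spherical, exponential, etc.) is then fitted before kriging.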

  7. Forensic Analysis of Compromised Computers

    Wolfe, Thomas


    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
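The PERL script itself is not reproduced in the record, but the collection step it describes (walk a tree to a user-chosen depth, record per-file ownership and timestamps, write a delimited text file for spreadsheet import) can be sketched as follows. This is a hypothetical Python equivalent; the field list and function name are assumptions based on the description above.

```python
import csv
import os
import time

def collect_tree_info(root, out_path, max_depth):
    """Walk `root` down to `max_depth` levels, writing one tab-separated
    row per file: path, size, owner uid, change time, last access time.
    Directories themselves are not recorded, matching the description."""
    root = os.path.abspath(root)
    base_depth = root.count(os.sep)
    with open(out_path, "w", newline="") as fh:
        out = csv.writer(fh, delimiter="\t")
        out.writerow(["path", "size", "uid", "ctime", "atime"])
        for dirpath, dirnames, filenames in os.walk(root):
            if dirpath.count(os.sep) - base_depth >= max_depth:
                dirnames[:] = []  # prune: stop descending past the depth limit
                continue
            for name in filenames:
                p = os.path.join(dirpath, name)
                st = os.lstat(p)  # lstat: don't follow symlinks
                out.writerow([p, st.st_size, st.st_uid,
                              time.ctime(st.st_ctime),
                              time.ctime(st.st_atime)])
```

The tab-separated output loads directly into a spreadsheet, where the sorting by owner or timestamp described in the abstract is performed.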

  8. Computability and Analysis, a Historical Approach

    Brattka, Vasco


    The history of computability theory and the history of analysis have been surprisingly intertwined since the beginning of the twentieth century. For one, Émile Borel discussed his ideas on computable real number functions in his introduction to measure theory. On the other hand, Alan Turing had computable real numbers in mind when he introduced his now famous machine model. Here we want to focus on a particular aspect of computability and analysis, namely on computability properties of theorem...

  9. A Petaflops Era Computing Analysis

    Preston, Frank S.


    This report covers a study of the potential for petaflops (10(exp 15) floating point operations per second) computing. This study was performed within the year 1996 and should be considered as the first step in an on-going effort. The analysis concludes that a petaflop system is technically feasible, but not with today's state of the art. Since the computer arena is now a commodity business, most experts expect that a petaflops system will evolve from current technology in an evolutionary fashion. To meet the price expectations of users waiting for petaflop performance, great improvements in lowering component costs will be required. Lower power consumption is also a must. The present rate of progress in improved performance places the date of introduction of petaflop systems at about 2010. Several years before that date, it is projected that chip feature sizes will reach the currently known resolution limit. Aside from the economic problems and constraints, software is identified as the major problem. The tone of this initial study is more pessimistic than most of the published material available on petaflop systems. Workers in the field are expected to generate more data which could serve to provide a basis for a more informed projection. This report includes an annotated bibliography.

  10. Personal Computer Transport Analysis Program

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter


    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components, and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
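Ordering components "by their inlet dependency on other components" is a topological sort of the flow network. The sketch below shows one standard way to build such a solution vector (Kahn's algorithm); the data structures are generic assumptions, since PCTAP's internal representation is not given in the record.

```python
from collections import deque

def build_solution_vector(components, inlet_of):
    """Order components so each appears after every component feeding
    its inlet. `inlet_of[c]` lists the upstream components of c;
    components absent from the map have no inlet dependencies."""
    indegree = {c: len(inlet_of.get(c, [])) for c in components}
    downstream = {c: [] for c in components}
    for c, ups in inlet_of.items():
        for u in ups:
            downstream[u].append(c)
    queue = deque(c for c in components if indegree[c] == 0)
    order = []
    while queue:
        c = queue.popleft()
        order.append(c)
        for d in downstream[c]:
            indegree[d] -= 1
            if indegree[d] == 0:
                queue.append(d)
    if len(order) != len(components):
        raise ValueError("cyclic flow network: no pure inlet ordering exists")
    return order
```

A recirculating loop would make the graph cyclic, so a real solver would need to break loops with tear variables rather than rely on a pure inlet ordering.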

  11. Phenotypic diversity and correlation between white-opaque switching and the CAI microsatellite locus in Candida albicans.

    Hu, Jian; Guan, Guobo; Dai, Yu; Tao, Li; Zhang, Jianzhong; Li, Houmin; Huang, Guanghua


    Candida albicans is a commensal fungal pathogen that is often found as part of the human microbial flora. The aim of the present study was to establish a relationship between diverse genotypes and phenotypes of clinical isolates of C. albicans. A total of 231 clinical isolates were collected and used for genotyping and phenotypic switching analysis. Based on the microsatellite locus (CAI) genotyping assay, 65 different genotypes were identified, and some dominant types were found in certain human niches. For example, the genotypes 30-44 and 30-45 were enriched in vaginal infection samples. C. albicans has a number of morphological forms, including single-celled yeasts, multicellular filaments, and white and opaque cell types. The relationship between the CAI genotype and the ability to undergo phenotypic switching was examined in the clinical isolates. We found that strains with longer CAA/G repeats in both alleles of the CAI locus were more opaque competent. We also discovered that some MTL heterozygous (a/alpha) isolates could undergo white-opaque switching when grown on regular culture medium (containing glucose as the sole carbon source). Our study establishes a link between phenotypic switching and genotypes of the CAI microsatellite locus in clinical isolates of C. albicans. PMID:26832141

  12. Relationship between Pre-Service Music Teachers' Personality and Motivation for Computer-Assisted Instruction

    Perkmen, Serkan; Cevik, Beste


    The main purpose of this study was to examine the relationship between pre-service music teachers' personalities and their motivation for computer-assisted music instruction (CAI). The "Big Five" Model of Personality served as the framework. Participants were 83 pre-service music teachers in Turkey. Correlation analysis revealed that three…

  13. Numerical Analysis of Multiscale Computations

    Engquist, Björn; Tsai, Yen-Hsi R


    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  14. Interactive computer programs in sequence data analysis.

    Jagadeeswaran, P; McGuire, P M


    We present interactive computer programs for the analysis of nucleic acid sequences. In order to handle these programs, minimum computer experience is sufficient. The nucleotide sequence of the human gamma globin gene complex is used as an example to illustrate the data analysis.

  15. Silicon Isotopic Fractionation of CAI-like Vacuum Evaporation Residues

    Knight, K; Kita, N; Mendybaev, R; Richter, F; Davis, A; Valley, J


    Calcium-, aluminum-rich inclusions (CAIs) are often enriched in the heavy isotopes of magnesium and silicon relative to bulk solar system materials. It is likely that these isotopic enrichments resulted from evaporative mass loss of magnesium and silicon from early solar system condensates while they were molten during one or more high-temperature reheating events. Quantitative interpretation of these enrichments requires laboratory determinations of the evaporation kinetics and associated isotopic fractionation effects for these elements. The experimental data for the kinetics of evaporation of magnesium and silicon and the evaporative isotopic fractionation of magnesium are reasonably complete for Type B CAI liquids (Richter et al., 2002, 2007a). However, the isotopic fractionation factor for silicon evaporating from such liquids has not been as extensively studied. Here we report new ion microprobe silicon isotopic measurements of residual glass from partial evaporation of Type B CAI liquids into vacuum. The silicon isotopic fractionation is reported as a kinetic fractionation factor, α_Si, corresponding to the ratio of the silicon isotopic composition of the evaporation flux to that of the residual silicate liquid. For CAI-like melts, we find that α_Si = 0.98985 ± 0.00044 (2σ) for 29Si/28Si, with no resolvable variation with temperature over the temperature range of the experiments, 1600-1900 C. This value is different from what has been reported for evaporation of liquid Mg2SiO4 (Davis et al., 1990) and of a melt with CI chondritic proportions of the major elements (Wang et al., 2001). There appears to be some compositional control on α_Si, whereas no compositional effects have been reported for α_Mg. We use the values of α_Si and α_Mg to calculate the chemical compositions of the unevaporated precursors of a number of isotopically fractionated CAIs from CV chondrites whose

  16. Computational methods for global/local analysis

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.


    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  17. Adjustment computations: spatial data analysis

    Ghilani, Charles D


    The complete guide to adjusting for measurement error, expanded and updated. No measurement is ever exact. Adjustment Computations updates a classic, definitive text on surveying with the latest methodologies and tools for analyzing and adjusting errors, with a focus on least squares adjustment, the most rigorous methodology available and the one on which accuracy standards for surveys are based. This extensively updated Fifth Edition shares new information on advances in modern software and GNSS-acquired data. Expanded sections offer a greater number of computable problems and their worked solu...

  18. The ethnoecology of Caiçara metapopulations (Atlantic Forest, Brazil): ecological concepts and questions

    Begossi Alpina


    The Atlantic Forest is represented on the coast of Brazil by approximately 7.5% of remnants, much of it concentrated on the country's SE coast. Within these southeastern remnants, we still find the coastal Caiçaras, who descend from Native Indians and Portuguese colonizers. The maintenance of such populations, and their existence in spite of the deforestation that occurred on the Atlantic Forest coast, deserves special attention and analysis. In this study, I address, in particula...

  19. Impact analysis on a massively parallel computer

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  20. Applied time series analysis and innovative computing

    Ao, Sio-Iong


    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  1. Computational methods in power system analysis

    Idema, Reijer


    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  2. The Intelligent CAI System for Chemistry Based on Automated Reasoning

    WANG Xiaojing; ZHANG Jingzhong


    A new type of intelligent CAI system for chemistry is developed in this paper based on automated reasoning with chemistry knowledge. The system has shown its ability to solve chemistry problems, and to assist students and teachers in studies and instruction with the automated reasoning functions. Its open mode of the knowledge base and its unique style of the interface between the system and human provide more opportunities for the users to acquire living knowledge through active participation. The automated reasoning based on basic chemistry knowledge also opened a new approach to the information storage and management of the ICAI system for sciences.

  3. Distributed computing and nuclear reactor analysis

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  4. Automating sensitivity analysis of computer models using computer calculus

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
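GRESS works by transforming FORTRAN source, but the underlying idea of "computer calculus" (propagating exact derivatives alongside the original arithmetic rather than approximating them by finite differences) can be illustrated with a minimal forward-mode sketch. This is an illustration of the principle only, not GRESS's actual mechanism, and the class and function names are invented for the example.

```python
class Dual:
    """Forward-mode automatic-differentiation value: carries a quantity v
    together with its derivative d with respect to one chosen input."""

    def __init__(self, v, d=0.0):
        self.v, self.d = v, d

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.v + o.v, self.d + o.d)

    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.v * o.v, self.v * o.d + self.d * o.v)

    __rmul__ = __mul__

def sensitivity(f, x):
    """Return (f(x), df/dx at x), computed in a single exact pass."""
    out = f(Dual(x, 1.0))
    return out.v, out.d
```

Seeding the input's derivative slot with 1.0 and running the unchanged model code yields the sensitivity without any step-size tuning, which is the same advantage the abstract attributes to derivative-generating compilation.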

  5. Automating sensitivity analysis of computer models using computer calculus

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with ''direct'' and ''adjoint'' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  6. Computer analysis of ESR spectra

    Isotropic ESR spectra often display complicated patterns which are difficult to analyze for their hyperfine splitting constants (HSC). To simplify the analysis, we have written a program, suitable for PCs, for iteratively simulating isotropic ESR spectra and determining the simulation which best fits the experimental spectra. Chapter one gives a brief introduction to the theory of electron spin resonance (ESR). In chapter two the main concepts of the program are presented. Autosimulate is the main algorithm. It calculates the entire field of valid simulations to ensure that the solution set contains all parameter combinations which produce satisfactory spectra. Autosimulate requires prior knowledge of the HSCs and other parameters needed for the simulation, such as the line width, the spectrum width, and the number of magnetic nuclei. Proton Coupling Constant Extraction (PCCE) and autocorrelation are two methods complementing each other to determine the HSCs. Another iterative method, based on a systematic application of the Monte Carlo method, can be applied to generate more accurate values of the line width. In chapter three, the spectra of naphthalene, tetracene, indigo, ox-indigo semiquinone, thioindigo and 2,2'-dipyridyl-Na complex free radicals are analyzed. The results are compared to the literature values; good agreement is obtained for different resolutions and noise-to-signal ratios. In the last chapter a printout of the program is presented. The programming language used is Microsoft QuickBASIC version 7.1

  7. Computer aided nonlinear electrical networks analysis

    Slapnicar, P.


    Techniques used in simulating an electrical circuit with nonlinear elements for use in computer-aided circuit analysis programs are described. Elements of the circuit include capacitors, resistors, inductors, transistors, diodes, and voltage and current sources (constant or time varying). Simulation features are discussed for dc, ac, and/or transient circuit analysis. Calculations are based on the model approach of formulating the circuit equations. A particular solution of transient analysis for nonlinear storage elements is described.

  8. Computer graphics in reactor safety analysis

    This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the modelling of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. The graphics output used in actual safety analysis are used to illustrate the capabilities of each code. 5 refs., 10 figs

  9. Interactive computer analysis of nuclear backscattering spectra

    A review is made of a computer-based interactive nuclear backscattering analysis system. Users without computer experience can develop moderate competence with the system after only brief instruction because of the menu-driven organization. Publishable-quality figures can be obtained without any computer expertise. Among the quantities which can be displayed over the data are depth scales for any element, element identification, relative concentrations, and theoretical spectra. Captions and titling can be made from a selection of 30 font styles. Lettering is put on the graphs under joystick control such that placement is exact without needing complicated commands. (orig.)

  10. Computer Language Efficiency via Data Envelopment Analysis

    Andrea Ellero


    The selection of the computer language to adopt is usually driven by intuition and expertise, since it is very difficult to compare languages taking into account all their characteristics. In this paper, we analyze the efficiency of programming languages through Data Envelopment Analysis. We collected the input data from The Computer Language Benchmarks Game: we consider a large set of languages in terms of computational time, memory usage, and source code size. Various benchmark problems are tackled. We analyze the results, first of all considering programming languages individually; then, we evaluate families of them sharing some characteristics, for example, being compiled or interpreted.

  11. Computational structural analysis and finite element methods

    Kaveh, A


    Graph theory gained initial prominence in science and engineering through its strong links with matrix algebra and computer science. Moreover, the structure of the mathematics is well suited to that of engineering problems in analysis and design. The methods of analysis in this book employ matrix algebra, graph theory and meta-heuristic algorithms, which are ideally suited for modern computational mechanics. Efficient methods are presented that lead to highly sparse and banded structural matrices. The main features of the book include: application of graph theory for efficient analysis; extension of the force method to finite element analysis; application of meta-heuristic algorithms to ordering and decomposition (sparse matrix technology); efficient use of symmetry and regularity in the force method; and simultaneous analysis and design of structures.

  12. CAI in New York City: Report on the First Year's Operations

    Butler, Cornelius F.


    "The nation's largest CAI operation in a public school system concluded its first full year of operation in June, 1969. The results indicate a very definite success for education's most closely watched use of technology. Three major criteria for success of such a project are 1) acceptance of CAI by the schools and their pupils, 2) per pupil costs…

  13. Structural basis of Na+-independent and cooperative substrate/product antiport in CaiT

    Schulze, Sabrina; Köster, Stefan; Geldmacher, Ulrike; Terwisscha van Scheltinga, Anke C.; Kühlbrandt, Werner


    Transport of solutes across biological membranes is performed by specialized secondary transport proteins in the lipid bilayer, and is essential for life. Here we report the structures of the sodium-independent carnitine/butyrobetaine antiporter CaiT from Proteus mirabilis (PmCaiT) at 2.3-Å and from

  14. Brief Introduction to the Foundation of CAI Shidong Award for Plasma Physics

    SHENG Zhengming


    The late Academician Professor CAI Shidong was an outstanding plasma physicist who made seminal contributions to both fundamental plasma theory and controlled thermonuclear fusion energy research. Professor CAI was also one of the pioneers of China's plasma physics research. In 1973, Professor CAI decided to leave the U.S. and return to China in order to help push forward plasma physics research there. He formed a research group of young scientists and carried out high-level work in this important physics discipline. He worked tirelessly, led by example, and made outstanding contributions to plasma physics research, to educating younger generations of plasma physicists, and to establishing collaborations with plasma scientists in other Asian and African developing nations. In short, Professor CAI devoted the best years of his life to China's plasma physics research.

  15. Calcium-aluminum-rich inclusions with fractionation and unknown nuclear effects (FUN CAIs)

    Krot, Alexander N.; Nagashima, Kazuhide; Wasserburg, Gerald J.;


    We present a detailed characterization of the mineralogy, petrology, and oxygen isotopic compositions of twelve FUN CAIs, including C1 and EK1-4-1 from Allende (CV), that were previously shown to have large isotopic fractionation patterns for magnesium and oxygen, and large isotopic anomalies of… several elements. The other samples show more modest patterns of isotopic fractionation and have smaller but significant isotopic anomalies. All FUN CAIs studied are coarse-grained igneous inclusions: Type B, forsterite-bearing Type B, compact Type A, and hibonite-rich. Some inclusions consist of two… mineralogically distinct lithologies, forsterite-rich and forsterite-free/poor. All the CV FUN CAIs experienced postcrystallization open-system iron-alkali-halogen metasomatic alteration resulting in the formation of secondary minerals commonly observed in non-FUN CAIs from CV chondrites. The CR FUN CAI GG#3…

  16. Safety analysis of control rod drive computers

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety-relevant tasks. The experience gained with the control rod positioning processor confirms that computers are no less reliable than conventional instrumentation and control systems for comparable tasks. The examination and evaluation of computers for safety-relevant tasks can be done with programme analysis or statistical evaluation of the operating experience. Programme analysis is recommended for seldom-used and well-structured programmes. For programmes with a long cumulated operating time, a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process-controlling computers or microprocessors can be qualified for safety-relevant tasks without undue effort. (orig./HP)
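The statistical evaluation of operating experience mentioned above can be sketched with the standard zero-failure confidence bound (our illustration of the general technique, not necessarily the report's exact method): if no failures were observed over T cumulated years, the exponential/Poisson model gives an upper confidence limit on the constant failure rate.

```python
import math

def failure_rate_upper_bound(T_years, confidence=0.95):
    """One-sided upper confidence bound on a constant failure rate when
    zero failures were observed over T_years of cumulated operation
    (exponential/Poisson model: lambda_up = -ln(1 - confidence) / T)."""
    return -math.log(1.0 - confidence) / T_years

# ~82 cumulated operating years, no operating-system failures observed
lam = failure_rate_upper_bound(82.0)
print(f"lambda <= {lam:.4f} per year at 95% confidence")
```

Doubling the observed failure-free exposure halves the bound, which is why long cumulated operating times make the statistical route attractive.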

  17. Computation of Regularized Linear Discriminant Analysis

    Kalina, Jan; Valenta, Zdeněk; Duintjer Tebbens, Jurjen

    ISI, 2014. s. 8-8. [COMPSTAT 2014. International Conference on Computational Statistics /21./. 19.08.2014-22.08.2014, Geneva] Institutional support: RVO:67985807 Keywords : classification analysis * regularization * Matrix decomposition * shrinkage eigenvalues * high-dimensional data Subject RIV: BB - Applied Statistics, Operational Research

  18. Computation for the analysis of designed experiments

    Heiberger, Richard


    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis-of-variance (ANOVA) programs. Includes a disk at the back of the book that contains all program code in four languages: APL, BASIC, C, and FORTRAN. Presents illustrations of the dual-space geometry for all designs, including confounded designs.
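The core calculation such ANOVA programs automate can be shown in a few lines (the book's own code is in APL, BASIC, C, and FORTRAN; this is our minimal one-way sketch):

```python
# One-way ANOVA: partition variability into between-group and
# within-group sums of squares and form the F ratio.
def one_way_anova(groups):
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / N
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, N - k
    F = (ss_between / df_b) / (ss_within / df_w)
    return F, df_b, df_w

# three treatment groups of three observations each (made-up data)
F, df_b, df_w = one_way_anova([[6.0, 8.0, 4.0], [5.0, 7.0, 9.0], [11.0, 13.0, 12.0]])
print(F, df_b, df_w)
```

A designed-experiments package adds the bookkeeping for factors, blocking, and confounding on top of exactly this decomposition.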

  19. Risk analysis enhancement via computer applications

    Since the development of Reliability Centered Maintenance (RCM) by the airline industry, there have been various alternative approaches to applying this methodology to the nuclear power industry. Some of the alternatives were developed in order to shift the focus of analyses onto plant-specific concerns, but the great majority were developed in an attempt to reduce the effort required to conduct an RCM analysis on as large a scale as a nuclear power station. Computer applications have not only reduced the amount of analysis time but have also produced more consistent results, provided an effective working RCM analysis tool, and made it possible to automate a Living Program. During the development of an RCM Program at South Carolina Electric and Gas' V.C. Summer Nuclear Station (VCSNS), computer applications were developed. 6 figs, 1 tab

  20. Codesign Analysis of a Computer Graphics Application

    Madsen, Jan; Brage, Jens P.


    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based on information obtained from software profiling, and the resulting design is validated through cosimulation. The achieved speed-up is estimated based on an analysis of profiling information from different sets of input data and various architectural options.
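One common way to turn profiling fractions into a first speed-up estimate is Amdahl's law (our illustration; the paper's estimation from multiple input sets is more detailed). The profile numbers below are hypothetical.

```python
def amdahl_speedup(accel_fraction, accel_factor):
    """Overall speed-up when a profiled fraction of runtime is moved to
    hardware that runs it accel_factor times faster (Amdahl's law)."""
    return 1.0 / ((1.0 - accel_fraction) + accel_fraction / accel_factor)

# hypothetical profile: 60% of runtime in routines mapped to hardware, 10x faster
print(round(amdahl_speedup(0.6, 10.0), 3))
```

The formula makes the paper's point about the missing kernel concrete: with no single dominant hot spot, accel_fraction stays modest and the achievable speed-up saturates quickly.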

  1. Computation system for nuclear reactor core analysis

    This report documents a system of computer codes, developed as modules, to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code, treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, along with aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  3. Computer-aided power systems analysis

    Kusic, George


    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  4. The impact of computer-based interactive instruction (CBII) in improving the teaching-learning process in introductory college physics

    Jawad, Afif A.

    Institutes are incorporating computer-assisted instruction (CAI) into their classrooms in an effort to enhance learning. The implementation of computers into the classroom parallels education's role of keeping abreast of societal demands. The number of microcomputers in schools has increased tremendously. Computer Based Interactive Instruction (CBII) software is available for the language arts, mathematics, science, social studies, etc. Traditional instruction, supplemented with CAI, seems to be more effective than traditional instruction alone. Although there is a large quantity of research regarding specific aspects of learning through computers, there seems to be a lack of information regarding the impact of computers upon student success. The goal of this study is to determine how much CAI is implemented in higher education in the USA. Instructors from 38 states were surveyed to compare the institutes that use Computer Based Interactive Instruction with the ones that do not and still apply the traditional delivery method. Based on the analysis of the data gathered during this study, it is concluded that the majority of instructors now use computers in one form or another. This study has determined that the computer is a major component in the teaching of introductory physics, and therefore may be a suitable substitute for the traditional delivery system. Computers as an instructional delivery system are an alternative that may result in a higher level of student learning for many higher education courses.

  5. Probabilistic structural analysis computer code (NESSUS)

    Shiao, Michael C.


    Probabilistic structural analysis has been developed to analyze the effects of fluctuating loads, variable material properties, and uncertain analytical models, especially for high-performance structures such as SSME turbopump blades. The computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed to serve as a primary computation tool for the characterization of the probabilistic structural response to stochastic environments by statistical description. The code consists of three major modules: NESSUS/PRE, NESSUS/FEM, and NESSUS/FPI. NESSUS/PRE is a preprocessor which decomposes the spatially correlated random variables into a set of uncorrelated random variables using a modal analysis method. NESSUS/FEM is a finite element module which provides structural sensitivities to all the random variables considered. NESSUS/FPI is the Fast Probability Integration method, by which a cumulative distribution function or a probability density function is calculated.
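The quantity NESSUS/FPI computes efficiently, a cumulative distribution function of a structural response under random inputs, can be illustrated with a brute-force Monte Carlo stand-in. This is our toy model (random load over random area), not the FPI algorithm and not an SSME blade model.

```python
import random

def response_cdf(limit, n=100_000, seed=1):
    """Monte Carlo estimate of P(response <= limit) for a toy stress
    model R = load / area with independent Gaussian load and area."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        load = rng.gauss(100.0, 10.0)   # hypothetical load
        area = rng.gauss(2.0, 0.1)      # hypothetical section area
        if load / area <= limit:
            hits += 1
    return hits / n

p = response_cdf(50.0)
print(p)
```

Fast Probability Integration exists precisely because this sampling approach needs enormous n to resolve the small tail probabilities that matter for structural reliability.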

  6. Al-Mg systematics of CAIs, POI, and ferromagnesian chondrules from Ningqiang

    Hsu, Weibiao; Huss, Gary R.; Wasserburg, G. J.


    We have made aluminum-magnesium isotopic measurements on 4 melilite-bearing calcium-aluminum-rich inclusions (CAIs), 1 plagioclase-olivine inclusion (POI), and 2 ferromagnesian chondrules from the Ningqiang carbonaceous chondrite. All of the CAIs measured contain clear evidence for radiogenic ^(26)Mg^* from the decay of ^(26)Al (τ = 1.05 Ma). Although the low Al/Mg ratios of the melilites introduce large uncertainties, the inferred initial ^(26)Al/^(27)Al ratios for the CAIs are generally con...

  7. DC operating point analysis using evolutionary computing

    Crutchley, DA; Zwolinski, M.


    This paper discusses and evaluates a new approach to operating point analysis based on evolutionary computing (EC). EC can find multiple solutions to a problem by using a parallel search through a population. At the operating point(s) of a circuit the overall error has a minimum value. Therefore, we use an Evolutionary Algorithm (EA) to search the solution space to find these minima. Various evolutionary algorithms are described. Several such algorithms have been implemented in a full circuit...
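The idea above, operating points as minima of a circuit's overall error, searched for by an evolutionary method, can be sketched as follows. The algorithm (a multi-restart (1+1) evolution strategy) and the toy residual are ours, not the paper's: a cubic with three zeros stands in for a circuit with three DC operating points, such as a bistable latch.

```python
import random

def find_operating_points(err, lo, hi, restarts=40, iters=500, seed=3):
    """Multi-restart (1+1) evolutionary search: each restart evolves one
    candidate by Gaussian mutation with a slowly shrinking step size,
    keeping a mutant only if it lowers the error; near-zero-error
    survivors are deduplicated into distinct solutions."""
    rng = random.Random(seed)
    sols = []
    for _ in range(restarts):
        x = rng.uniform(lo, hi)
        fx = err(x)
        step = 0.2
        for _ in range(iters):
            y = x + rng.gauss(0.0, step)
            fy = err(y)
            if fy < fx:
                x, fx = y, fy
            step *= 0.985
        if fx < 1e-3 and all(abs(x - s) > 0.3 for s in sols):
            sols.append(x)
    return sorted(sols)

residual = lambda x: (x**3 - x) ** 2   # zero exactly at the "operating points"
points = find_operating_points(residual, -2.0, 2.0)
print(points)
```

The population-style restarts are what let the search report all three minima instead of the single solution a Newton iteration would converge to.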

  8. Computed tomographic analysis of urinary calculi

    Newhouse, J.H.; Prien, E.L.; Amis, E.S. Jr.; Dretler, S.P.; Pfister, R.C.


    Excised urinary calculi were subjected to computed tomographic (CT) scanning in an attempt to determine whether CT attenuation values would allow accurate analysis of stone composition. The mean, maximum, and modal pixel densities of the calculi were recorded and compared; the resulting values reflected considerable heterogeneity in stone density. Although uric acid and cystine calculi could be identified by their discrete ranges on one or more of these criteria, calcium-containing stones of various compositions, including struvite, could not be distinguished reliably. CT analysis of stone density is not likely to be more accurate than standard radiography in characterizing stone composition in vivo.

  9. Computer analysis of HIV epitope sequences

    Gupta, G.; Myers, G.


    Phylogenetic tree analysis provides us with important general information regarding the extent and rate of HIV variation. Currently we are attempting to extend computer analysis and modeling to the V3 loop of the type 2 virus and its simian homologues, especially in light of the prominent role the latter will play in animal model studies. Moreover, it might be possible to attack the slightly similar V4 loop by this approach. However, the strategy relies very heavily upon "natural" information and constraints, thus there exist severe limitations upon the general applicability, in addition to uncertainties with regard to long-range residue interactions. 5 refs., 3 figs.

  10. The Use of CAI Courseware in Veterinary Parasitology Teaching

    王建民; 姚龙泉; 刘明春; 何剑斌; 葛云侠


    Materials were collected through a variety of channels to prepare computer-assisted instruction (CAI) courseware for veterinary parasitology suited to students of veterinary medicine, turning a formerly dry lecture course into a vivid, visual one. The courseware lays a good foundation for students' later diagnosis and classification of parasitic diseases.

  11. CAD/CAM/CAI Application for High-Precision Machining of Internal Combustion Engine Pistons

    V. V. Postnov


    CAD/CAM/CAI application solutions for the machining of internal combustion engine pistons were analyzed. A low-volume production technology for internal combustion engine pistons was proposed. A fixture for a CNC turning center was designed.

  12. 77 FR 9625 - Presentation of Final Conventional Conformance Test Criteria and Common Air Interface (CAI...


    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF COMMERCE National Institute of Standards and Technology Presentation of Final Conventional Conformance Test Criteria and Common Air Interface (CAI) Features/Functionalities Under Test in the Project 25...

  13. Introduction to scientific computing and data analysis

    Holmes, Mark H


    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works and for understanding how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author's website and SpringerLink.

  14. Strong Calcite-Like Spectra Cathodoluminescence Emission from Allende Meteorite Cai Phases

    García Guinea, Javier; Tornos Arroyo, Fernando; Azumendi García, Oscar; Ruiz Pérez, Javier; Correcher Delgado, Virgilio


    Calcium-aluminum-rich inclusions (CAIs) of the Allende CV3 chondrite were studied by Environmental Scanning Electron Microscopy (ESEM), Energy Dispersive Spectrometry (EDS), Backscattering (BS), and Spectra Cathodoluminescence (CL). CAI minerals show spectral CL curves exceeding 450,000 a.u., with large homogeneity along the white inclusions. The CL curve features fit perfectly with terrestrial patterns of stressed specimens of weathered marble and limestone in which hydroxyl gr...

  15. Design of CAI Courseware Based on a Virtual Reality Mechanism



    In this paper, the application features and significance of VR technology in the educational field are summarized. In particular, the design mechanism of CAI courseware for individualized instruction is studied, and, using a virtual reality mechanism, a learning-while-doing environment is realized for the user in CAI courseware for computer application courses. The design theory, the technical approach, some of the algorithm flowcharts, and the operation exercise interface are given.

  16. Analysis of a Model for Computer Virus Transmission

    Peng Qin


    Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, computers on the network are equipped with antivirus software. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our t...
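A minimal epidemic-style sketch of such a virus model (our simplified SIR stand-in with inflow of new computers and matching removal, not the paper's exact equations): the disease-free equilibrium loses stability exactly when the basic reproduction number beta/(gamma + mu) exceeds 1, and the trajectory then settles at the endemic equilibrium. All parameter values are hypothetical.

```python
def simulate(beta=0.5, gamma=0.2, mu=0.05, s0=0.99, i0=0.01,
             dt=0.01, steps=40_000):
    """Forward-Euler integration of a normalized SIR model with inflow
    of new computers (rate mu) and matching removal of old ones:
    ds/dt = mu - beta*s*i - mu*s,  di/dt = beta*s*i - (gamma + mu)*i."""
    s, i = s0, i0
    for _ in range(steps):
        ds = mu - beta * s * i - mu * s
        di = beta * s * i - (gamma + mu) * i
        s, i = s + dt * ds, i + dt * di
    return s, i

r0 = 0.5 / (0.2 + 0.05)        # basic reproduction number; endemic iff r0 > 1
s_end, i_end = simulate()
print(r0, round(s_end, 3), round(i_end, 3))
```

For these parameters the predicted endemic equilibrium is s* = 1/r0 = 0.5 and i* = mu*(r0 - 1)/beta = 0.1, which the long-run simulation approaches.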

  17. Computed image analysis of neutron radiographs

    As with X-radiography, there is in practice a nondestructive technique, named neutron radiology, that uses neutrons as the penetrating particle. When the information is registered on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β,γ) creating a latent image, the technique is named neutron radiography. A radiographic industrial film that contains the image of the internal structure of an object, obtained by neutron radiography, must subsequently be analyzed to obtain qualitative and quantitative information about the structural integrity of that object. A computed analysis of a film is possible with a facility with the following main components: a film illuminator, a CCD video camera, and a computer (PC) with suitable software. The qualitative analysis intends to reveal possible anomalies of the structure due to manufacturing processes or induced by working processes (for example, irradiation in the case of nuclear fuel). The quantitative determination is based on measurements of some image parameters: dimensions and optical densities. The illuminator was built specially for this application but can also be used for simple visual observation. The illuminated area is 9x40 cm. The frame of the system is an Abbe comparator of Carl Zeiss Jena type, which has been adapted for this application. The video camera captures the image, which is stored and processed by the computer. A special program, SIMAG-NG, has been developed at INR Pitesti which, alongside the program SMTV II of the special acquisition module SM 5010, can analyze the images of a film. The major application of the system was the quantitative analysis of a film containing the images of nuclear fuel pins beside a dimensional standard. The system was used to measure the length of the pellets of the TRIGA nuclear fuel. (authors)

  18. Computation of Regularized Linear Discriminant Analysis

    Kalina, Jan; Valenta, Zdeněk; Duintjer Tebbens, Jurjen

    Geneva: Centre International de Conferences, 2014 - (Gilli, M.; Nieto-Reyes, A.; González-Rodríguez, G.), s. 1-8 ISBN 978-2-8399-1347-8. [COMPSTAT 2014. International Conference on Computational Statistics /21./. Geneva (CH), 19.08.2014-22.08.2014] R&D Projects: GA ČR GA13-06684S Institutional support: RVO:67985807 Keywords : classification analysis * regularization * Matrix decomposition * shrinkage eigenvalues * high-dimensional data Subject RIV: BB - Applied Statistics, Operational Research

  19. Computer modelling for LOCA analysis in PHWRs

    A computer code, THYNAC, developed for the analysis of thermal-hydraulic transient phenomena during a LOCA in a PHWR-type reactor and its primary coolant system is described. The code predicts the coolant voiding rate in the core, the coolant discharge rate from the break, the primary system depressurization history, and the temperature history of both fuel and fuel clad. The reactor system is modelled as a set of connected fluid segments which represent piping, feeders, coolant channels, etc. The finite-difference method is used in the code. The modelling in the code of various specific phenomena, e.g. two-phase pressure drop, slip flow, and pumps, is described. (M.G.B.)

  20. Progress in computer vision and image analysis

    Bunke, Horst; Sánchez, Gemma; Otazu, Xavier


    This book is a collection of scientific papers published during the last five years, showing a broad spectrum of actual research topics and techniques used to solve challenging problems in the areas of computer vision and image analysis. The book will appeal to researchers, technicians and graduate students. Contents: An Appearance-Based Method for Parametric Video Registration (X Orriols et al.); Relevance of Multifractal Textures in Static Images (A Turiel); Potential Fields as an External

  1. FORTRAN computer program for seismic risk analysis

    McGuire, Robin K.


    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
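The core hazard computation such a program performs can be sketched for one source at a fixed distance: invert an attenuation relation for the magnitude needed to exceed a target ground motion, then take the Gutenberg-Richter probability of that magnitude. All numeric coefficients below (attenuation constants, occurrence rate) are hypothetical, not from the report.

```python
import math

def annual_exceedance(pga, rate=0.2, b=1.0, m_min=4.0, m_max=8.0, R=30.0,
                      c0=-3.5, c1=0.8, c2=1.1):
    """Annual rate of exceeding a ground-motion level `pga` from one
    source at distance R, with hypothetical attenuation
    ln a = c0 + c1*M - c2*ln(R) and a truncated-exponential
    (Gutenberg-Richter) magnitude distribution."""
    m_star = (math.log(pga) - c0 + c2 * math.log(R)) / c1   # magnitude needed
    if m_star <= m_min:
        return rate
    if m_star >= m_max:
        return 0.0
    beta = b * math.log(10.0)
    tail = math.exp(-beta * (m_max - m_min))
    ccdf = (math.exp(-beta * (m_star - m_min)) - tail) / (1.0 - tail)
    return rate * ccdf

lam = annual_exceedance(0.05)          # hypothetical PGA threshold
p_annual = 1.0 - math.exp(-lam)        # Poisson occurrence assumption
print(lam, p_annual)
```

A full risk program sums such rates over many sources and distances and folds in the scatter of the attenuation relation, which this sketch omits.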

  2. Social sciences via network analysis and computation

    Kanduc, Tadej


    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classifications in the contents of social sciences. It also covers various real-life examples such as t

  3. Microstructure and effective behavior - analysis and computation

    Material behavior is determined by features on a number of length scales between the atomistic and the macroscopic. As full direct resolution of all scales is out of reach, there is intense research on analytical and computational tools that can bridge different scales, and a number of different schemes have been proposed. One key issue is to identify which information on the finer scale is needed to determine the behavior on the coarser scale. To shed some light on this issue we will focus on a number of case studies to understand the passage from microscopic scales, where the material is described by a multi-well non-convex energy, to macroscopic behavior. Examples include shape-memory materials, new giant magnetostrictive materials and nematic elastomers. Similar ideas have been used by others and by us to understand dislocation arrangements, blistering of thin films and magnetic microstructures. We will discuss three algorithmic approaches to analyze effective behavior: purely analytical, hybrid analytical-computational, and computation inspired by analysis. Refs. 5 (author)

  4. Computer network environment planning and analysis

    Dalphin, John F.


    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  5. Symbolic Computing in Probabilistic and Stochastic Analysis

    Kamiński Marcin


    The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, and (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any such system to derive the basic equations for the stochastic perturbation technique, and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is a probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
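The stochastic perturbation idea mentioned above fits in one line: expand the response f about the mean of a random input X with mean mu and variance sigma^2 and keep terms to second order, E[f(X)] ≈ f(mu) + (1/2) f''(mu) sigma^2. Our toy response is f(x) = exp(x), chosen because the exact lognormal mean is known for comparison; the abstract's MAPLE derivations are of course symbolic rather than numeric.

```python
import math

mu, sigma = 0.3, 0.1    # hypothetical input mean and standard deviation

# second-order perturbation: f(mu) + 0.5 * f''(mu) * sigma^2, with f = exp
approx = math.exp(mu) * (1.0 + 0.5 * sigma**2)

# exact mean of exp(X) for X ~ N(mu, sigma^2) (lognormal mean)
exact = math.exp(mu + 0.5 * sigma**2)

print(approx, exact)
```

For small input variance the truncation error is of order sigma^4, which is why the second-order expansion is accurate here; a symbolic system automates exactly these derivative bookkeeping steps for less trivial responses.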

  6. Experimental analysis of computer system dependability

    Iyer, Ravishankar, K.; Tang, Dong


    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
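Importance sampling, mentioned above as a Monte Carlo accelerator, is easy to demonstrate on a rare-event toy problem (ours, not from the survey): estimate p = P(X > 4) for X ~ N(0,1) by sampling from a proposal shifted into the rare region and reweighting by the likelihood ratio. Plain sampling from N(0,1) would see a hit only about once per 30,000 draws.

```python
import math
import random

def is_tail_prob(threshold=4.0, n=200_000, seed=7):
    """Importance-sampling estimate of P(X > threshold), X ~ N(0,1),
    using the shifted proposal N(threshold, 1). The weight is the ratio
    of target to proposal densities, exp(-y^2/2 + (y - threshold)^2/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(threshold, 1.0)
        if y > threshold:
            total += math.exp(-0.5 * y * y + 0.5 * (y - threshold) ** 2)
    return total / n

p_hat = is_tail_prob()
exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))   # true Gaussian tail probability
print(p_hat, exact)
```

Shifting the proposal mean to the threshold makes roughly half the samples land in the rare region, so the same n yields a far smaller relative error than direct simulation.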

  7. Computational methods for nuclear criticality safety analysis

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and the KENO-IV code gives conservative results when the generalized geometry option is not used. (author)

  8. Good relationships between computational image analysis and radiological physics

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen


    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  11. The Anatomy and Bulk Composition of CAI Rims in the Vigarano (CV3) Chondrite

    Ruzicka, A.; Boynton, W. V.


    A striking feature of Ca,Al-rich inclusions (CAIs) in chondrites is the presence of mineralogical layers that typically form rim sequences up to 50 micrometers thick [1]. Many ideas regarding the origin of CAI rims have been proposed, but none are entirely satisfactory. The detailed mineralogy and bulk compositions of relatively unaltered CAI rims in the Vigarano (CV3) chondrite described here provide constraints on hypotheses of rim formation. Rim Mineralogy: CAIs in Vigarano consist of melilite (mel)- and spinel (sp)- rich varieties, both of which are rimmed [2]. Around mel-rich objects, the layer sequence is CAI interior --> sp-rich layer (sometimes absent) --> mel/anorthite (anor) layer --> Ti-Al-rich clinopyroxene (Tpx) layer --> Al- diopside (Al-diop) layer --> olivine (ol) +/- Al-diop layer --> host matrix. The sequence around sp-rich objects differs from this in that the mel/anor layer is absent. Both the sp-rich layer around mel-cored CAIs and the cores of sp-rich CAIs in Vigarano are largely comprised of a fine-grained (anor layer is sometimes monomineralic, consisting of mel alone, or bimineralic, consisting of both mel and anor. Where bimineralic, anor typically occurs in the outer part of the layer. In places, anor (An(sub)99-100) has partially altered to nepheline and voids. Rim mel is systematically less gehlenitic than mel in the CAI interiors, especially compared to mel in the interior adjacent to the rims. The Tpx layer (>2 and up to 15 wt% TiO2) and Al-diop layer ( sp + fo --> sp + fo + anor or mel or Tpx) that does not correspond to observed rim sequences. It thus appears that (1) the rim region did not form through crystallization of molten CAIs; and (2) rim layers did not originate solely by the crystallization of a melt layer present on a solid CAI core [4,5]. References: [1] Wark D. A. and Lovering J. F. (1977) Proc. LSC 8th, 95-112. [2] Ruzicka A. and Boynton W. V. (1991) Meteoritics, 26, 390-391. [3] Stolper E. (1982) GCA, 46, 2159

  12. Framework for Computer Assisted Instruction Courseware: A Case Study.

    Betlach, Judith A.


    Systematically investigates, defines, and organizes variables related to production of internally designed and implemented computer assisted instruction (CAI) courseware: special needs of users; costs; identification and definition of realistic training needs; CAI definition and design methodology; hardware and software requirements; and general…

  13. Computational analysis of PARAMETR facility experiments

    Results of calculations of PARAMETR experiments are given in the paper. The PARAMETR facility is designed to research the phenomena relevant to typical LOCA scenarios (including severe accidents) of VVER-type reactors. The investigations at the PARAMETR facility focus on experimental research of fuel rod and core material behavior, hydrogen generation processes, and the melting and interaction of core materials during severe accidents. The main facility components are a rod bundle of 1250 mm heated length (up to 37 rods can be used), an electrical power source, steam and water supply systems, and instrumentation. The bundle is a mix of fresh fuel rods and electrically heated rods with uranium tablets and a tungsten heater inside. The main objectives of the calculations are to analyse the capability of computer codes, in particular RELAP/SCDAPSIM, to model severe accidents, to identify the impact of major parameters on calculation results, and thus to improve accident analysis. RELAP/SCDAPSIM calculations were used to choose key parameters of the experiments. The influence of thermal insulation properties, uncertainties in heater geometry, and insulation thermal conductivity was analysed. Conditions and parameters needed to initiate an intensive zirconium oxidation reaction were investigated. As a whole, calculation results showed good agreement with experiments. Several key points were observed, such as the essential impact of the preheating phase and the importance of thermal insulation material properties. Proper modeling of particular processes during the preheating phase was very important, since this phase defined the bundle temperature level during the heating phase. Some difficulties arose here; for instance, temperatures were overestimated until axial profiling of thermal conductivity was introduced. More appropriate models were then used to reach better agreement with experiments. The work done can be used in safety analysis of VVER-type reactors and allow improving of

  14. Computer-aided Fault Tree Analysis

    A computer-oriented methodology for deriving the minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology, is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense.
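FTAP itself is a FORTRAN package, but the minimal-cut-set derivation it implements can be sketched in a few lines. The toy tree, gate encoding, and helper names below are hypothetical, illustrating only the generic expand-then-absorb idea, not FTAP's actual algorithm.

```python
from itertools import product

# Toy fault tree over basic events A..D, encoded as nested tuples:
# TOP = (A AND B) OR (A AND C) OR D OR (C AND D)
tree = ('OR', ('AND', 'A', 'B'), ('AND', 'A', 'C'), 'D', ('AND', 'C', 'D'))

def cut_sets(node):
    """Expand a gate into a list of cut sets (frozensets of basic events)."""
    if isinstance(node, str):                     # a basic event
        return [frozenset([node])]
    gate, *kids = node
    kid_sets = [cut_sets(k) for k in kids]
    if gate == 'OR':                              # union of the children's families
        return [cs for ks in kid_sets for cs in ks]
    # AND: merge one cut set chosen from each child, in every combination
    return [frozenset().union(*combo) for combo in product(*kid_sets)]

def minimize(families):
    """Absorption: drop any cut set that strictly contains another cut set."""
    return {s for s in families if not any(t < s for t in families)}

mcs = minimize(set(cut_sets(tree)))
# {C, D} is absorbed by {D}, leaving {A, B}, {A, C}, and {D}
```

The absorption step is what makes the family *minimal*: any cut set containing a smaller one is redundant for system failure.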

  15. Computed tomographic analysis of renal calculi

    Hillman, B.J.; Drach, G.W.; Tracey, P.; Gaines, J.A.


    An in vitro study sought to determine the feasibility of using computed tomography (CT) to analyze the chemical composition of renal calculi and thus aid in selecting the best treatment method. Sixty-three coded calculi were scanned in a water bath. Region-of-interest measurements provided the mean, standard deviation, and minimum and maximum pixel values for each stone. These parameters were correlated with aspects of the stones' chemical composition. A multivariate analysis showed that the mean and standard deviation of the stones' pixel values were the best CT parameters for differentiating types of renal calculi. By using computerized mapping techniques, uric acid calculi could be perfectly differentiated from struvite and calcium oxalate calculi. The latter two types also were differentiable, but to a lesser extent. CT has a potential role as an adjunct to clinical and laboratory methods for determining the chemical composition of renal calculi in an effort to select optimal treatment.
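The region-of-interest parameters used in the study are straightforward to compute. A minimal sketch, with made-up pixel values standing in for real CT data:

```python
import statistics

# Hypothetical region-of-interest pixel values (Hounsfield units) for two
# scanned stones; real values would be read from the CT image itself.
roi = {
    'uric_acid': [410, 425, 398, 440, 415, 430, 405, 420],
    'calcium_oxalate': [980, 1105, 1010, 1230, 940, 1180, 1060, 995],
}

def roi_stats(pixels):
    """The four ROI parameters the study correlated with stone composition:
    mean, standard deviation, minimum, and maximum pixel value."""
    return {
        'mean': statistics.mean(pixels),
        'sd': statistics.stdev(pixels),
        'min': min(pixels),
        'max': max(pixels),
    }

stats = {stone: roi_stats(px) for stone, px in roi.items()}
# Uric acid attenuates X-rays less than calcium oxalate, so its mean HU is
# markedly lower -- the kind of separation the study exploited.
```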

  16. Review of Computational Stirling Analysis Methods

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.


    Nuclear thermal-to-electric power conversion carries the promise of longer-duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines could be developed if the losses inherent in current designs were better understood. However, these engines are difficult to instrument, so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of understanding of Stirling losses may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-FI technique, is presented in detail.

  17. Computational Models for Analysis of Illicit Activities

    Nizamani, Sarwat

    devise policies to minimize them. These activities include cybercrimes, terrorist attacks, and violent actions in response to certain world issues. Besides such activities, there are several other related activities worth analyzing, for which computational models have been presented in this thesis… These models include a model for analyzing the evolution of terrorist networks; a text classification model for detecting suspicious text and identifying suspected authors of anonymous emails; and a semantic analysis model for news reports, which may help analyze the illicit activities in a certain area… with location and temporal information. For the network evolution, the hierarchical agglomerative clustering approach has been applied to terrorist networks as case studies. The networks' evolutions show how individual actors who are initially isolated from each other coalesce into small groups, which…
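The hierarchical agglomerative clustering mentioned for network evolution can be illustrated with a minimal single-linkage sketch; the actor names and distances below are invented, not data from the thesis.

```python
# Single-linkage agglomerative clustering of actors in a toy network.
# Distances (1 = unrelated, 0 = identical) are invented for illustration.
names = ['a', 'b', 'c', 'd']
dist = {
    'a': {'a': 0.0, 'b': 0.1, 'c': 0.9, 'd': 0.8},
    'b': {'a': 0.1, 'b': 0.0, 'c': 0.85, 'd': 0.9},
    'c': {'a': 0.9, 'b': 0.85, 'c': 0.0, 'd': 0.2},
    'd': {'a': 0.8, 'b': 0.9, 'c': 0.2, 'd': 0.0},
}

def single_linkage(dist, names, stop_at):
    """Repeatedly merge the two clusters with the smallest inter-point
    distance until only stop_at clusters remain."""
    clusters = [{n} for n in names]
    while len(clusters) > stop_at:
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: min(dist[a][b] for a in clusters[ij[0]] for b in clusters[ij[1]]),
        )
        clusters[i] = clusters[i] | clusters.pop(j)   # merge the closest pair
    return clusters

groups = single_linkage(dist, names, stop_at=2)
# Initially isolated actors coalesce into two small groups: {a, b} and {c, d}
```

Recording the merge order, rather than stopping at a fixed count, yields the full dendrogram that such evolution studies inspect.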

  19. Computational based functional analysis of Bacillus phytases.

    Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti


    Phytase is an enzyme that catalyzes the total hydrolysis of phytate to less-phosphorylated myo-inositol derivatives and inorganic phosphate; it digests the indigestible phytate fraction present in seeds and grains and therefore provides digestible phosphorus, calcium, and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals to increase the bioavailability of phytic acid-bound phosphate, ultimately enhancing the nutritional value of diets. The Bacillus phytase is well suited for use in animal feed because of its optimum pH and excellent thermal stability. The present study aims to perform an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens to explore their physico-chemical properties using various bio-computational tools. All the proteins are acidic and thermostable and can be used as suitable candidates in the feed industry. PMID:26672917

  20. Mineralogy and Petrology of EK-459-5-1, A Type B1 CAI from Allende

    Jeffcoat, C. R.; Kerekgyarto, A. G.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.


    Calcium-aluminum-rich inclusions (CAIs) are a type of coarse-grained clast composed of Ca-, Al-, and Mg-rich silicates and oxides found in chondrite meteorites. Type B CAIs are found exclusively in the CV chondrite meteorites and are the most thoroughly studied type of inclusion found in chondritic meteorites. Type B1 CAIs are distinguished by a nearly monomineralic rim of melilite that surrounds an interior predominantly composed of melilite, fassaite (Ti- and Al-rich clinopyroxene), anorthite, and spinel, with varying amounts of other minor primary and secondary phases. The formation of Type B CAIs has received considerable attention in the course of CAI research, yet quantitative models, experimental results, and observations from Type B inclusions remain largely in disagreement. Recent experimental results and quantitative models have shown that the formation of B1 mantles could have occurred by the evaporative loss of Si and Mg during the crystallization of these objects. However, comparative studies suggest that the lower bulk SiO2 compositions in B1s result in more melilite crystallization prior to the onset of fassaite and anorthite crystallization, leading to the formation of thick melilite-rich rims in B1 inclusions. Detailed petrographic and cosmochemical studies of these inclusions will further our understanding of these complex objects.

  1. Incremental ALARA cost/benefit computer analysis

    Commonwealth Edison Company has developed and is testing an enhanced Fortran computer program to be used for cost/benefit analysis of radiation reduction projects at its six nuclear power facilities and corporate technical support groups. This paper describes a macro-driven IBM mainframe program comprising two different types of analyses: an abbreviated program with fixed costs and base values, and an extended engineering version for a detailed, more thorough, and time-consuming approach. The extended engineering version breaks radiation exposure costs down into two components: health-related costs and replacement labor costs. According to user input, the program automatically adjusts these two cost components and applies them to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment cost. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations.
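The cost/benefit comparison such a program performs can be sketched at the "abbreviated" level of detail. All monetary values and project options below are illustrative placeholders, not Commonwealth Edison figures.

```python
# Abbreviated-style ALARA cost/benefit comparison. The monetary value of
# averted dose and both project options are illustrative placeholders.
DOLLARS_PER_PERSON_REM = 10_000       # assumed value of collective dose averted

def net_benefit(dose_averted_person_rem, project_cost, replacement_labor_saved=0.0):
    """Benefit of the exposure avoided (health-related plus replacement-labor
    components, as in the extended version) minus the project's cost."""
    health_benefit = dose_averted_person_rem * DOLLARS_PER_PERSON_REM
    return health_benefit + replacement_labor_saved - project_cost

options = {
    'extra shielding': net_benefit(12.0, 45_000),
    'remote tooling': net_benefit(20.0, 260_000, replacement_labor_saved=30_000),
}
best = max(options, key=options.get)   # the most cost-effective dose reduction
```

Running the comparison over several parameter sets, as the abstract describes, is just a matter of repeating this evaluation with different inputs.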

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Stocker, John C.; Golomb, Andrew M.


    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
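A minimal version of such a discrete event simulation can be written with a priority queue: requests arrive randomly and compete for a fixed server pool. The arrival rate, service time, and pool sizes below are assumptions for illustration, not parameters from the study.

```python
import heapq
import random

random.seed(1)

def simulate(n_servers, n_requests=2000, arrival_rate=8.0, service_time=0.5):
    """Event-driven simulation of a server pool: requests arrive as a Poisson
    process and each occupies one of n_servers identical servers.
    Returns the mean time a request waits for a free server."""
    t, arrivals = 0.0, []
    for _ in range(n_requests):
        t += random.expovariate(arrival_rate)     # exponential inter-arrivals
        arrivals.append(t)
    free_at = [0.0] * n_servers                   # heap: when each server frees up
    heapq.heapify(free_at)
    total_wait = 0.0
    for arrive in arrivals:
        server_free = heapq.heappop(free_at)
        start = max(arrive, server_free)          # queue if all servers are busy
        total_wait += start - arrive
        heapq.heappush(free_at, start + service_time)
    return total_wait / n_requests

# Provisioning question: how much queueing delay does a smaller pool add?
wait_small_pool = simulate(n_servers=4)
wait_large_pool = simulate(n_servers=8)
```

Sweeping the pool size against a demand distribution is exactly the kind of provisioning analysis static methods cannot do for a virtualized, scheduled environment.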

  3. Computer Assisted Laboratory Instructions: Learning Outcomes Analysis

    Abdulrasool, Salah Mahdi; Mishra, Rakesh


    For this study, students in the mechanical engineering subject area were exposed to computer assisted instructions to satisfy the following learning outcomes in the computer aided design/computer aided manufacturing module: i- creation of drawing and design using computer aided design; ii- using data exchange format (DXF) to create a numerical control file; iii- final setup check of the computerised numerical control machine; iv- final manufacturing of the product using CNC; v- quality evaluation. The t...

  4. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) is a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable, and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a

  5. Research in applied mathematics, numerical analysis, and computer science


    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  6. Computational intelligence for big data analysis frontier advances and applications

    Dehuri, Satchidananda; Sanyal, Sugata


    The work presented in this book is a combination of theoretical advancements in big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and applications to handling real-life problems, mostly drawn from real-life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques, along with some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the areas of big data analysis and cloud computing.

  7. Computational Analysis of Pharmacokinetic Behavior of Ampicillin

    Mária Ďurišová


    The objective of this study was to perform a computational analysis of the pharmacokinetic behavior of ampicillin, using data from the literature. A method based on the theory of dynamic systems was used for modeling purposes. This method has been introduced to pharmacokinetics with the aim of contributing to the knowledge base by enabling researchers to develop mathematical models of various pharmacokinetic processes in an identical way, using identical model structures. A few examples of the successful use of this modeling method in pharmacokinetics can be found in full-text articles available free of charge at the author's website, and in the example given in this study. The modeling method employed here can be used to develop a mathematical model of the pharmacokinetic behavior of any drug, provided that the drug's pharmacokinetic behavior can be at least partially approximated using linear models.

  8. Computational systems analysis of dopamine metabolism.

    Zhen Qi

    A prominent feature of Parkinson's disease (PD) is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment of PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.

  9. [Computational genome analysis of three marine algoviruses].

    Stepanova, O A; Boĭko, A L; Shcherbatenko, I S


    Computational analysis of the genomic sequences of three new marine algoviruses, Tetraselmis viridis virus (TvV-S20 and TvV-SI1 strains) and Dunaliella viridis virus (DvV-SI2 strain), was conducted. Both considerable similarity and essential distinctions between the studied strains and the best-studied marine algoviruses of the family Phycodnaviridae were revealed. Our data show that the tested strains are new viruses with the following features: they alone were isolated from the marine eukaryotic microalgae T. viridis and D. viridis; the coding sequences (CDSs) of their genomes are localized mainly on one of the DNA strands and form several clusters with short intergenic spaces; there are considerable variations in genome structure among the viruses and their strains; the viral genomic DNA has a high GC-content (55.5 - 67.4%); their genes contain no well-known optimal contexts of translation start codons, nor contexts favoring terminal codon read-through; and the vast majority of viral genes and proteins have no matches in gene banks. PMID:24479317
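The GC-content statistic reported above (55.5 - 67.4%) is computed as follows; the DNA fragment in the sketch is invented, not actual TvV or DvV sequence.

```python
# GC-content: the fraction of G and C bases, reported above at 55.5-67.4%
# for these algovirus genomes. The fragment below is a made-up example.
def gc_content(seq):
    """Percentage of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return 100.0 * sum(base in 'GC' for base in seq) / len(seq)

fragment = "ATGCGGCCGCTAGCGGCGCAATGCCGGC"
gc = gc_content(fragment)   # 75.0 for this invented fragment
```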

  10. The CMS computing, software and analysis challenge

    The CMS experiment has performed a comprehensive challenge during May 2008 to test the full scope of offline data handling and analysis activities needed for data taking during the first few weeks of LHC collider operations. It constitutes the first full-scale challenge with large statistics under the conditions expected at the start-up of the LHC, including the expected initial mis-alignments and mis-calibrations for each sub-detector, and event signatures and rates typical for low instantaneous luminosity. Particular emphasis has been given to the prompt reconstruction workflows, and to the procedures for the alignment and calibration of each sub-detector. The latter were performed with restricted latency using the same computing infrastructure that will be used for real data, and the resulting calibration and alignment constants were used to re-reconstruct the data at Tier-1 centres. The paper addresses the goals and practical experience from the challenge, as well as the lessons learned in view of LHC data taking.

  11. The Use of Modular Computer-Based Lessons in a Modification of the Classical Introductory Course in Organic Chemistry.

    Stotter, Philip L.; Culp, George H.

    An experimental course in organic chemistry utilized computer-assisted instructional (CAI) techniques. The CAI lessons provided tutorial drill and practice and simulated experiments and reactions. The Conversational Language for Instruction and Computing was used, along with a CDC 6400-6600 system; students scheduled and completed the lessons at…

  12. Gender Role, Gender Identity and Sexual Orientation in CAIS ("XY-Women") Compared With Subfertile and Infertile 46,XX Women.

    Brunner, Franziska; Fliegner, Maike; Krupp, Kerstin; Rall, Katharina; Brucker, Sara; Richter-Appelt, Hertha


    The perception of gender development of individuals with complete androgen insensitivity syndrome (CAIS) as unambiguously female has recently been challenged in both qualitative data and case reports of male gender identity. The aim of the mixed-method study presented was to examine the self-perception of CAIS individuals regarding different aspects of gender and to identify commonalities and differences in comparison with subfertile and infertile XX-chromosomal women with diagnoses of Mayer-Rokitansky-Küster-Hauser syndrome (MRKHS) and polycystic ovary syndrome (PCOS). The study sample comprised 11 participants with CAIS, 49 with MRKHS, and 55 with PCOS. Gender identity was assessed by means of a multidimensional instrument, which showed significant differences between the CAIS group and the XX-chromosomal women. Other-than-female gender roles and neither-female-nor-male sexes/genders were reported only by individuals with CAIS. The percentage with a not exclusively androphile sexual orientation was unexceptionally high in the CAIS group compared to the prevalence in "normative" women and the clinical groups. The findings support the assumption made by Meyer-Bahlburg (2010) that gender outcome in people with CAIS is more variable than generally stated. Parents and professionals should thus be open to courses of gender development other than typically female in individuals with CAIS. PMID:26133743

  13. Stable Magnesium Isotope Variation in Melilite Mantle of Allende Type B1 CAI EK 459-5-1

    Kerekgyarto, A. G.; Jeffcoat, C. R.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.


    Ca-Al-rich inclusions (CAIs) are the earliest formed crystalline material in our solar system and they record early Solar System processes. Here we present petrographic and delta Mg-25 data of melilite mantles in a Type B1 CAI that records early solar nebular processes.

  14. Simplified computer codes for cask impact analysis

    In regard to evaluating the acceleration and deformation of casks, the simplified computer codes make analyses economical and reduce input and calculation time. The results obtained with the simplified computer codes are sufficiently accurate for practical use. (J.P.N.)

  15. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Zagami, Jason


    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  16. Computational Intelligence in Intelligent Data Analysis

    Nürnberger, Andreas


    Complex systems and their phenomena are ubiquitous, as they can be found in biology, finance, the humanities, management sciences, medicine, physics, and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model, and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation, and fuzzy systems. Like only a few other researchers in the field, Rudolf Kruse has contributed in many important ways to the understanding, modeling, and application of computational intelligence methods. On the occasion of his 60th birthday, a collection of original papers by leading researchers in the field of computational intell...

  17. Transonic wing analysis using advanced computational methods

    Henne, P. A.; Hicks, R. M.


    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  18. Introduction to numerical analysis and scientific computing

    Nassif, Nabil


    Computer Number Systems and Floating Point Arithmetic: Introduction; Conversion from Base 10 to Base 2; Conversion from Base 2 to Base 10; Normalized Floating Point Systems; Floating Point Operations; Computing in a Floating Point System. Finding Roots of Real Single-Valued Functions: Introduction; How to Locate the Roots of a Function; The Bisection Method; Newton's Method; The Secant Method. Solving Systems of Linear Equations by Gaussian Elimination: Mathematical Preliminaries; Computer Storage for Matrices; Data Structures; Back Substitution for Upper Triangular Systems; Gauss Reduction; LU Decomposition; Polynomia...
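The chapter listing above names the bisection method among the root-finding techniques. As a quick illustration (a minimal sketch, not code from the book), bisection can be written in a few lines of Python:

```python
import math

def bisect(f, a, b, tol=1e-12, max_iter=200):
    """Locate a root of f in [a, b] by repeated interval halving.
    Requires f(a) and f(b) to have opposite signs."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = 0.5 * (a + b)
        fm = f(m)
        if fm == 0 or (b - a) / 2 < tol:
            return m
        if fa * fm < 0:
            # Root lies in the left half; shrink the bracket from the right.
            b, fb = m, fm
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

# Example: the root of cos(x) - x in [0, 1]
root = bisect(lambda x: math.cos(x) - x, 0.0, 1.0)
```

Each iteration halves the bracket, so roughly 40 iterations reach a tolerance of 1e-12 on a unit interval; Newton's and secant methods, also listed above, trade this guaranteed convergence for faster (superlinear) convergence near the root.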

  19. Performance Analysis of Various New Technologies for Computing

    M Nagaraju; Anitha, B


    The discipline of computing is the systematic study of algorithmic processes that describe and transform information, along with their theory, analysis, design, efficiency, implementation, and application. Application software, also known as an "application" or an "app", is computer software designed to help the user perform specific tasks. Recent interest in and demand for computing have led new technologies to emerge, among which cloud computing is one. Cloud Computing has become another buzzword a...

  20. A Comparative Analysis of Computer Literacy Education for Nurses

    Hardin, Richard C.; Skiba, Diane J.


    Despite recent advances by nursing in the computer field, computer literacy is a rarity among nursing professionals. Our analysis of existing educational models in nursing (baccalaureate, staff development, continuing education, and vendor) shows that no single educational strategy is likely to be effective in achieving computer literacy for all nurses. A refinement of the computer literacy concept is proposed which divides the educational needs of nurses into specific objectives based on desi...

  1. Modern Computational Techniques for the HMMER Sequence Analysis

    Xiandong Meng; Yanqing Ji


    This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of the sequence analysis, such as data and c...

  2. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    Bartocci, Ezio; Lió, Pietro


    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verificat...

  3. ANACROM - A computer code for chromatogram analysis

    The computer code was developed for automatic search for peaks and evaluation of chromatogram parameters such as: center, height, area, full width at half maximum (FWHM) and the FWHM/center ratio of each peak. (Author)
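To make the listed parameters concrete, here is a hypothetical Python sketch (not the ANACROM code, whose internals the abstract does not give) that estimates them for a single sampled peak:

```python
import math

def peak_parameters(x, y):
    """Estimate center, height, trapezoidal area, FWHM and the FWHM/center
    ratio of a single sampled peak. Illustrative only, not the ANACROM code."""
    height = max(y)
    i_max = y.index(height)
    center = x[i_max]
    # Trapezoidal rule for the area under the peak.
    area = sum((y[i] + y[i + 1]) * (x[i + 1] - x[i]) / 2 for i in range(len(x) - 1))
    half = height / 2

    def crossing(i):
        # Linear interpolation of the half-maximum crossing between samples i, i+1.
        return x[i] + (half - y[i]) * (x[i + 1] - x[i]) / (y[i + 1] - y[i])

    left = next(crossing(i) for i in range(i_max) if y[i] <= half < y[i + 1])
    right = next(crossing(i) for i in range(i_max, len(y) - 1) if y[i] >= half > y[i + 1])
    fwhm = right - left
    return center, height, area, fwhm, fwhm / center

# Synthetic Gaussian peak: center 5.0, sigma 0.5 (true FWHM = 2.3548 * sigma)
xs = [i * 0.01 for i in range(1001)]
ys = [math.exp(-((xv - 5.0) ** 2) / (2 * 0.5 ** 2)) for xv in xs]
center, height, area, fwhm, ratio = peak_parameters(xs, ys)
```

For a Gaussian peak the recovered FWHM should match 2.3548σ; real chromatograms would first need baseline subtraction and peak separation, which this sketch omits.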

  4. Behavior computing modeling, analysis, mining and decision


    Includes six case studies on behavior applications. Presents new techniques for capturing behavior characteristics in social media. First dedicated source of references for the theory and applications of behavior informatics and behavior computing.

  5. Towed Water Turbine Computational Fluid Dynamics Analysis

    Maughan, Robert G.


    Computational fluid dynamics can be used to predict operating conditions of towed water turbines which are used in long distance sailing applications to meet electrical demands. The design consists of a turbine fastened to a shaft which is attached to a generator by a rope. The turbine is pulled in water behind a sailboat and torque is transmitted through the rope to turn the onboard generator and produce power. Torque curves from an alternator, generator, and from computational fluid dynamic...

  6. Schottky signal analysis: tune and chromaticity computation

    Chanon, Ondine


    Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, after explaining the different components of a Schottky signal, an algorithm to compute the betatron tune is presented, followed by some ideas to compute machine chromaticity. The tests have been performed with offline and/or online LHC data.
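As an illustration of the tune side of this task (a generic sketch, not the algorithm from the report), the fractional betatron tune can be estimated as the dominant line in a discrete spectrum of turn-by-turn data:

```python
import cmath
import math

def fractional_tune(signal, n_freq=1000):
    """Scan a grid of fractional frequencies in (0, 0.5) with a naive DFT and
    return the one with the largest spectral amplitude. Illustrative sketch;
    real tune measurements use refined spectral estimators and interpolation."""
    n = len(signal)
    mean = sum(signal) / n
    data = [s - mean for s in signal]  # remove the DC component
    best_q, best_amp = 0.0, -1.0
    for k in range(1, n_freq):
        q = 0.5 * k / n_freq
        amp = abs(sum(d * cmath.exp(-2j * math.pi * q * i) for i, d in enumerate(data)))
        if amp > best_amp:
            best_q, best_amp = q, amp
    return best_q

# Synthetic turn-by-turn signal oscillating at fractional tune 0.31
turns = [math.cos(2 * math.pi * 0.31 * i) for i in range(256)]
q = fractional_tune(turns)
```

The brute-force frequency scan is only meant to show the idea; an FFT with peak interpolation achieves the same result far more efficiently on long LHC data records.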

  7. Current status of uncertainty analysis methods for computer models

    This report surveys several existing uncertainty analysis methods for estimating computer output uncertainty caused by input uncertainties, illustrating application examples of those methods to three computer models, MARCH/CORRAL II, TERFOC and SPARC. Merits and limitations of the methods are assessed in the application, and recommendation for selecting uncertainty analysis methods is provided. (author)

  8. Granular computing analysis and design of intelligent systems

    Pedrycz, Witold


    Information granules, as encountered in natural language, are implicit in nature. To make them fully operational so they can be effectively used to analyze and design intelligent systems, information granules need to be made explicit. An emerging discipline, granular computing focuses on formalizing information granules and unifying them to create a coherent methodological and developmental environment for intelligent system design and analysis. Granular Computing: Analysis and Design of Intelligent Systems presents the unified principles of granular computing along with its comprehensive algo

  9. A Comparison of Computer-Assisted Instruction and Tutorials in Hematology and Oncology.

    Garrett, T. J.; And Others


    A study comparing the effectiveness of computer-assisted instruction (CAI) and small group instruction found no significant difference in medical student achievement in oncology but higher achievement through small-group instruction in hematology. Students did not view CAI as more effective, but saw it as a supplement to traditional methods. (MSE)

  10. The Effectiveness of Computer-Assisted Instruction in Teaching Introductory Statistics

    Basturk, Ramazan


    The focus of this study is to demonstrate and discuss the educational advantages of Computer Assisted Instruction (CAI). A quasi-experimental design compared learning outcomes of participants in an introductory statistics course that integrated CAI to participants in a Lecture-only introductory statistics course. Reviews of participants' identical…

  11. Critical Thinking Outcomes of Computer-Assisted Instruction versus Written Nursing Process.

    Saucier, Bonnie L.; Stevens, Kathleen R.; Williams, Gail B.


    Nursing students (n=43) who used clinical case studies via computer-assisted instruction (CAI) were compared with 37 who used the written nursing process (WNP). California Critical Thinking Skills Test results did not show significant increases in critical thinking. The WNP method was more time consuming; the CAI group was more satisfied. Use of…

  12. The Effects of Trait Anxiety and Dogmatism on State Anxiety During Computer-Assisted Learning.

    Rappaport, Edward

    In this study of the interaction between anxiety trait (A-trait), anxiety state (A-state), and dogmatism in computer-assisted instruction (CAI), subjects were selected on the basis of extreme scores on a measure of anxiety and on a measure of dogmatism. The subjects were presented with a CAI task consisting of difficult mathematical problems. The…

  13. Secondary School Students' Attitudes towards Mathematics Computer--Assisted Instruction Environment in Kenya

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.


    This paper reports the results of research conducted in six classes (Form IV) with 205 students with a sample of 94 respondents. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) effectiveness of CAI in Mathematics instruction. The results indicated…

  14. The Evolution of Instructional Design Principles for Intelligent Computer-Assisted Instruction.

    Dede, Christopher; Swigger, Kathleen


    Discusses and compares the design and development of computer assisted instruction (CAI) and intelligent computer assisted instruction (ICAI). Topics discussed include instructional systems design (ISD), artificial intelligence, authoring languages, intelligent tutoring systems (ITS), qualitative models, and emerging issues in instructional…

  15. Experience with a distributed computing system for magnetic field analysis

    The development of a general purpose computer system, THESEUS, is described; its initial use has been magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and the problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  16. Analysis of airways in computed tomography

    Petersen, Jens

    Chronic Obstructive Pulmonary Disease (COPD) is a major cause of death and disability world-wide. It affects lung function through destruction of lung tissue, known as emphysema, and inflammation of airways, leading to thickened airway walls and narrowed airway lumen. Computed Tomography (CT) imaging...

  17. Practical Research on CAI in Earth Science Education (II): Teaching Examples in Junior and Senior High Schools

    Hayashi, Takehiro; Tanaka, Masaki; Arita, Masashi; Suzuki, Morihisa


    On CAI in earth science education, the authors hold the principle that students' interest in natural materials and phenomena must be promoted by the effective use of computers. Based upon this principle, BASIC programs for drawing three-dimensional topographic maps of western Hiroshima Prefecture were developed. Using the programs, earth science instruction was given in junior and senior high schools. Through this instruction, the students became more interested in the topography o...

  18. Cloud Computing for Rigorous Coupled-Wave Analysis

    N. L. Kazanskiy


    Design and analysis of complex nanophotonic and nanoelectronic structures require significant computing resources. Cloud computing infrastructure allows distributed parallel applications to achieve greater scalability and fault tolerance. The problems of effective use of high-performance computing systems for modeling and simulation of subwavelength diffraction gratings are considered. Rigorous coupled-wave analysis (RCWA) is adapted to a cloud computing environment. In order to accomplish this, the data flow of the RCWA is analyzed and CPU-intensive operations are converted to data-intensive operations. The generated data sets are structured in accordance with the requirements of MapReduce technology.

  19. Wing analysis using a transonic potential flow computational method

    Henne, P. A.; Hicks, R. M.


    The ability of the method to compute wing transonic performance was determined by comparing computed results with both experimental data and results computed by other theoretical procedures. Both pressure distributions and aerodynamic forces were evaluated. Comparisons indicated that the method is a significant improvement in transonic wing analysis capability. In particular, the computational method generally calculated the correct development of three-dimensional pressure distributions from subcritical to transonic conditions. Complicated, multiple shocked flows observed experimentally were reproduced computationally. The ability to identify the effects of design modifications was demonstrated both in terms of pressure distributions and shock drag characteristics.

  20. Computational thermo-fluid analysis of a disk brake

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kuraishi, Takashi; Tabata, Shinichiro; Takagi, Hirokazu


    We present computational thermo-fluid analysis of a disk brake, including thermo-fluid analysis of the flow around the brake and heat conduction analysis of the disk. The computational challenges include proper representation of the small-scale thermo-fluid behavior, high-resolution representation of the thermo-fluid boundary layers near the spinning solid surfaces, and bringing the heat transfer coefficient (HTC) calculated in the thermo-fluid analysis of the flow to the heat conduction analysis of the spinning disk. The disk brake model used in the analysis closely represents the actual configuration, and this adds to the computational challenges. The components of the method we have developed for computational analysis of this class of problems include the Space-Time Variational Multiscale method for coupled incompressible flow and thermal transport, the ST Slip Interface method for high-resolution representation of the thermo-fluid boundary layers near spinning solid surfaces, and a set of projection methods for different parts of the disk to bring in the HTC calculated in the thermo-fluid analysis. With the HTC coming from the thermo-fluid analysis of the flow around the brake, we carry out the heat conduction analysis of the disk, from the start of braking until the disk stops spinning, demonstrating how the method works in computational analysis of this complex and challenging problem.

  1. Computational analysis of ozonation in bubble columns

    This paper presents a new computational ozonation model based on the principle of computational fluid dynamics along with the kinetics of ozone decay and microbial inactivation to predict the performance of ozone disinfection in fine bubble columns. The model can be represented using a mixture two-phase flow model to simulate the hydrodynamics of the water flow and using two transport equations to track the concentration profiles of ozone and microorganisms along the height of the column, respectively. The applicability of this model was then demonstrated by comparing the simulated ozone concentrations with experimental measurements obtained from a pilot scale fine bubble column. One distinct advantage of this approach is that it does not require the prerequisite assumptions such as plug flow condition, perfect mixing, tanks-in-series, uniform radial or longitudinal dispersion in predicting the performance of disinfection contactors without carrying out expensive and tedious tracer studies. (author)

  2. Advances in Computer-Based Autoantibodies Analysis

    Soda, Paolo; Iannello, Giulio

    Indirect Immunofluorescence (IIF) imaging is the recommended method to detect autoantibodies in patient serum, whose common markers are antinuclear autoantibodies (ANA) and autoantibodies directed against double strand DNA (anti-dsDNA). Since the availability of accurately performed and correctly reported laboratory determinations is crucial for clinicians, an evident medical demand is the development of Computer Aided Diagnosis (CAD) tools supporting physicians' decisions.

  3. A computer program for PV systems analysis

    A computer program for analyzing solar cells and photovoltaic (PV) systems is described. The program, called PVC, was written in Visual Basic for Windows and is intended as a tool for studying individual cell characteristics as well as the PV system as a whole. This paper describes the mathematical models used in the program, gives an overview of the program, and presents its application in analyzing a BP275 PV system. (author)

  4. Computational Music Structure Analysis (Dagstuhl Seminar 16092)

    Müller, Meinard; Chew, Elaine; Bello, Juan Pablo


    Music is a ubiquitous and vital part of the lives of billions of people worldwide. Musical creations and performances are among the most complex and intricate of our cultural artifacts, and the emotional power of music can touch us in surprising and profound ways. In view of the rapid and sustained growth of digital music sharing and distribution, the development of computational methods to help users find and organize music information has become an important field of research in both indust...

  5. Analysis of computed tomography of ovarian tumor

    Omura, Makoto; Taniike, Keiko; Nishiguchi, Hiroyasu


    One hundred and twenty-six patients with ovarian masses were studied with computed tomography (CT) and classified into five groups according to margin and inner structure. The incidence of malignancy in cystic ovarian masses with a smooth margin was low, and that in solid ovarian masses with an irregular margin was high. Three cases (6.7 %) of malignant ovarian tumor demonstrated a completely cystic pattern. Ovarian teratomas contained a well defined component of fat density.

  6. Computer-aided Analysis of Physiological Systems

    Balázs Benyó


    This paper presents the recent biomedical engineering research activity of the Medical Informatics Laboratory at the Budapest University of Technology and Economics. The research projects are carried out in the following fields: Computer aided identification of physiological systems; Diabetic management and blood glucose control; Remote patient monitoring and diagnostic system; Automated system for analyzing cardiac ultrasound images; Single-channel hybrid ECG segmentation; Event recognition and ...

  7. Affect and Learning: a computational analysis

    Broekens, Douwe Joost


    In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation to an artificial agent in a reinforcement learning setting. We have done a range of different experiments to study the effect of affect on learning, including the effect on learning if affect is us...

  8. Parameterized complexity analysis in computational biology.

    Bodlaender, H L; Downey, R G; Fellows, M R; Hallett, M T; Wareham, H T


    Many computational problems in biology involve parameters for which a small range of values covers important applications. We argue that for many problems in this setting, parameterized computational complexity rather than NP-completeness is the appropriate tool for studying apparent intractability. At issue in the theory of parameterized complexity is whether a problem can be solved in time O(n^alpha) for each fixed parameter value, where alpha is a constant independent of the parameter. In addition to surveying this complexity framework, we describe a new result for the Longest Common Subsequence problem. In particular, we show that the problem is hard for W[t] for all t when parameterized by the number of strings and the size of the alphabet. Lower bounds on the complexity of this basic combinatorial problem imply lower bounds on more general sequence alignment and consensus discovery problems. We also describe a number of open problems pertaining to the parameterized complexity of problems in computational biology where small parameter values are important. PMID:7796275
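The hardness result above concerns the many-string case; for two strings the Longest Common Subsequence problem is easy via the classic dynamic program (a standard textbook sketch, not code from the paper):

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of two strings,
    computed by the classic O(|a|*|b|) dynamic program."""
    # dp[i][j] = LCS length of the prefixes a[:i] and b[:j]
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, ca in enumerate(a, 1):
        for j, cb in enumerate(b, 1):
            if ca == cb:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]

# A longest common subsequence of these two strings is "BCBA", of length 4.
length = lcs_length("ABCBDAB", "BDCABA")
```

The same table generalizes to k strings at cost O(n^k), i.e. the parameter k ends up in the exponent; the W[t]-hardness cited above is evidence that no algorithm can move the number of strings out of the exponent.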

  9. Soft computing techniques in voltage security analysis

    Chakraborty, Kabir


    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  10. Multivariate analysis: A statistical approach for computations

    Michu, Sachin; Kaushik, Vandana


    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
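The correlation-coefficient-matrix approach mentioned above can be sketched in a few lines of plain Python (a generic Pearson-correlation example, not the paper's detection method):

```python
import math

def corr_matrix(rows):
    """Pearson correlation coefficient matrix of the columns of `rows`
    (a list of equal-length observation vectors)."""
    cols = list(zip(*rows))
    n = len(rows)
    means = [sum(c) / n for c in cols]
    sds = [math.sqrt(sum((v - m) ** 2 for v in c) / n) for c, m in zip(cols, means)]

    def r(i, j):
        # Covariance of columns i and j, normalized by both standard deviations.
        cov = sum((a - means[i]) * (b - means[j])
                  for a, b in zip(cols[i], cols[j])) / n
        return cov / (sds[i] * sds[j])

    k = len(cols)
    return [[r(i, j) for j in range(k)] for i in range(k)]

# Three features: the second tracks the first, the third opposes it.
M = corr_matrix([[x, 2 * x, -x] for x in range(10)])
```

An anomaly detector built on this idea would flag traffic windows whose correlation matrix deviates strongly from a baseline matrix, as happens when a DDoS flood or a scan perturbs the usual relationships between traffic features.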

  11. Computational Neural Networks: A New Paradigm for Spatial Analysis

    Fischer, M.M.


    In this paper a systematic introduction to computational neural network models is given in order to help spatial analysts learn about this exciting new field. The power of computational neural networks vis-à-vis conventional modelling is illustrated for an application field with noisy data of limited record length: spatial interaction modelling of telecommunication data in Austria. The computational appeal of neural networks for solving some fundamental spatial analysis problems is summarized...

  12. Alan Turing and the foundations of computable analysis

    Gherardi, Guido


    We investigate Turing's contributions to computability theory for real numbers and real functions presented in [22, 24, 26]. In particular, it is shown how two fundamental approaches to computable analysis, the so-called ‘Type-2 Theory of Effectivity' (TTE) and the ‘real RAM machine' model, have their foundations in Turing's work, in spite of the two incompatible notions of computability they involve. It is also shown, by contrast, how the modern conceptual tools provided by the...

  13. Computational morphology a computational geometric approach to the analysis of form

    Toussaint, GT


    Computational Geometry is a new discipline of computer science that deals with the design and analysis of algorithms for solving geometric problems. There are many areas of study in different disciplines which, while being of a geometric nature, have as their main component the extraction of a description of the shape or form of the input data. This notion is more imprecise and subjective than pure geometry. Such fields include cluster analysis in statistics, computer vision and pattern recognition, and the measurement of form and form-change in such areas as stereology and developmental biolo

  14. The Reliability of Content Analysis of Computer Conference Communication

    Rattleff, Pernille


    The focus of this article is the reliability of content analysis of students' computer conference communication. Content analysis is often used when researching the relationship between learning and the use of information and communications technology in educational settings. A number of studies where content analysis is used and classification…

  15. The Effect of Computer Assisted Instruction on Elementary Reading and Writing Achievement

    H. Gülhan ORHAN KARSAK


    The research investigated the effect of computer assisted instruction (CAI) on elementary reading and writing achievement (ERWA). The sample consisted of 64 first graders (32 in the experimental group and 32 in the control group) in the 2006-2007 academic year. This quasi-experimental study had a posttest-only control group design and was conducted during the first semester. The experimental group was taught by CAI and the control group was taught by traditional instruction. Data were gathered through the ‘Parent Questionnaire’, ‘Reading Concepts Scale’, ‘Achievement Test’, and ‘Reading and Handwriting Observation Form’ and analyzed by chi-square, frequency and t tests in SPSS 12.0. The main findings of the study were as follows: (1) CAI affected first graders’ handwriting, reading fluency and punctuation; (2) CAI did not affect their writing and reading comprehension; (3) CAI affected the ERWA of those who did not have a computer at home.

  16. Analysis of service-oriented computing systems

    Ivanovic, Dragan


    Service-Oriented Computing (SOC) has become a widely accepted paradigm for the development of flexible, distributed and adaptable software systems, in which service compositions perform the more complex or higher-level tasks, frequently inter-organizational tasks, using atomic services or other service compositions. In such systems, the Quality of Service (QoS) properties...

  17. Computer-aided Analysis of Physiological Systems

    Balázs Benyó


    This paper presents the recent biomedical engineering research activity of the Medical Informatics Laboratory at the Budapest University of Technology and Economics. The research projects are carried out in the following fields: Computer aided identification of physiological systems; Diabetic management and blood glucose control; Remote patient monitoring and diagnostic system; Automated system for analyzing cardiac ultrasound images; Single-channel hybrid ECG segmentation; Event recognition and state classification to detect brain ischemia by means of EEG signal processing; Detection of breathing disorders like apnea and hypopnea; Molecular biology studies with DNA-chips; Evaluation of the cry of normal hearing and hard of hearing infants.

  18. Computer-Based Interaction Analysis with DEGREE Revisited

    Barros, B.; Verdejo, M. F.


    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  19. The symbolic computation and automatic analysis of trajectories

    Grossman, Robert


    Research was generally done on computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using data base computing.

  20. Atomic physics: computer calculations and theoretical analysis

    Drukarev, E. G.


    It is demonstrated how theoretical analysis preceding numerical calculations helps to calculate the ground-state energy of the helium atom and makes it possible to avoid qualitative errors in calculations of the characteristics of double photoionization.

  1. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    Trausan-Matu, Stefan


    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  2. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project

  3. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Ezio Bartocci


    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  4. Process for computing geometric perturbations for probabilistic analysis

    Fitch, Simeon H. K.; Riha, David S.; Thacker, Ben H.


    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  5. Computer System Analysis for Decommissioning Management of Nuclear Reactor

    Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully. A computer-based system needs to be developed to support nuclear reactor decommissioning. Some computer systems have been studied for the management of nuclear power reactors. The software systems COSMARD and DEXUS, developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, a calculation module for the stages of the decommissioning phase, and spatial data system development for virtual reality. (author)

  6. VLF radio propagation conditions. Computational analysis techniques

    Very low frequency (VLF) radio waves propagate within the Earth-ionosphere waveguide with very little attenuation. Modifications of the waveguide geometry affect the propagation conditions and, hence, the attenuation. Changes in the ionosphere, such as the presence of the D-region during the day or the precipitation of energetic particles, are the main causes of this modification. Using narrowband receivers monitoring VLF transmitters, the amplitude and phase of these signals are recorded. Multivariate data analysis techniques, namely Principal Component Analysis (PCA) and Singular Spectrum Analysis (SSA), are applied to the data in order to determine parameters, such as seasonal and diurnal changes, affecting the variation of these signals. Transient effects may then be easier to detect.


    Basudeb Roy Chaudhury


    Full Text Available This experimental study compared the academic performance of class-X (ten) students in a Bengali-medium school in a rural area of Burdwan District, West Bengal, India, under traditional instruction and under Computer Assisted Instruction (CAI) with simultaneous discussion. The design used in this study was a pre-test and post-test applied to a control group and an experimental group. Fifty class-X students were selected and two groups were formed; students were assigned to each group randomly. Statistical methods were used in the data analysis. A significant difference was found between the post-test scores of students receiving the traditional method and those receiving CAI with simultaneous discussion, revealing that CAI with simultaneous discussion is more effective than the traditional method.

  8. Computer modeling for neutron activation analysis methods

    Full text: The INP AS RU develops databases for neutron activation analysis: ND INAA [1] and ELEMENT [2]. Based on these databases, an automated complex is under construction, aimed at modeling methods for the analysis of natural and technogenic materials. It is well known that there is a variety of analysis objects with wide spectra and different compositions and concentrations of elements, which makes it impossible to develop universal methods applicable to every analytical task. The modelling is based on an algorithm that computes the irradiation time in the nuclear reactor required to provide the sample's total absorption and activity analytical peak areas with given errors. The analytical complex was tested for low-element analysis (determination of Fe and Zn in vegetation samples, and Cu, Ag and Au in technological objects). At present, the complex is applied to multielemental analysis of sediment samples. In this work, modern achievements in analytical chemistry (measurement facilities, high-resolution detectors, IAEA and IUPAC databases) and information technology applications (Java software, database management systems (DBMS), internet technologies) are applied. References: 1. Tillaev T., Umaraliev A., Gurvich L.G., Yuldasheva K., Kadirova J. Specialized database for instrumental neutron activation analysis - ND INAA 1.0, The 3rd Eurasian Conference "Nuclear Science and its Applications", 2004, pp. 270-271; 2. Gurvich L.G., Tillaev T., Umaraliev A. The information-analytical database on the element contents of natural objects. The 4th International Conference "Modern Problems of Nuclear Physics", Samarkand, 2003, p. 337. (authors)


    M. Kasemann

    Overview During the past three months activities were focused on data operations, on testing and reinforcing shift and operational procedures for data production and transfer, on MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  10. Limited subsolidus diffusion in type B1 CAI: Evidence from Ti distribution in spinel

    Meeker, G. P.; Quick, J. E.; Paque, Julie M.


    Most models of calcium-aluminum-rich inclusions (CAI) have focused on the early stages of formation by equilibrium crystallization of a homogeneous liquid. Less is known about the subsolidus cooling history of CAI. Chemical and isotopic heterogeneities on a scale of tens to hundreds of micrometers (e.g. MacPherson et al. (1989) and Podosek et al. (1991)) suggest fairly rapid cooling with a minimum of subsolidus diffusion. However, transmission electron microscopy indicates that solid-state diffusion may have been an important process at a smaller scale (Barber et al. 1984). If so, chemical evidence for diffusion could provide constraints on cooling times and temperatures. With this in mind, we have begun an investigation of the Ti distribution in spinels from two type B1 CAI from Allende to determine if post-crystallization diffusion was a significant process. The type B1 CAIs, 3529Z and 5241, have been described by Podosek et al. (1991), El Goresy et al. (1985) and MacPherson et al. (1989). We have analyzed the spinels in these inclusions using the electron microprobe. These spinels are generally euhedral, range in size from less than 10 to 15 microns, and are poikilitically enclosed by millimeter-sized pyroxene, melilite, and anorthite. Analyses were obtained from both the mantles and cores of the inclusions. Compositions of pyroxene in the vicinity of individual spinel grains were obtained by analyzing at least two points on opposite sides of the spinel and averaging the compositions. The pyroxene analyses were obtained within 15 microns of the spinel-pyroxene interface. No compositional gradients were observed within single spinel crystals. Ti concentrations in spinels included within pyroxene, melilite, and anorthite are presented.

  11. Development of Computer Science Disciplines - A Social Network Analysis Approach

    Pham, Manh Cuong; Jarke, Matthias


    In contrast to many other scientific disciplines, computer science attaches great importance to conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their work with peers. Previous work on knowledge mapping focused on mapping all sciences or a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and workshop proceedings, which results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...



    This collection of papers includes the proceedings of the Ninth International Conference “Computer Data Analysis and Modeling: Complex Stochastic Data and Systems”, organized by the Belarusian State University and held in September 2010 in Minsk. The papers are devoted to topical problems: robust and nonparametric data analysis; statistical analysis of time series and forecasting; multivariate data analysis; design of experiments; statistical signal and image processing...

  13. Computational analysis of thresholds for magnetophosphenes

    In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes, visual sensations of flashing light not caused by light. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way for determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m-2 (−20% to +30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (−20% to +10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of

  14. HIV-1 Capsid Assembly Inhibitor (CAI) Peptide: Structural Preferences and Delivery into Human Embryonic Lung Cells and Lymphocytes

    Braun, Klaus; Frank, Martin; Pipkorn, Rüdiger; Reed, Jennifer; Spring, Herbert; Debus, Jürgen; Didinger, Bernd; von der Lieth, Claus-Wilhelm; Wiessler, Manfred; Waldeck, Waldemar


    The Human immunodeficiency virus 1 derived capsid assembly inhibitor peptide (HIV-1 CAI-peptide) is a promising lead candidate for anti-HIV drug development. Its drawback, however, is that it cannot permeate cells directly. Here we report the transport of the pharmacologically active CAI-peptide into human lymphocytes and Human Embryonic Lung cells (HEL) using the BioShuttle platform. Generally, the transfer of pharmacologically active substances across membranes, demonstrated by confocal las...

  15. HIV-1 Capsid Assembly Inhibitor (CAI) Peptide: Structural Preferences and Delivery into Human Embryonic Lung Cells and Lymphocytes

    Klaus Braun, Martin Frank, Rüdiger Pipkorn, Jennifer Reed, Herbert Spring, Jürgen Debus, Bernd Didinger, Claus-Wilhelm von der Lieth, Manfred Wiessler, Waldemar Waldeck


    The Human immunodeficiency virus 1 derived capsid assembly inhibitor peptide (HIV-1 CAI-peptide) is a promising lead candidate for anti-HIV drug development. Its drawback, however, is that it cannot permeate cells directly. Here we report the transport of the pharmacologically active CAI-peptide into human lymphocytes and Human Embryonic Lung cells (HEL) using the BioShuttle platform. Generally, the transfer of pharmacologically active substances across membranes, demonstrated by confocal laser sca...

  16. Hunting and use of terrestrial fauna used by Caiçaras from the Atlantic Forest coast (Brazil)

    Alves Rômulo RN; Hanazaki Natalia; Begossi Alpina


    Abstract Background The Brazilian Atlantic Forest is considered one of the hotspots for conservation, comprising remnants of rain forest along the eastern Brazilian coast. Its native inhabitants in the Southeastern coast include the Caiçaras (descendants from Amerindians and European colonizers), with a deep knowledge on the natural resources used for their livelihood. Methods We studied the use of the terrestrial fauna in three Caiçara communities, through open-ended interviews with 116 nati...

  17. Adaptive computational methods for aerothermal heating analysis

    Price, John M.; Oden, J. Tinsley


    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  18. Local spatial frequency analysis for computer vision

    Krumm, John; Shafer, Steven A.


    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.

  19. Interactive Computer Lessons for Introductory Economics: Guided Inquiry-From Supply and Demand to Women in the Economy.

    Miller, John; Weil, Gordon


    The interactive feature of computers is used to incorporate a guided inquiry method of learning introductory economics, extending the Computer Assisted Instruction (CAI) method beyond drills. (Author/JDH)

  20. Computer-Assisted Education System for Psychopharmacology.

    McDougall, William Donald

    An approach to the use of computer assisted instruction (CAI) for teaching psychopharmacology is presented. A project is described in which, using the TUTOR programming language on the PLATO IV computer system, several computer programs were developed to demonstrate the concepts of aminergic transmitters in the central nervous system. Response…

  1. Crystal structures of hydrates of simple inorganic salts. II. Water-rich calcium bromide and iodide hydrates: CaBr2 · 9H2O, CaI2 · 8H2O, CaI2 · 7H2O and CaI2 · 6.5H2O.

    Hennings, Erik; Schmidt, Horst; Voigt, Wolfgang


    Single crystals of calcium bromide enneahydrate, CaBr2 · 9H2O, calcium iodide octahydrate, CaI2 · 8H2O, calcium iodide heptahydrate, CaI2 · 7H2O, and calcium iodide 6.5-hydrate, CaI2 · 6.5H2O, were grown from their aqueous solutions at and below room temperature according to the solid-liquid phase diagram. The crystal structure of CaI2 · 6.5H2O was redetermined. All four structures are built up from distorted Ca(H2O)8 antiprisms. The antiprisms of the iodide hydrate structures are connected either via trigonal-plane-sharing or edge-sharing, forming dimeric units. The antiprisms in calcium bromide enneahydrate are monomeric. PMID:25186361


    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data are a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics model developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.


    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang


    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data are a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics model developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.

  4. Computer-aided safety analysis of computer-controlled systems : a case example

    Biegert, Uwe


    Computer controlled systems consist of a complex interaction between technical process, human task and software. For the development of safety-critical systems, new methods are required which do not consider only one of these parts of a computer-controlled system. In this paper a qualitative modeling method is presented. The method is called SQMA, Situation-based Qualitative Modeling and Analysis, and its origin goes back to Qualitative Reasoning. First, all parts of a system are modeled separated a...

  5. Oxygen isotopes in the early protoplanetary disk inferred from pyroxene in a classical type B CAI

    Aléon, Jérôme


    A major unanswered question in solar system formation is the origin of the oxygen isotopic dichotomy between the Sun and the planets. Individual Calcium-Aluminum-rich inclusions (CAIs) from CV chondrites exhibit almost the full isotopic range, but how their composition evolved is still unclear, which prevents robust astrochemical conclusions. A key issue is notably the yet unsolved origin of the 16O-rich isotopic composition of pyroxene in type B CAIs. Here, I report an in-situ oxygen isotope study of the archetypal type B CAI USNM-3529-Z from Allende with emphasis on the isotopic composition of pyroxene and its isotopic and petrographic relationships with other major minerals. The O isotopic composition of pyroxene is correlated with indicators of magmatic growth, indicating that the pyroxene evolved from a 16O-poor composition and became progressively enriched in 16O during its crystallization, contrary to the long held assumption that pyroxene was initially 16O-rich. This variation is well explained by isotopic exchange between a 16O-poor partial melt having the isotopic composition of melilite and a 16O-rich gas having the isotopic composition of spinel, during pyroxene crystallization. The isotopic evolution of 3529-Z is consistent with formation in an initially 16O-rich environment where spinel and gehlenitic melilite crystallized, followed by a 16O-depletion associated with melilite partial melting and recrystallization and finally a return to the initial 16O-rich environment before pyroxene crystallization. This strongly suggests that the environment of CAI formation was globally 16O-rich, with local 16O-depletions systematically associated with high temperature events. The Al/Mg isotopic systematics of 3529-Z further indicates that this suite of isotopic changes occurred in the first 150 000 yr of the solar system, during the main CAI formation period. A new astrophysical setting is proposed, where the 16O-depletion occurs in an optically thin surface

  6. Conference “Computational Analysis and Optimization” (CAO 2011)

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday


    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  7. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a cluster architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the SLURM software resource manager and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs, able to reach a computing power of 300 GFlops (300 × 10^9 floating-point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage in UFS configuration, plus 6 TB for the users' area. AVES was designed and built to solve growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB), which increases every year. The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload over the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained by a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of on-line data storage. The AVES software package consists of about 50 specific programs. The overall computing time, compared to that of a personal computer with a single processor, has thus been improved by a factor of up to 70.
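    The divide-into-N-jobs idea can be sketched in a few lines. This is a hypothetical stand-in, not the AVES tools or the OSA package: `split` and `analyze` are illustrative names, and the "analysis" is a dummy serial computation standing in for an external pipeline invocation.

```python
import multiprocessing as mp

def analyze(job):
    """Placeholder for one serial analysis task over a chunk of the
    workload (hypothetical; a real pipeline would run external programs)."""
    start, stop = job
    return sum(range(start, stop))  # stand-in for real work

def split(n_items, n_jobs):
    """Divide n_items units of work into n_jobs contiguous chunks."""
    step = -(-n_items // n_jobs)  # ceiling division
    return [(i, min(i + step, n_items)) for i in range(0, n_items, step)]

if __name__ == "__main__":
    jobs = split(1000, 4)                # N jobs for N cores
    with mp.Pool(processes=4) as pool:
        partial = pool.map(analyze, jobs)
    print(sum(partial))                  # 499500, equal to the serial result
```

    Because the chunks are independent and their partial results combine at the end, the serial package need not be parallelized itself; the speedup comes purely from scheduling the N jobs onto N cores.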

  8. A Computational Discriminability Analysis on Twin Fingerprints

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical and fraternal twins) recently collected from a twin population, using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of twins' fingerprints differs significantly from that between genuine prints of the same finger. Twins' fingerprints are discriminable, with an EER 1.5%–1.7% higher than for non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.
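    The equal error rate (EER) quoted above can be illustrated with a toy computation (hypothetical score distributions, not the study's data): the EER is the operating point where the false-accept rate on impostor scores equals the false-reject rate on genuine scores, and a twin impostor distribution that sits closer to the genuine one yields a higher EER.

```python
import numpy as np

def eer(genuine, impostor):
    """Equal error rate: sweep thresholds over all observed scores and
    return the point where false-accept and false-reject rates cross."""
    scores = np.sort(np.concatenate([genuine, impostor]))
    best_gap, best_eer = 1.0, 0.5
    for t in scores:
        far = float(np.mean(impostor >= t))   # impostors wrongly accepted
        frr = float(np.mean(genuine < t))     # genuine pairs wrongly rejected
        gap = abs(far - frr)
        if gap < best_gap:
            best_gap, best_eer = gap, (far + frr) / 2.0
    return best_eer

rng = np.random.default_rng(1)
genuine = rng.normal(0.80, 0.1, 1000)   # match scores, same finger (toy)
nontwin = rng.normal(0.30, 0.1, 1000)   # impostor scores, unrelated person
twin = rng.normal(0.42, 0.1, 1000)      # impostor scores, twin sibling
print(f"non-twin EER: {eer(genuine, nontwin):.3f}")
print(f"twin EER:     {eer(genuine, twin):.3f}")
```

    Shifting the impostor distribution toward the genuine one, as twin similarity does, visibly raises the EER, which is the effect the abstract quantifies at 1.5%–1.7%.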


    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing, Software and Analysis challenge (CSA06) last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping up to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...


    I. Fisk


    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  11. Computer programs for analysis of geophysical data

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, continuous seismic oscillations, like endogenous microseisms, coda and scattered waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires a long record length for the best spatial resolution

  12. Structural Analysis Using Computer Based Methods

    Dietz, Matthew R.


    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were made to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  13. Computer aided information system for a PWR

    The computer aided information system (CAIS) is designed with a view to improving the performance of the operator. CAIS assists the plant operator in an advisory and support role, thereby reducing the workload level and potential human errors. The CAIS as explained here has been designed for a PWR of type KLT-40 used in Floating Nuclear Power Stations (FNPS). However, the underlying philosophy evolved in designing the CAIS can be suitably adapted for other types of nuclear power plants too (BWR, PHWR). Operator information is divided into three broad categories: a) continuously available information, b) automatically available information and c) on-demand information. Two touch screens are provided on the main control panel. One is earmarked for continuously available information and the other is dedicated to automatically available information. Both screens can be used at the operator's discretion for on-demand information. The automatically available information screen overrides the on-demand information screens. In addition to the above, CAIS has the features of event sequence recording, disturbance recording and information documentation. The CAIS design ensures that the operator is not overburdened with excess and unnecessary information, but at the same time adequate and well-formatted information is available. (author). 5 refs., 4 figs

  14. Sensitivity analysis for computational models of biochemical systems



    Systems biology is an integrated area of science which aims at the analysis of biochemical systems from a holistic perspective. In this context, sensitivity analysis, a technique studying how the output variation of a computational model can be attributed to its input state, plays a pivotal role. In the thesis it is described how to properly apply the different sensitivity analysis techniques according to the specific case study (i.e., continuous deterministic rather than discrete stochastic...
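As a minimal illustration of the local (one-at-a-time) flavor of sensitivity analysis mentioned above, the following sketch estimates finite-difference sensitivities of a toy Michaelis-Menten rate model; the model, parameter names, and values are illustrative assumptions, not taken from the thesis:

```python
# One-at-a-time (OAT) local sensitivity sketch for a deterministic
# model; the model and parameter values are illustrative assumptions.

def model(params):
    """Toy biochemical rate: v = Vmax * S / (Km + S)."""
    vmax, km = params["Vmax"], params["Km"]
    s = 2.0  # fixed substrate concentration (assumed)
    return vmax * s / (km + s)

def local_sensitivities(f, params, rel_step=1e-6):
    """Central-difference sensitivity df/dp for each parameter p."""
    sens = {}
    for name, value in params.items():
        h = rel_step * max(abs(value), 1.0)
        hi = dict(params, **{name: value + h})
        lo = dict(params, **{name: value - h})
        sens[name] = (f(hi) - f(lo)) / (2 * h)
    return sens

params = {"Vmax": 10.0, "Km": 0.5}
print(local_sensitivities(model, params))
```

Global (e.g. variance-based) techniques would instead sample the whole input space, which is the distinction the thesis draws between case studies.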

  15. Benefits of Computer Based Content Analysis to Foresight

    Kováříková, Ludmila; Grosová, Stanislava


    Purpose of the article: The present manuscript summarizes benefits of the use of computer-based content analysis in a generation phase of foresight initiatives. Possible advantages, disadvantages and limitations of the content analysis for the foresight projects are discussed as well. Methodology/methods: In order to specify the benefits and identify the limitations of the content analysis within the foresight, results of the generation phase of a particular foresight project perf...

  16. Analysis of the Naval Postgraduate School computer network architecture

    Wiedenhoeft, Paul Eric


    The computer network on the Naval Postgraduate School campus has become an integral part of the operations of the Naval Postgraduate School organization. An analysis of the network architecture will help formulate strategic plans that will support the network and the Naval Postgraduate School to the end of the century. This study describes the Naval Postgraduate School computer network architecture, driving forces, limitations, and possible measures of network benefits. It considers network al...

  17. Computer-based modelling and analysis in engineering geology

    Giles, David


    This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...

  18. Strategic Analysis of Autodesk and the Move to Cloud Computing

    Kewley, Kathleen


    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  19. Numeric computation and statistical data analysis on the Java platform

    Chekanov, Sergei V


    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  20. Parallel computation of seismic analysis of high arch dam

    Chen Houqun; Ma Huaifa; Tu Jin; Cheng Guangqing; Tang Juzhen


    Parallel computation programs are developed for three-dimensional meso-mechanics analysis of fully-graded dam concrete and seismic response analysis of high arch dams (ADs), based on the Parallel Finite Element Program Generator (PFEPG). The computational algorithms of the numerical simulation of the meso-structure of concrete specimens were studied. Taking into account damage evolution, static preload, strain rate effect, and the heterogeneity of the meso-structure of dam concrete, the fracture processes of damage evolution and configuration of the cracks can be directly simulated. In the seismic response analysis of ADs, all the following factors are involved, such as the nonlinear contact due to the opening and slipping of the contraction joints, energy dispersion of the far-field foundation, dynamic interactions of the dam-foundation-reservoir system, and the combining effects of seismic action with all static loads. The correctness, reliability and efficiency of the two parallel computational programs are verified with practical illustrations.


    Krot, A N; Chaussidon, M; Yurimoto, H; Sakamoto, N; Nagashima, K; Hutcheon, I D; MacPherson, G J


    Based on the mineralogy and petrography, coarse-grained, igneous, anorthite-rich (Type C) calcium-aluminum-rich inclusions (CAIs) in the CV3 carbonaceous chondrite Allende have recently been divided into three groups: (i) CAIs with melilite and Al,Ti-diopside of massive and lacy textures (coarse grains with numerous rounded inclusions of anorthite) in a fine-grained anorthite groundmass (6-1-72, 100, 160), (ii) CAI CG5 with massive melilite, Al,Ti-diopside and anorthite, and (iii) CAIs associated with chondrule material: either containing chondrule fragments in their peripheries (ABC, TS26) or surrounded by chondrule-like, igneous rims (93) (Krot et al., 2007a,b). Here, we report in situ oxygen isotopic measurements of primary (melilite, spinel, Al,Ti-diopside, anorthite) and secondary (grossular, monticellite, forsterite) minerals in these CAIs. Spinel (Δ¹⁷O = −25‰ to −20‰), massive and lacy Al,Ti-diopside (Δ¹⁷O = −20‰ to −5‰) and fine-grained anorthite (Δ¹⁷O = −15‰ to −2‰) in 100, 160 and 6-1-72 are ¹⁶O-enriched relative to spinel and coarse-grained Al,Ti-diopside and anorthite in ABC, 93 and TS26 (Δ¹⁷O ranges from −20‰ to −15‰, from −15‰ to −5‰, and from −5‰ to 0‰, respectively). In 6-1-72, massive and lacy Al,Ti-diopside grains are ¹⁶O-depleted (Δ¹⁷O ≈ −13‰) relative to spinel (Δ¹⁷O = −23‰). Melilite is the most ¹⁶O-depleted mineral in all Allende Type C CAIs. In CAI 100, melilite and secondary grossular, monticellite and forsterite (minerals replacing melilite) are similarly ¹⁶O-depleted, whereas grossular in CAI 160 is ¹⁶O-enriched (Δ¹⁷O = −10‰ to −6‰) relative to melilite (Δ¹⁷O = −5‰ to −3‰). We infer

  2. Computer aided plant engineering: An analysis and suggestions for computer use

    To obtain guidelines and boundary conditions for computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for the integration of CAD subsystems in plant engineering should be a central database, which is described by characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer aided plant engineering are briefly illustrated by two examples. The analysis leads to the conclusion that an interactive graphic system for the manipulation of net-like structured data, usable for various subtasks, should be the base for computer aided plant engineering. (orig.)

  3. Computer-Assisted Learning Design for Reflective Practice Supporting Multiple Learning Styles for Education and Training in Pre-Hospital Emergency Care.

    Jones, Indra; Cookson, John


    Students in paramedic education used a model combining computer-assisted instruction (CAI), reflective practice, and learning styles. Although reflective practice normally requires teacher-student interaction, CAI with reflective practice embedded enabled students to develop learning style competencies and achieve curricular outcomes. (SK)

  4. Computer vision approaches to medical image analysis. Revised papers

    This book constitutes the thoroughly refereed post-proceedings of the international workshop Computer Vision Approaches to Medical Image Analysis, CVAMIA 2006, held in Graz, Austria in May 2006 as a satellite event of the 9th European Conference on Computer Vision, ECCV 2006. The 10 revised full papers and 11 revised poster papers presented together with 1 invited talk were carefully reviewed and selected from 38 submissions. The papers are organized in topical sections on clinical applications, image registration, image segmentation and analysis, and the poster session. (orig.)

  5. Computational Fluid Dynamics Analysis of Thoracic Aortic Dissection

    Tang, Yik; Fan, Yi; Cheng, Stephen; Chow, Kwok


    Thoracic Aortic Dissection (TAD) is a cardiovascular disease with high mortality. An aortic dissection is formed when blood infiltrates the layers of the vascular wall and a new artificial channel, the false lumen, is created. The expansion of the blood vessel due to the weakened wall increases the risk of rupture. Computational fluid dynamics analysis is performed to study the hemodynamics of this pathological condition. Both idealized geometries and realistic patient configurations from computed tomography (CT) images are investigated. Physiological boundary conditions from in vivo measurements are employed. Flow configuration and biomechanical forces are studied. Quantitative analysis allows clinicians to assess the risk of rupture when making decisions regarding surgical intervention.

  6. Computational Analysis of the SRS Phase III Salt Disposition Alternatives

    Completion of the Phase III evaluation and comparison of salt disposition alternatives was supported with enhanced computer models and analysis for each case on the "short list" of four options. SPEEDUP(TM) models and special-purpose models describing mass and energy balances and flow rates were developed and used to predict performance and production characteristics for each of the options. Results from the computational analysis were a key part of the input used to select a primary and an alternate salt disposition alternative

  7. Computer analysis of failures in nuclear power station

    Computer analysis of minor failures at nuclear power plants has been carried out at the Institute for Electrical Power Research (VEIKI) since 1976. The research work was mainly directed at the computer-based application of methods to be used at the Paks Nuclear Power Plant; the proposed procedures, however, can also be used at traditional power plants. The paper describes the general aims and main steps of failure analysis and summarizes the state of the art and perspectives of R and D in Hungary. (N.I.)

  8. Advances in computational design and analysis of airbreathing propulsion systems

    Klineberg, John M.


    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  9. Superfast robust digital image correlation analysis with parallel computing

    Pan, Bing; Tian, Long


    Existing digital image correlation (DIC) using the robust reliability-guided displacement tracking (RGDT) strategy for full-field displacement measurement is a path-dependent process that can only be executed sequentially. This path-dependent tracking strategy not only limits the potential of DIC for further improvement of its computational efficiency but also wastes the parallel computing power of modern computers with multicore processors. To maintain the robustness of the existing RGDT strategy and to overcome its deficiency, an improved RGDT strategy using a two-section tracking scheme is proposed. In the improved RGDT strategy, the calculated points with correlation coefficients higher than a preset threshold are all taken as reliably computed points and given the same priority to extend the correlation analysis to their neighbors. Thus, DIC calculation is first executed in parallel at multiple points by separate independent threads. Then for the few calculated points with correlation coefficients smaller than the threshold, DIC analysis using existing RGDT strategy is adopted. Benefiting from the improved RGDT strategy and the multithread computing, superfast DIC analysis can be accomplished without sacrificing its robustness and accuracy. Experimental results show that the presented parallel DIC method performed on a common eight-core laptop can achieve about a 7 times speedup.
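The two-section idea above (a parallel first pass over all points, then sequential handling of only the low-confidence remainder) can be sketched as follows. The `zncc` correlation, the toy subsets, and the 0.9 threshold are illustrative assumptions, not the authors' implementation:

```python
# Sketch of two-section tracking: correlation coefficients for many
# points are computed independently in parallel; only points below a
# reliability threshold are deferred to a sequential, guided pass.
from concurrent.futures import ThreadPoolExecutor
from math import sqrt

def zncc(a, b):
    """Zero-normalized cross-correlation of two equal-length subsets."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def two_section_tracking(subset_pairs, threshold=0.9):
    # Section 1: all points processed independently, in parallel threads.
    with ThreadPoolExecutor() as pool:
        coeffs = list(pool.map(lambda p: zncc(*p), subset_pairs))
    reliable = [i for i, c in enumerate(coeffs) if c >= threshold]
    # Section 2: the few low-confidence points would be re-analysed
    # sequentially, seeded by reliable neighbours (placeholder here).
    pending = [i for i, c in enumerate(coeffs) if c < threshold]
    return reliable, pending, coeffs

ref = [1.0, 2.0, 3.0, 4.0]
pairs = [(ref, [2.0, 4.0, 6.0, 8.0]),   # perfectly correlated subset
         (ref, [4.0, 1.0, 3.0, 2.0])]   # poorly correlated subset
reliable, pending, coeffs = two_section_tracking(pairs)
print(reliable, pending)  # → [0] [1]
```

Real DIC subsets are 2-D image patches searched over displacements; the sketch only shows how thresholding splits the workload between the parallel and sequential sections.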

  10. Integration of rocket turbine design and analysis through computer graphics

    Hsu, Wayne; Boynton, Jim


    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  11. Computer analysis of thermal hydraulics for nuclear reactor safety

    This paper gives an overview of ANSTO's capability and recent research and development activities in thermal-hydraulic modelling for nuclear reactor safety analysis, particularly for our research reactor, HIFAR (High Flux Australian Reactor), and its intended replacement, the Replacement Research Reactor (RRR). Several tools contribute to ANSTO's capability in thermal-hydraulic modelling, including RELAP (developed in the US), a code for reactor system thermal-hydraulic analysis; CFS (developed in the UK), a general computational fluid dynamics code, which was used for thermal-hydraulic analysis of reactor fuel elements; and HIZAPP (developed at ANSTO), for coupling neutronics with thermal-hydraulics for reactor transient analysis

  12. Analysis of the computed tomography in the acute abdomen

    Introduction: This study aims to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal and prospective study, in which patients with a diagnosis of acute abdomen were analyzed. A total of 105 cases of acute abdomen were obtained, and after application of the exclusion criteria, 28 patients were included in the study. Results: Computed tomography changed the diagnostic hypothesis of the physicians in 50% of the cases (p < 0.05); 78.57% of the patients had surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). An accurate diagnosis by computed tomography, when compared with the anatomopathologic examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was done dividing the patients into surgical and nonsurgical groups, an accuracy of 89.28% was obtained (p < 0.0001). A difference of 7.2 days of hospitalization (p = 0.003) was found compared with the mean for acute abdomen cases managed without computed tomography. Conclusion: Computed tomography correlates with the anatomopathologic findings and has great accuracy in the surgical indication; it increases the physicians' diagnostic confidence, reduces hospitalization time, reduces the number of surgeries and is cost-effective. (author)

  13. Convergence Analysis of a Class of Computational Intelligence Approaches

    Junfeng Chen


    Computational intelligence approaches are a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Then, two quantification indices, the variation rate and the progress rate, are defined, respectively, to indicate the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution-set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  14. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Amann, Rupert P; Waberski, Dagmar


    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with an amazing reduction in the size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, the basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and processed data apparently are available. PMID:24274405

  15. Brain Computer Interface Enhancement by Independent Component Analysis

    Bobrov, P.; Frolov, A. A.; Húsek, Dušan

    Heidelberg: Springer, 2013 - (Kudělka, M.; Pokorný, J.; Snášel, V.; Abraham, A.), pp. 51-60. (Advances in Intelligent Systems and Computing. 179). ISBN 978-3-642-31602-9. ISSN 2194-5357. [IHCI 2011. International Conference on Intelligent Human Computer Interaction /3./. Prague (CZ), 29.08.2011-31.08.2011] R&D Projects: GA ČR GAP202/10/0262; GA ČR GA205/09/1079 Other grants: GA MŠk(CZ) ED1.1.00/02.0070 Institutional research plan: CEZ:AV0Z10300504 Keywords: brain computer interface * EEG pattern classification * independent component analysis * classification accuracy * mu-rhythm identification Subject RIV: IN - Informatics, Computer Science

  16. Computational mathematics models, methods, and analysis with Matlab and MPI

    White, Robert E


    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using the message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download. This book is not just about math, not just about computing, and not just about applications, but about all three: in other words, computational science. Whether us...

  17. Problem Solving Process Research of Everyone Involved in Innovation Based on CAI Technology

    Chen, Tao; Shao, Yunfei; Tang, Xiaowo

    It is very important that non-technical department personnel, especially bottom-line employees, serve as innovators under the requirement that everyone be involved in innovation. In the view of this paper, it is feasible and necessary to build an everyone-involved-in-innovation problem solving process under Total Innovation Management (TIM), based on the Theory of Inventive Problem Solving (TRIZ). The tools provided by CAI technology, the How-To mode and the science effects database, can be very useful for innovation by all employees, especially those in non-technical departments and on the bottom line. The problem solving process put forward in the paper focuses on non-technical department personnel, especially bottom-line employees, for innovation.

  18. The Clinical Experiences of Dr.CAI Gan in Treating Chronic Constipation

    ZHANG Zheng-li; ZHU Mei-ping; LIU Qun; LEI Yun-xia


    Prof. CAI Gan (蔡淦) is an academic leader in TCM treatment of spleen and stomach diseases. He insists that liver depression, spleen deficiency and poor nourishment of the intestines are the core of the pathogenesis of chronic constipation. Therefore he often treats the disease by strengthening the spleen, relieving the depressed liver, nourishing yin and moistening the intestines. Meanwhile he attaches great importance to syndrome differentiation and comprehensive regulation and treatment. As a result, good therapeutic effects are often achieved. The authors summarized his ways of treating chronic constipation in the following 10 methods, which are introduced below.

  19. Integrating computer programs for engineering analysis and design

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.


    A third-generation system for integrating computer programs for engineering analysis and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used as a repository for design data that are communicated between analysis programs, as a dictionary that describes these design data, as a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  20. Sentiment analysis and ontology engineering an environment of computational intelligence

    Chen, Shyi-Ming


    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  1. Practical computer analysis of switch mode power supplies

    Bennett, Johnny C


    When designing switch-mode power supplies (SMPSs), engineers need much more than simple "recipes" for analysis. Such plug-and-go instructions are not at all helpful for simulating larger and more complex circuits and systems. Offering more than merely a "cookbook," Practical Computer Analysis of Switch Mode Power Supplies provides a thorough understanding of the essential requirements for analyzing SMPS performance characteristics. It demonstrates the power of the circuit averaging technique when used with powerful computer circuit simulation programs. The book begins with SMPS fundamentals and the basics of circuit averaging models, reviewing most basic topologies and explaining all of their various modes of operation and control. The author then discusses the general analysis requirements of power supplies and how to develop the general types of SMPS models, demonstrating the use of SPICE for analysis. He examines the basic first-order analyses generally associated with SMPS performance along with more pra...

  2. Computer codes for safety analysis of Indian PHWRs

    Computer codes for safety analysis of PHWRs have been developed in India over the years. Some of the codes that have been developed in NPC are discussed in this paper. The computer codes THYNAC and ATMIKA have been developed in NPC for the analysis of LOCA scenarios. Both codes are based on the UVET model using three equations and slip correlations. Computer code ATMIKA is an improved version of code THYNAC with regard to numerics and flexibility in modelling. Apart from the thermal-hydraulic model, these codes also include a point neutron kinetics model. Codes COOLTMP and RCOMP are used to estimate heat-up of the primary coolant and core components, respectively, under off-normal shutdown conditions such as may exist during a special maintenance job or postulated failure. Code validations have been performed either against experiments, against the published results of experiments performed elsewhere, or through international benchmark exercises sponsored by the IAEA. The paper discusses these codes, their validations and salient applications

  3. Finite element dynamic analysis on CDC STAR-100 computer

    Noor, A. K.; Lambiotte, J. J., Jr.


    Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.
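The central difference explicit scheme named above can be illustrated on a single-degree-of-freedom undamped oscillator (m u'' + k u = 0); the parameters below are illustrative, not the shell model from the paper:

```python
# Central-difference explicit time integration for m*u'' + k*u = 0.
# Startup uses the standard fictitious point u(-dt) = u0 - dt*v0 + dt^2/2*a0.
from math import cos

def central_difference(m, k, u0, v0, dt, steps):
    a0 = -k * u0 / m                          # initial acceleration
    u_prev = u0 - dt * v0 + 0.5 * dt**2 * a0  # fictitious u at t = -dt
    u = u0
    history = [u0]
    for _ in range(steps):
        a = -k * u / m                        # acceleration at current step
        u_next = 2.0 * u - u_prev + dt**2 * a
        u_prev, u = u, u_next
        history.append(u)
    return history

m, k = 1.0, 4.0        # natural frequency w = 2 rad/s
dt, steps = 0.01, 100  # dt well below the stability limit 2/w
hist = central_difference(m, k, u0=1.0, v0=0.0, dt=dt, steps=steps)
# exact solution is u(t) = cos(w*t); report the error at t = 1.0
print(abs(hist[-1] - cos(2.0 * steps * dt)))
```

Being explicit, the scheme is conditionally stable (dt must stay below 2/omega_max), which is why implicit Newmark integration is the usual alternative for stiff finite element models.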

  4. Interactive computer code for dynamic and soil structure interaction analysis

    Mulliken, J.S.


    A new interactive computer code is presented in this paper for dynamic and soil-structure interaction (SSI) analyses. The computer program FETA (Finite Element Transient Analysis) is a self-contained interactive graphics environment for IBM PCs that is used for the development of structural and soil models as well as post-processing of dynamic analysis output. Full 3-D isometric views of the soil-structure system, animation of displacements, frequency and time domain responses at nodes, and response spectra are all graphically available simply by pointing and clicking with a mouse. FETA's finite element solver performs 2-D and 3-D frequency and time domain soil-structure interaction analyses. The solver can be directly accessed from the graphical interface on a PC, or run on a number of other computer platforms.

  5. Qualitative Research and Computer Analysis: New Challenges and Opportunities

    Yuen, AHK


    The use of computers for Qualitative Data Analysis (QDA) in qualitative research has been growing rapidly in the last decade. QDA programs are software packages developed explicitly for the purpose of analyzing qualitative data. A range of different kinds of program is available for the handling and analysis of qualitative data, such as Atlas/ti, HyperRESEARCH, and NUD*IST. With the development of new technologies, the QDA software has advanced from the efficient code-and-retrieve ability to ...

  6. Computer automated movement detection for the analysis of behavior

    Ramazani, Roseanna B.; Harish R Krishnan; BERGESON, SUSAN E.; Atkinson, Nigel S.


    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtractio...
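The digital subtraction step described above can be sketched as simple frame differencing; the frames here are plain nested lists of grayscale intensities and the threshold is an illustrative assumption (the real pipeline captured frames via dvgrab and ImageMagick):

```python
# Frame-differencing sketch: subtract consecutive grayscale frames and
# count pixels that changed by more than a threshold. Frames are nested
# lists of 0-255 intensities; values and threshold are illustrative.

def movement_score(frame_a, frame_b, threshold=20):
    """Fraction of pixels whose intensity changed by more than threshold."""
    changed = total = 0
    for row_a, row_b in zip(frame_a, frame_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            if abs(pa - pb) > threshold:
                changed += 1
    return changed / total

still = [[10, 10], [10, 10]]
moved = [[10, 200], [10, 10]]        # one of four pixels changed
print(movement_score(still, moved))  # → 0.25
```

Summing such scores over frames taken at regular intervals yields a locomotor activity trace without per-frame experimenter observation.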

  7. Computational Methods for the Analysis of Array Comparative Genomic Hybridization

    Raj Chari


    Array comparative genomic hybridization (array CGH) is a technique for assaying the copy number status of cancer genomes. The widespread use of this technology has led to a rapid accumulation of high-throughput data, which in turn has prompted the development of computational strategies for the analysis of array CGH data. Here we explain the principles behind array image processing, data visualization and genomic profile analysis, review currently available software packages, and raise considerations for future software development.

  8. Computational models for the nonlinear analysis of reinforced concrete plates

    Hinton, E.; Rahman, H. H. A.; Huq, M. M.


    A finite element computational model for the nonlinear analysis of reinforced concrete solid, stiffened and cellular plates is briefly outlined. Typically, Mindlin elements are used to model the plates whereas eccentric Timoshenko elements are adopted to represent the beams. The layering technique, common in the analysis of reinforced concrete flexural systems, is incorporated in the model. The proposed model provides an inexpensive and reasonably accurate approach which can be extended for use with voided plates.

  9. Numerical methods design, analysis, and computer implementation of algorithms

    Greenbaum, Anne


    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or c

  10. Computer system for environmental sample analysis and data storage and analysis

    A mini-computer based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system