WorldWideScience

Sample records for analysis cai computer

  1. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    Energy Technology Data Exchange (ETDEWEB)

    Handler, B.H. (Oak Ridge K-25 Site, TN (USA)); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. (Oak Ridge Associated Universities, Inc., TN (USA)); Hunnum, W.H. (North Carolina Univ., Chapel Hill, NC (USA)); Smith, D.L. (Memphis State Univ., TN (USA))

    1990-07-01

    Data Systems Engineering Organization (DSEO) personnel developed a prototype computer aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model, is comprised of related papers encompassing research on computer aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevance to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  2. Computer Assisted Instruction (CAI) in Language Teaching

    Institute of Scientific and Technical Information of China (English)

    Xin; Jing

    2015-01-01

    There are many ways to use computers for English language teaching. First of all, teachers can use them to prepare for classes. They can use a word processing program to write teaching materials and tests. They can use dictionaries, encyclopedias, etc., available on the computer as resources to help them prepare

  3. Curriculum planning and computer-assisted instruction (CAI) within clinical nursing education.

    OpenAIRE

    Perciful, E. G.

    1992-01-01

    Some experts in nursing and computers have stated that the integration of the computer within nursing education needs to be planned. It has also been declared that there is a need for a body of knowledge that describes the planning and implementing of CAI and the degree of success with the implementation of CAI within nursing education. There is a paucity of literature addressing the planning, implementing, and evaluation of CAI within clinical nursing education. The purpose of this paper is ...

  4. The Effect of the Computer Assisted Instruction (CAI) on Student Attitude in Mathematics Teaching of Primary School 8th Class and Views of Students towards CAI

    Directory of Open Access Journals (Sweden)

    Tuğba Hangül

    2010-12-01

    The aim of this study is to research the effect of teaching the subject of “Geometric Objects”, which is included in the eighth-grade mathematics curriculum, on student attitude using computer assisted instruction (CAI), and to find out grade 8 primary school students’ views about computer-assisted instruction. In this study a pre-post attitude design with an experimental and a control group was used. The research was done with control and experiment groups consisting of fifty-three eighth grade students who were randomly identified in the 2009-2010 school year. The attitude scale was administered to both groups before and at the end of teaching. The method of constructivism was applied to the control group while CAI was applied to the experiment group. After teaching, fourteen students who were randomly selected from the experimental group were interviewed. Quantitative data were analyzed using an Independent Samples t-test and qualitative data were analyzed by descriptive analysis. At the end of the study, the data showed that teaching through CAI improves students’ attitudes more positively than the method of constructivism, and that students have positive opinions on CAI.
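
    A minimal sketch of the Independent Samples t-test analysis this abstract describes, written in Python with SciPy; the post-treatment attitude scores below are hypothetical placeholders, not data from the study.

    ```python
    # Sketch of the Independent Samples t-test step described above.
    # The attitude scores are hypothetical placeholders, not study data.
    from scipy import stats

    cai_group = [78, 85, 80, 90, 74, 88, 82, 79, 91, 84]            # CAI
    constructivism_group = [70, 72, 75, 68, 77, 71, 74, 69, 73, 76]  # control

    t_stat, p_value = stats.ttest_ind(cai_group, constructivism_group)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Group attitudes differ significantly.")
    ```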

  5. An investigative study into the effectiveness of using computer-aided instruction (CAI) as a laboratory component of college-level biology: A case study

    Science.gov (United States)

    Barrett, Joan Beverly

    Community colleges serve the most diverse student populations in higher education. They consist of non-traditional, part-time, older, intermittent, and mobile students of different races, ethnic backgrounds, language preferences, physical and mental abilities, and learning style preferences. Students who are academically challenged may have diverse learning characteristics that are not compatible with the more traditional approaches to the delivery of instruction. With this need come new ways of solving the dilemma, such as Computer-aided Instruction (CAI). This case study investigated the use of CAI as a laboratory component of college-level biology in a small, rural community college setting. The intent was to begin to fill a void that seems to exist in the literature regarding the role of the faculty in the development and use of CAI. In particular, the investigator was seeking to understand the practice and its effectiveness, especially in helping the underprepared student. The case study approach was chosen to examine a specific phenomenon within a single institution. Ethnographic techniques, such as interviewing, documentary analysis, life experiences, and participant observations were used to collect data about the phenomena being studied. Results showed that the faculty was primarily self-motivated and self-taught in their use of CAI as a teaching and learning tool. The importance of faculty leadership and collegiality was evident. Findings showed the faculty confident that expectations of helping students who have difficulties with mathematical concepts had been met and that CAI was becoming a most valuable learning tool. In a traditional college classroom, or practice, time is the constant (semesters) and competence is the variable. In the CAI laboratory, time became the variable and competence the constant. The use of CAI also eliminated hazardous chemicals that were routinely used in the more traditional lab. Outcomes showed that annual savings

  6. The Effect of the Computer Assisted Instruction (CAI) on Student Attitude in Mathematics Teaching of Primary School 8th Class and Views of Students towards CAI

    OpenAIRE

    Tuğba Hangül; Devrim Uzel

    2010-01-01

    The aim of this study is to research the effect of the subject of “Geometric Objects” which is included in mathematics curriculum at the eighth grade on the student attitude using computer assisted instruction (CAI) and find out grade 8 primary school students’ views about the computer-assisted instruction. In this study the pre-post attitude with experimental control group design was performed. The research was done under control and experiment groups consisting of fifty-three eighth grade s...

  7. THERMAL HISTORY OF THE CARNIC ALPS (NE ITALY-S. AUSTRIA) USING CAI ANALYSIS

    Directory of Open Access Journals (Sweden)

    MONICA PONDRELLI

    2002-11-01

    Thermal patterns of an area which underwent a polyphase deformation history, such as the Carnic Alps, were analyzed using the Colour Alteration Index (CAI) of conodonts in order to constrain some aspects of the metamorphic history of this part of the Southern Alps. Hercynian and alpine tectonothermal events were distinguished using CAI analysis. The Hercynian event developed temperatures up to low-grade metamorphic conditions. Alpine tectonogenesis did not produce thermal levels in excess of the diagenetic zone. Moreover, CAI patterns allow recognition and evaluation of a hydrothermal metamorphic overprint of Permo-Triassic or Oligocene age that was superimposed on the pre-existing regional metamorphic zonation.

  8. Personality preference influences medical student use of specific computer-aided instruction (CAI)

    Directory of Open Access Journals (Sweden)

    Halsey Martha

    2006-02-01

    Background: The objective of this study was to test the hypothesis that personality preference, which can be related to learning style, influences individual utilization of CAI applications developed specifically for the undergraduate medical curriculum. Methods: Personality preferences of students were obtained using the Myers-Briggs Type Indicator (MBTI) test. CAI utilization for individual students was collected from entry logs for two different web-based applications (a discussion forum and a tutorial) used in the basic science course on human anatomy. Individual login data were sorted by personality preference and the data statistically analyzed by 2-way mixed ANOVA and correlation. Results: There was a wide discrepancy in the level and pattern of student use of both CAI applications. Although individual use of both applications was positively correlated irrespective of MBTI preference, students with a "Sensing" preference tended to use both CAI applications more than the "iNtuitives". Differences in the level of use of these CAI applications (i.e., higher use of the discussion forum vs. the tutorial) were also found for the "Perceiving/Judging" dimension. Conclusion: We conclude that personality/learning preferences of individual students influence their use of CAI in the medical curriculum.
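
    A sketch of the correlation step this abstract describes: login counts for the two CAI applications, correlated across students and summarized by MBTI preference. All counts and groupings below are hypothetical illustrations, not the study's data.

    ```python
    # Correlate per-student logins to the two CAI applications and
    # summarize usage by MBTI preference. All counts are hypothetical.
    from collections import defaultdict
    from scipy import stats

    # (MBTI preference, forum logins, tutorial logins) per student
    records = [
        ("Sensing", 42, 35), ("Sensing", 51, 40), ("Sensing", 38, 30),
        ("iNtuitive", 20, 18), ("iNtuitive", 25, 22), ("iNtuitive", 15, 12),
    ]

    forum = [r[1] for r in records]
    tutorial = [r[2] for r in records]
    r, p = stats.pearsonr(forum, tutorial)
    print(f"forum vs. tutorial use: r = {r:.2f}, p = {p:.4f}")

    totals = defaultdict(list)
    for pref, f, t in records:
        totals[pref].append(f + t)
    for pref, values in totals.items():
        print(f"{pref}: mean total logins = {sum(values) / len(values):.1f}")
    ```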

  9. The Vibrio cholerae quorum-sensing autoinducer CAI-1: analysis of the biosynthetic enzyme CqsA

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, R.; Bolitho, M; Higgins, D; Lu, W; Ng, W; Jeffrey, P; Rabinowitz, J; Semmelhack, M; Hughson, F; Bassler, B

    2009-01-01

    Vibrio cholerae, the bacterium that causes the disease cholera, controls virulence factor production and biofilm development in response to two extracellular quorum-sensing molecules, called autoinducers. The strongest autoinducer, called CAI-1 (for cholera autoinducer-1), was previously identified as (S)-3-hydroxytridecan-4-one. Biosynthesis of CAI-1 requires the enzyme CqsA. Here, we determine the CqsA reaction mechanism, identify the CqsA substrates as (S)-2-aminobutyrate and decanoyl coenzyme A, and demonstrate that the product of the reaction is 3-aminotridecan-4-one, dubbed amino-CAI-1. CqsA produces amino-CAI-1 by a pyridoxal phosphate-dependent acyl-CoA transferase reaction. Amino-CAI-1 is converted to CAI-1 in a subsequent step via a CqsA-independent mechanism. Consistent with this, we find cells release ≥100 times more CAI-1 than amino-CAI-1. Nonetheless, V. cholerae responds to amino-CAI-1 as well as CAI-1, whereas other CAI-1 variants do not elicit a quorum-sensing response. Thus, both CAI-1 and amino-CAI-1 have potential as lead molecules in the development of an anticholera treatment.

  10. CAI多媒體教學軟體之開發模式 Using an Instructional Design Model for Developing a Multimedia CAI Courseware

    OpenAIRE

    Hsin-Yih Shyu

    1995-01-01

    This article outlined a systematic instructional design model for developing a multimedia computer-aided instruction (CAI) courseware. The model illustrated roles and tasks as two dimensions necessary in a CAI production teamwork. Four major components (Analysis, Design, Development, and Revision/Evaluation), followed by a total of 25 steps, are provided. Eight roles, each with its competent skills, were identified. The model will be useful in serving as a framework for developing a multimedia CAI cours...

  11. CAI多媒體教學軟體之開發模式 Using an Instructional Design Model for Developing a Multimedia CAI Courseware

    Directory of Open Access Journals (Sweden)

    Hsin-Yih Shyu

    1995-09-01

    This article outlined a systematic instructional design model for developing a multimedia computer-aided instruction (CAI) courseware. The model illustrated roles and tasks as two dimensions necessary in a CAI production teamwork. Four major components (Analysis, Design, Development, and Revision/Evaluation), followed by a total of 25 steps, are provided. Eight roles, each with its competent skills, were identified. The model will be useful in serving as a framework for developing a multimedia CAI courseware for educators, instructional designers and CAI industry developers.

  12. In Situ Trace Element Analysis of an Allende Type B1 CAI: EK-459-5-1

    Science.gov (United States)

    Jeffcoat, C. R.; Kerekgyarto, A.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2014-01-01

    Variations in refractory major and trace element composition of calcium, aluminum-rich inclusions (CAIs) provide constraints on physical and chemical conditions and processes in the earliest stages of the Solar System. Previous work indicates that CAIs have experienced complex histories involving, in many cases, multiple episodes of condensation, evaporation, and partial melting. We have analyzed major and trace element abundances in two core to rim transects of the melilite mantle as well as interior major phases of a Type B1 CAI (EK-459-5-1) from Allende by electron probe micro-analyzer (EPMA) and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) to investigate the behavior of key trace elements with a primary focus on the REEs Tm and Yb.

  13. 電腦輔助教學與個別教學結合: 電腦輔助教學課堂應用初探 Computer-Assisted Instruction Under the Management of Individualized Instruction: A Classroom Management Approach of CAI

    Directory of Open Access Journals (Sweden)

    Sunny S. J. Lin

    1988-03-01

    This article first reviews the development of Computer-Assisted Instruction (CAI) in Taiwan. The study describes the training of teachers from different levels of schools to design CAI coursewares, and the planning of a CAI courseware bank possessing 2,000 supplemental coursewares. A classroom application system for CAI should be carefully established to prevent the easy misuse of a CAI courseware as an entire instructional plan. The study also suggests that CAI in elementary and secondary education could rely on mastery learning as the instructional plan; in this case, CAI must limit its role to formative tests and remedial material only. In higher education, Keller's Personalized System of Instruction could be an effective classroom management system; there, CAI will offer study guides and formative tests only. Using these two instructional systems may enhance students' achievement and speed up the learning rate at the same time. Combining individualized instruction and CAI will be one of the most workable approaches in the current classroom. The author plans an experiment to verify their effectiveness and efficiency in the near future.

  14. Computer Series, 25.

    Science.gov (United States)

    Moore, John W., Ed.

    1982-01-01

    Nine computer programs (available from the authors) are described, including graphic display of molecular structures from crystallographic data, computer assisted instruction (CAI) with MATH subroutine, CAI preparation-for-chemistry course, calculation of statistical thermodynamic properties, qualitative analysis program, automated conductimetric…

  15. The Relevance of AI Research to CAI.

    Science.gov (United States)

    Kearsley, Greg P.

    This article provides a tutorial introduction to Artificial Intelligence (AI) research for those involved in Computer Assisted Instruction (CAI). The general theme is that much of the current work in AI, particularly in the areas of natural language understanding systems, rule induction, programming languages, and socratic systems, has important…

  16. Natural gas diffusion model and diffusion computation in well Cai25 Bashan Group oil and gas reservoir

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Natural gas diffusion through the cap rock is mainly by means of dissolving in water, so its concentration can be replaced by solubility, which varies with temperature, pressure and salinity in strata. Under certain geological conditions the maximal solubility is definite, so the diffusion computation can be handled approximately by a steady-state equation. Furthermore, on the basis of the restoration of the paleo-burial history, the diffusion is calculated with the dynamic method, and the result is very close to the real diffusion value in the geological history.
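
    A minimal sketch of the steady-state approximation this abstract describes: with concentration replaced by solubility and a linear profile across the cap rock, Fick's first law gives the diffusive flux directly. All parameter values below are hypothetical illustrations, not data from the Cai25 study.

    ```python
    # Steady-state diffusive loss through a cap rock, per the approximation
    # described above (concentration replaced by solubility). All values
    # are hypothetical, not data from the Cai25 study.

    def steady_state_flux(d_eff, c_top, c_base, thickness):
        """Fick's first law, J = D * dC/dz, for a linear concentration profile."""
        return d_eff * (c_base - c_top) / thickness

    d_eff = 1e-10      # effective diffusion coefficient, m^2/s
    c_base = 2.5       # gas solubility at the reservoir top, kg/m^3
    c_top = 0.0        # concentration above the cap rock, kg/m^3
    thickness = 200.0  # cap rock thickness, m

    flux = steady_state_flux(d_eff, c_top, c_base, thickness)  # kg/(m^2*s)
    seconds_per_ma = 3.15e13
    print(f"loss per square meter per Ma: {flux * seconds_per_ma:.1f} kg")
    ```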

  17. Natural gas diffusion model and diffusion computation in well Cai25 Bashan Group oil and gas reservoir

    Institute of Scientific and Technical Information of China (English)

    FANG Dequan

    2001-01-01


  18. Maxi CAI with a Micro.

    Science.gov (United States)

    Gerhold, George; And Others

    This paper describes an effective microprocessor-based CAI system which has been repeatedly tested by a large number of students and edited accordingly. Tasks not suitable for microprocessor-based systems (authoring, testing, and debugging) were handled on larger multi-terminal systems. This approach requires that the CAI language used on the…

  19. A study on VR-based mutual adaptive CAI system for nuclear power plant

    International Nuclear Information System (INIS)

    A novel framework of human-computer interaction for a computer aided instruction (CAI) system is presented, which aims at introducing a new off-the-job training environment for mastering nuclear power plant monitoring skills in a more user-friendly manner than at present. The framework is based on the following two new ideas: one is the mutual adaptive interface (MADI) concept, and the other is virtual reality (VR). In order to realize a hardware mechanism of mutual adaptive interface based on VR, a new head-mounted display (HMD) was developed which can not only provide the user with a virtual environment in the conventional way but also detect images of the user's eyes for in-situ analysis of various ocular information. This information is expected to be utilized for realizing advanced human-computer interaction in the CAI system

  20. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fields.

  1. Computer Programming Job Analysis

    OpenAIRE

    Debdulal Dutta Roy

    2002-01-01

    This study investigated the relative uses of computer programming job characteristics across different organizations and the effects of different demographic variables on job analysis ratings. Data were collected from 201 computer programmers of 6 different organizations through a checklist. Principal component analysis noted four most-used job characteristics: program writing and testing, human relations, data analysis and user satisfaction. Of them only data analysis differed among different organ...

  2. Predicting low velocity impact damage and Compression-After-Impact (CAI) behaviour of composite laminates

    OpenAIRE

    Tan, Wei; Falzon, Brian G.; Chiu, Louis N S; Price, Mark

    2015-01-01

    Low-velocity impact damage can drastically reduce the residual strength of a composite structure even when the damage is barely visible. The ability to computationally predict the extent of damage and compression-after-impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant time and cost penalties. A high-fidelity three-dimensional composite damage model, to predict both low-velocity impact damage and CAI st...

  3. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today in this intensely interdisciplinary field. … well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering … music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.

  4. Computer aided safety analysis

    International Nuclear Information System (INIS)

    The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs

  5. CAIs in Semarkona (LL3.0)

    Science.gov (United States)

    Mishra, R. K.; Simon, J. I.; Ross, D. K.; Marhas, K. K.

    2016-01-01

    Calcium, Aluminum-rich inclusions (CAIs) are the first-forming solids of the Solar System. Their observed abundance, mean size, and mineralogy vary quite significantly between different groups of chondrites. These differences may reflect the dynamics and distinct cosmochemical conditions present in the region(s) of the protoplanetary disk from which each type likely accreted. Only about 11 such objects have been found in L and LL types, while another 57 have been found in H type ordinary chondrites, compared to thousands in carbonaceous chondrites. At issue is whether the rare CAIs contained in ordinary chondrites truly reflect a distinct population from the inclusions commonly found in other chondrite types. Semarkona (LL3.00) (fall, 691 g) is the most pristine chondrite available in our meteorite collections. Here we report the petrography and mineralogy of 3 CAIs from Semarkona.

  6. Computational Analysis of Behavior.

    Science.gov (United States)

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with. PMID:27090952

  7. Computational Music Analysis

    OpenAIRE

    2016-01-01

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today in this intensely interdisciplinary field. A broad range of approaches are presented, employing techniques originating in disciplines such as linguistics, information theory, information retrieval, pattern r...

  8. A Unified Framework for Producing CAI Melting, Wark-Lovering Rims and Bowl-Shaped CAIs

    CERN Document Server

    Liffman, Kurt; Paterson, David A

    2016-01-01

    Calcium Aluminium Inclusions (CAIs) formed in the Solar System some 4,567 million years ago. CAIs are almost always surrounded by Wark-Lovering Rims (WLRs), which are a sequence of thin, mono/bi-mineralic layers of refractory minerals, with a total thickness in the range of 1 to 100 microns. Recently, some CAIs have been found that have tektite-like bowl-shapes. To form such shapes, the CAI must have travelled through a rarefied gas at hypersonic speeds. We show how CAIs may have been ejected from the inner solar accretion disc via the centrifugal interaction between the solar magnetosphere and the inner disc rim. They subsequently punched through the hot, inner disc rim wall at hypersonic speeds. This re-entry heating partially or completely evaporated the CAIs. Such evaporation could have significantly increased the metal abundances of the inner disc rim. High speed movement through the inner disc produced WLRs. To match the observed thickness of WLRs required metal abundances at the inner disc wall that a...

  9. Shielding Benchmark Computational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-09-17

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC).

  10. A Study on Application of CAI Dynamic Image-Guided Method in College Physical Education Technical Course

    OpenAIRE

    Baokui Wang

    2013-01-01

    This study examines the application of the CAI dynamic image-guided method in a college physical education technical course. In college physical education teaching, the Computer-Assisted Instruction (CAI) dynamic image-guided method is employed to build sport image diagnosis and implement a 2-way feedback mechanism. This helps the students to create or modify the sport image, and strengthens the concept of action so as to set up the correct technical dynamic stereotype. The practice o...

  11. A Pseudo-Language for Creating CAI Programs on APL Systems

    Science.gov (United States)

    Gucker, Edward J.

    1973-01-01

    Encourages the use of APL as a language for computer assisted instruction (CAI) instead of such languages as BASIC or COURSEWRITER. Describes a set of APL functions that can simulate to some extent the features of COURSEWRITER, while permitting a more experienced course author to use the full mathematical power of APL. (Author/JF)

  12. Using CAI To Enhance the Peer Acceptance of Mainstreamed Students with Mild Disabilities.

    Science.gov (United States)

    Culliver, Concetta; Obi, Sunday

    This study applied computer-assisted instruction (CAI) techniques to improve peer acceptance among 92 mainstreamed students with mild disabilities from 10 to 13 years of age. Participants in the treatment group received their generalized curriculum program (including mathematics, language arts, reading, health, social studies, and science)…

  13. Web Pages: An Effective Method of Providing CAI Resource Material in Histology.

    Science.gov (United States)

    McLean, Michelle

    2001-01-01

    Presents research that introduces computer-aided instruction (CAI) resource material as an integral part of the second-year histology course at the University of Natal Medical School. Describes the ease with which this software can be developed, using limited resources and available skills, while providing students with valuable learning…

  14. Why igneous wollastonite is so rare in CAIs

    OpenAIRE

    Beckett, J. R.; Thrane, K.; Krot, A. N.

    2008-01-01

    Primary wollastonite (wo), thought to have crystallized from a liquid, is quite rare in CAIs, having been reported in only two igneous inclusions, White Angel and KT-1 [1, 2]. Both of these CAIs exhibit significant mass fractionations in multiple elements, and KT-1 is a FUN inclusion, so it is highly desirable to place as many constraints as possible on their formation. Since phase diagrams previously developed for CAIs do not involve wo [3], we use literature data on wo-satura...

  15. E-CAI: a novel server to estimate an expected value of Codon Adaptation Index (eCAI)

    Directory of Open Access Journals (Sweden)

    Garcia-Vallvé Santiago

    2008-01-01

    Background: The Codon Adaptation Index (CAI) is a measure of the synonymous codon usage bias for a DNA or RNA sequence. It quantifies the similarity between the synonymous codon usage of a gene and the synonymous codon frequency of a reference set. Extreme values in the nucleotide or in the amino acid composition have a large impact on differential preference for synonymous codons. It is therefore essential to define the limits for the expected value of CAI on the basis of sequence composition in order to properly interpret the CAI and provide statistical support to CAI analyses. Though several freely available programs calculate the CAI for a given DNA sequence, none of them corrects for compositional biases or provides confidence intervals for CAI values. Results: The E-CAI server, available at http://genomes.urv.es/CAIcal/E-CAI, is a web application that calculates an expected value of CAI for a set of query sequences by generating random sequences with G+C and amino acid content similar to those of the input. An executable file, a tutorial, a Frequently Asked Questions (FAQ) section and several examples are also available. To exemplify the use of the E-CAI server, we have analysed the codon adaptation of human mitochondrial genes that encode a subunit of the mitochondrial respiratory chain (excluding those genes that lack a prokaryotic orthologue and are encoded in the nuclear genome). It is assumed that these genes were transferred from the proto-mitochondrial to the nuclear genome and that their codon usage was then ameliorated. Conclusion: The E-CAI server provides a direct threshold value for discerning whether the differences in CAI are statistically significant or whether they are merely artifacts that arise from internal biases in the G+C composition and/or amino acid composition of the query sequences.
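
    For orientation, a minimal sketch of the standard CAI calculation (Sharp & Li, 1987) that E-CAI builds on: the geometric mean of each codon's relative adaptiveness w, where w is the codon's frequency divided by the frequency of its most-used synonym in the reference set. The tiny w table below is an illustrative stand-in, not a real reference set.

    ```python
    # Standard CAI: geometric mean of relative adaptiveness w over the
    # codons of a sequence. The w values here are made-up illustrations.
    import math

    w = {
        "GCU": 1.00, "GCC": 0.45, "GCA": 0.30, "GCG": 0.15,  # Ala family
        "GAA": 1.00, "GAG": 0.40,                            # Glu family
    }

    def cai(sequence):
        codons = [sequence[i:i + 3] for i in range(0, len(sequence) - 2, 3)]
        logs = [math.log(w[c]) for c in codons if c in w]
        return math.exp(sum(logs) / len(logs)) if logs else float("nan")

    print(f"CAI = {cai('GCUGAAGCCGAG'):.3f}")  # geometric mean of 4 codons
    ```

    E-CAI's contribution, per the abstract, is the expected value of this statistic under randomized sequences of matched G+C and amino acid composition, which the simple calculation above does not provide.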

  16. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    This research thesis aims to identify methods of syntax analysis that can be used for computer programming languages, while setting aside the computer hardware that influences the choice of programming language and the methods of analysis and compilation. In the first part, the author proposes attempts at formalizing Chomsky grammar languages. In the second part, he studies analytical grammars, and then a compiler or analytic grammar for the Fortran language

  17. Effective Computer Aided Instruction in Biomedical Science

    OpenAIRE

    Hause, Lawrence L.

    1985-01-01

    A menu-driven Computer Aided Instruction (CAI) package was integrated with word processing and effectively applied in five curricula at the Medical College of Wisconsin. Integration with word processing facilitates the ease of CAI development by instructors and was found to be the most important step in the development of CAI. CAI modules were developed and are currently used to reinforce lectures in medical pathology, laboratory quality control, computer programming and basic science reviews...

  18. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies. · Provides techniques for modeling and analysis of network software and switching equipment; · Discusses design options used to build efficient switching equipment; · Includes many worked examples of the application of discrete-time Markov chains to communication systems; · Covers the mathematical theory and techniques necessary for ana...

  19. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect, including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news

  20. Development of an intelligent CAI system for a distributed processing environment

    International Nuclear Information System (INIS)

    In order to operate a nuclear power plant optimally in both normal and abnormal situations, the operators are trained using an operator training simulator in addition to classroom instruction. Individual instruction using a CAI (Computer-Assisted Instruction) system has become popular as a method of learning plant information, such as plant dynamics, operational procedures, plant systems, plant facilities, etc. The outline is given of a proposed network-based intelligent CAI system (ICAI) incorporating multimedia PWR plant dynamics simulation, teaching aids and educational record management, using the following environment: existing standard workstations and graphic workstations with a live video processing function, the TCP/IP protocol of Unix through Ethernet, and the X window system. (Z.S.) 3 figs., 2 refs

  1. Research on the Use of Computer-Assisted Instruction.

    Science.gov (United States)

    Craft, C. O.

    1982-01-01

    Reviews recent research studies related to computer assisted instruction (CAI). The studies concerned program effectiveness, teaching of psychomotor skills, tool availability, and factors affecting the adoption of CAI. (CT)

  2. Computer vision in microstructural analysis

    Science.gov (United States)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced high-school and beginning college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by the description of the system developed at Texas A&M University. This in turn will be followed by the description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.

  3. Produktový mix firmy Pekařství Cais (The Product Mix of the Pekařství Cais Bakery)

    OpenAIRE

    NOVÁKOVÁ, Iveta

    2011-01-01

    The aim of my thesis was to describe the product mix of a chosen company. I chose the bakery Vladimír Cais in Vlachovo Březí for this work. Another aim was to analyze the product portfolio by means of the Boston Matrix and to propose possible modifications of the product portfolio based on the results. A SWOT analysis and a product life cycle analysis were also compiled within the analytic part.

  4. Computer aided safety analysis 1989

    International Nuclear Information System (INIS)

    The meeting was conducted in a workshop style, to encourage involvement of all participants during the discussions. Forty-five (45) experts from 19 countries, plus 22 experts from the GDR participated in the meeting. A list of participants can be found at the end of this volume. Forty-two (42) papers were presented and discussed during the meeting. Additionally an open discussion was held on the possible directions of the IAEA programme on Computer Aided Safety Analysis. A summary of the conclusions of these discussions is presented in the publication. The remainder of this proceedings volume comprises the transcript of selected technical papers (22) presented in the meeting. It is the intention of the IAEA that the publication of these proceedings will extend the benefits of the discussions held during the meeting to a larger audience throughout the world. The Technical Committee/Workshop on Computer Aided Safety Analysis was organized by the IAEA in cooperation with the National Board for Safety and Radiological Protection (SAAS) of the German Democratic Republic in Berlin. The purpose of the meeting was to provide an opportunity for discussions on experiences in the use of computer codes used for safety analysis of nuclear power plants. In particular it was intended to provide a forum for exchange of information among experts using computer codes for safety analysis under the Technical Cooperation Programme on Safety of WWER Type Reactors (RER/9/004) and other experts throughout the world. A separate abstract was prepared for each of the 22 selected papers. Refs, figs tabs and pictures

  5. Computational analysis of cerebral cortex

    Energy Technology Data Exchange (ETDEWEB)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)

    2010-08-15

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  6. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Geostatistics identifies the spatial structure of variables representing several phenomena and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, averages, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil Carbon and Nitrogen was used to validate the system. The system was useful for the geostatistical analysis process and for the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a major degree of interaction, functionality rarely available in similar programs. Given its characteristic of rapid prototyping and simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
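
    A minimal sketch of one tool the abstract names, the empirical (classical) semivariogram: half the mean squared difference between point pairs, binned by separation distance. The sample points below are hypothetical, not the published soil Carbon/Nitrogen dataset.

    ```python
    # Empirical semivariogram: gamma(h) = mean squared difference / 2 for
    # point pairs binned by lag distance. Sample points are hypothetical.
    import math

    points = [(0, 0, 1.2), (1, 0, 1.5), (2, 0, 2.1),
              (0, 1, 1.1), (1, 1, 1.7), (2, 1, 2.3)]  # (x, y, value)

    def semivariogram(pts, bin_width=1.2, n_bins=2):
        sums = [0.0] * n_bins
        counts = [0] * n_bins
        for i in range(len(pts)):
            for j in range(i + 1, len(pts)):
                x1, y1, v1 = pts[i]
                x2, y2, v2 = pts[j]
                h = math.hypot(x1 - x2, y1 - y2)
                b = int(h / bin_width)
                if b < n_bins:
                    sums[b] += (v1 - v2) ** 2
                    counts[b] += 1
        return [0.5 * s / c if c else float("nan")
                for s, c in zip(sums, counts)]

    for k, g in enumerate(semivariogram(points)):
        print(f"lag bin {k}: gamma = {g:.3f}")
    ```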

  7. Forensic Analysis of Compromised Computers

    Science.gov (United States)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
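
    A Python sketch analogous to the PERL script described above (the original script is not reproduced here): walk a directory tree to a chosen depth, skip directories themselves, and write per-file ownership and timestamp data as tab-separated text suitable for spreadsheet import.

    ```python
    # Hypothetical Python analog of the described PERL script: collect
    # file metadata (not directory metadata) down a directory tree and
    # write it as tab-separated text for spreadsheet analysis.
    import os
    import sys

    def dump_tree(root, out_path, max_depth):
        base_depth = root.rstrip(os.sep).count(os.sep)
        with open(out_path, "w") as out:
            out.write("path\tsize\tuid\tgid\tmtime\tatime\n")
            for dirpath, dirnames, filenames in os.walk(root):
                if dirpath.count(os.sep) - base_depth >= max_depth:
                    dirnames[:] = []  # stop descending past max_depth
                    continue
                for name in filenames:  # directories are skipped
                    p = os.path.join(dirpath, name)
                    st = os.lstat(p)
                    out.write(f"{p}\t{st.st_size}\t{st.st_uid}\t"
                              f"{st.st_gid}\t{st.st_mtime}\t{st.st_atime}\n")

    if __name__ == "__main__":
        # usage: python dump_tree.py <root> <output.txt> <levels>
        dump_tree(sys.argv[1], sys.argv[2], int(sys.argv[3]))
    ```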

  8. Computability and Analysis, a Historical Approach

    OpenAIRE

    Brattka, Vasco

    2016-01-01

    The history of computability theory and the history of analysis have been surprisingly intertwined since the beginning of the twentieth century. For one, Émile Borel discussed his ideas on computable real number functions in his introduction to measure theory. On the other hand, Alan Turing had computable real numbers in mind when he introduced his now famous machine model. Here we want to focus on a particular aspect of computability and analysis, namely on computability properties of theorem...

  9. A Petaflops Era Computing Analysis

    Science.gov (United States)

    Preston, Frank S.

    1998-01-01

    This report covers a study of the potential for petaflops (10^15 floating point operations per second) computing. This study was performed within the year 1996 and should be considered as the first step in an on-going effort. The analysis concludes that a petaflop system is technically feasible but not feasible with today's state-of-the-art. Since the computer arena is now a commodity business, most experts expect that a petaflops system will evolve from current technology in an evolutionary fashion. To meet the price expectations of users waiting for petaflop performance, great improvements in lowering component costs will be required. Lower power consumption is also a must. The present rate of progress in improved performance places the date of introduction of petaflop systems at about 2010. Several years before that date, it is projected that the resolution of chips will reach the currently known resolution limit. Aside from the economic problems and constraints, software is identified as the major problem. The tone of this initial study is more pessimistic than most of the published material available on petaflop systems. Workers in the field are expected to generate more data which could serve to provide a basis for a more informed projection. This report includes an annotated bibliography.

  10. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
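
    A sketch of the solution-vector scheme this abstract describes: components ordered by inlet dependency are updated in sequence each time step, with each outlet feeding the next inlet. The component models and names below are toy placeholders, not PCTAP's actual physics.

    ```python
    # Toy illustration of a dependency-ordered solution vector stepped
    # through time, each outlet feeding the next component's inlet.

    class Component:
        def __init__(self, name, heat_rate):
            self.name, self.heat_rate, self.outlet_temp = name, heat_rate, None

        def update(self, inlet_temp, dt):
            # Placeholder model: inlet temperature plus a fixed heat term.
            self.outlet_temp = inlet_temp + self.heat_rate * dt

    # Solution vector: each component's inlet depends on the previous one.
    solution_vector = [Component("cold_plate", 0.5),
                       Component("tube_1", 0.1),
                       Component("heat_exchanger", -0.8)]

    temp, dt = 20.0, 1.0
    for step in range(3):
        for comp in solution_vector:   # execute each outlet function in order
            comp.update(temp, dt)
            temp = comp.outlet_temp    # outlet becomes the next inlet
        print(f"step {step}:",
              [round(c.outlet_temp, 2) for c in solution_vector])
    ```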

  11. Phenotypic diversity and correlation between white-opaque switching and the CAI microsatellite locus in Candida albicans.

    Science.gov (United States)

    Hu, Jian; Guan, Guobo; Dai, Yu; Tao, Li; Zhang, Jianzhong; Li, Houmin; Huang, Guanghua

    2016-08-01

    Candida albicans is a commensal fungal pathogen that is often found as part of the human microbial flora. The aim of the present study was to establish a relationship between diverse genotypes and phenotypes of clinical isolates of C. albicans. A total of 231 clinical isolates were collected and used for genotyping and phenotypic switching analysis. Based on the microsatellite locus (CAI) genotyping assay, 65 different genotypes were identified, and some dominant types were found in certain human niches. For example, the genotypes 30-44 and 30-45 were enriched in vaginal infection samples. C. albicans has a number of morphological forms including the single-celled yeasts, multicellular filaments, and white and opaque cell types. The relationship between the CAI genotype and the ability to undergo phenotypic switching was examined in the clinical isolates. We found that the strains with longer CAA/G repeats in both alleles of the CAI locus were more opaque-competent. We also discovered that some MTL heterozygous (a/alpha) isolates could undergo white-opaque switching when grown on regular culture medium (containing glucose as the sole carbon source). Our study establishes a link between phenotypic switching and genotypes of the CAI microsatellite locus in clinical isolates of C. albicans. PMID:26832141

  12. Relationship between Pre-Service Music Teachers' Personality and Motivation for Computer-Assisted Instruction

    Science.gov (United States)

    Perkmen, Serkan; Cevik, Beste

    2010-01-01

    The main purpose of this study was to examine the relationship between pre-service music teachers' personalities and their motivation for computer-assisted music instruction (CAI). The "Big Five" Model of Personality served as the framework. Participants were 83 pre-service music teachers in Turkey. Correlation analysis revealed that three…

  13. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  14. Interactive computer programs in sequence data analysis.

    OpenAIRE

    Jagadeeswaran, P; McGuire, P M

    1982-01-01

    We present interactive computer programs for the analysis of nucleic acid sequences. Minimal computer experience is sufficient to handle these programs. The nucleotide sequence of the human gamma globin gene complex is used as an example to illustrate the data analysis.
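
    A small example of the kind of sequence analysis described: base composition and a simple motif search. The sequence fragment and motif below are hypothetical stand-ins, not the gamma globin data used in the paper.

    ```python
    # Base composition and motif search on a hypothetical DNA fragment.
    from collections import Counter

    seq = "ATGGTGCACCTGACTCCTGAGGAGAAGTCTGCC"

    counts = Counter(seq)
    gc = 100.0 * (counts["G"] + counts["C"]) / len(seq)
    print("base counts:", dict(counts))
    print(f"G+C content: {gc:.1f}%")

    motif = "GAGGAG"  # illustrative motif, not a real enzyme site
    print(f"motif {motif} found at position: {seq.find(motif)}")
    ```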

  15. Silicon Isotopic Fractionation of CAI-like Vacuum Evaporation Residues

    Energy Technology Data Exchange (ETDEWEB)

    Knight, K; Kita, N; Mendybaev, R; Richter, F; Davis, A; Valley, J

    2009-06-18

    Calcium-, aluminum-rich inclusions (CAIs) are often enriched in the heavy isotopes of magnesium and silicon relative to bulk solar system materials. It is likely that these isotopic enrichments resulted from evaporative mass loss of magnesium and silicon from early solar system condensates while they were molten during one or more high-temperature reheating events. Quantitative interpretation of these enrichments requires laboratory determinations of the evaporation kinetics and associated isotopic fractionation effects for these elements. The experimental data for the kinetics of evaporation of magnesium and silicon and the evaporative isotopic fractionation of magnesium is reasonably complete for Type B CAI liquids (Richter et al., 2002, 2007a). However, the isotopic fractionation factor for silicon evaporating from such liquids has not been as extensively studied. Here we report new ion microprobe silicon isotopic measurements of residual glass from partial evaporation of Type B CAI liquids into vacuum. The silicon isotopic fractionation is reported as a kinetic fractionation factor, α_Si, corresponding to the ratio of the silicon isotopic composition of the evaporation flux to that of the residual silicate liquid. For CAI-like melts, we find that α_Si = 0.98985 ± 0.00044 (2σ) for ²⁹Si/²⁸Si, with no resolvable variation with temperature over the temperature range of the experiments, 1600-1900 C. This value is different from what has been reported for evaporation of liquid Mg₂SiO₄ (Davis et al., 1990) and of a melt with CI chondritic proportions of the major elements (Wang et al., 2001). There appears to be some compositional control on α_Si, whereas no compositional effects have been reported for α_Mg. We use the values of α_Si and α_Mg to calculate the chemical compositions of the unevaporated precursors of a number of isotopically fractionated CAIs from CV chondrites whose
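
    A sketch of the Rayleigh fractionation relation commonly used with kinetic fractionation factors of this kind, R_residue/R_initial = f^(α-1), where f is the fraction of the element remaining in the melt. The α value is the one quoted above; the use shown is illustrative, not the paper's actual precursor-reconstruction code.

    ```python
    # Rayleigh fractionation of the residue during evaporation:
    # R/R0 = f**(alpha - 1). Alpha from the abstract; usage illustrative.

    ALPHA_SI = 0.98985  # 29Si/28Si kinetic fractionation factor (above)

    def delta29si_residue(f_remaining, delta0=0.0):
        """Per-mil shift of 29Si/28Si in the residue after evaporation."""
        ratio = f_remaining ** (ALPHA_SI - 1.0)
        return (1000.0 + delta0) * ratio - 1000.0

    for f in (0.9, 0.7, 0.5, 0.3):
        print(f"{f:.1f} of Si left -> "
              f"delta29Si = {delta29si_residue(f):+.2f} per mil")
    ```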

  16. Computational methods for global/local analysis

    Science.gov (United States)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  17. Adjustment computations spatial data analysis

    CERN Document Server

    Ghilani, Charles D

    2011-01-01

    The complete guide to adjusting for measurement error, expanded and updated. No measurement is ever exact. Adjustment Computations updates a classic, definitive text on surveying with the latest methodologies and tools for analyzing and adjusting errors, with a focus on least squares adjustments, the most rigorous methodology available and the one on which accuracy standards for surveys are based. This extensively updated Fifth Edition shares new information on advances in modern software and GNSS-acquired data. Expanded sections offer a greater number of computable problems and their worked solu

  18. The ethnoecology of Caiçara metapopulations (Atlantic Forest, Brazil): ecological concepts and questions

    OpenAIRE

    Begossi Alpina

    2006-01-01

    The Atlantic Forest is represented on the coast of Brazil by approximately 7.5% of remnants, much of these concentrated on the country's SE coast. Within these southeastern remnants, we still find the coastal Caiçaras, who descend from Native Indians and Portuguese colonizers. The maintenance of such populations, and their existence in spite of the deforestation that occurred on the Atlantic Forest coast, deserves special attention and analysis. In this study, I address, in particula...

  19. Impact analysis on a massively parallel computer

    International Nuclear Information System (INIS)

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  20. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  1. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  2. The Intelligent CAI System for Chemistry Based on Automated Reasoning

    Institute of Scientific and Technical Information of China (English)

    王晓京; 张景中

    1999-01-01

    A new type of intelligent CAI system for chemistry is developed in this paper based on automated reasoning with chemistry knowledge. The system has shown its ability to solve chemistry problems and to assist students and teachers in study and instruction through its automated reasoning functions. Its open knowledge base and its distinctive style of human-system interface give users more opportunities to acquire living knowledge through active participation. The automated reasoning based on basic chemistry knowledge also opens a new approach to information storage and management for ICAI systems in the sciences.

  3. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  4. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
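
    The core idea behind a derivative-generating compiler such as GRESS can be illustrated with forward-mode automatic differentiation: every arithmetic operation propagates a derivative alongside its value. The sketch below is a minimal, hypothetical Python illustration (GRESS itself instruments FORTRAN source); the model function and parameter value are assumptions for demonstration.

      import math

      class Dual:
          """A value carrying its derivative with respect to one chosen input."""
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der
          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.val + other.val, self.der + other.der)
          __radd__ = __add__
          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.val * other.val,
                          self.der * other.val + self.val * other.der)
          __rmul__ = __mul__

      def exp(x):
          # Chain rule for the exponential: (exp x)' = exp(x) * x'
          return Dual(math.exp(x.val), math.exp(x.val) * x.der)

      # Sensitivity of the response f(k) = k * exp(k) to the parameter k at k = 2:
      k = Dual(2.0, 1.0)                        # seed derivative dk/dk = 1
      y = k * exp(k)
      print("f =", y.val, " df/dk =", y.der)    # df/dk = exp(k) * (1 + k)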

  5. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  6. Computer analysis of ESR spectra

    International Nuclear Information System (INIS)

    Isotropic ESR spectra often display complicated patterns which are difficult to analyze for their hyperfine splitting constants (HSC). To simplify the analysis, we have written a program, suitable for PCs, for iteratively simulating isotropic ESR spectra and determining the simulation which best fits the experimental spectrum. Chapter one gives a brief introduction to the theory of electron spin resonance (ESR). In chapter two the main concepts of the program are presented. Auto simulate is the main algorithm. It calculates the entire field of valid simulations to ensure that the solution set contains all parameter combinations which produce satisfactory spectra. Auto simulate requires prior knowledge of the HSCs and other parameters needed for the simulation, such as the line width, the spectrum width, and the number of magnetic nuclei. Proton Coupling Constant Extraction (PCCE) and autocorrelation are two methods complementing each other to determine the HSCs. Another iterative method, based on a systematic application of the Monte Carlo method, can be applied to generate more accurate values of the line width. In chapter three, the spectra of naphthalene, tetracene, indigo, ox-indigo semiquinone, thio-indigo and 2,2'-dipyridyl-Na complex free radicals are analyzed. The results are compared to the literature values; good agreement is obtained for different resolutions and noise-to-signal ratios. In the last chapter a printout of the program is presented. The programming language used is Microsoft QuickBASIC version 7.1
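
    To make the simulation step concrete, here is a minimal sketch, not the QuickBASIC program itself, of how an isotropic ESR spectrum can be generated from a set of hyperfine splitting constants: each spin-1/2 nucleus doubles the stick spectrum, and the sticks are then broadened with Lorentzian lines. The field range, HSC values, and line width are illustrative assumptions.

      import numpy as np

      def stick_spectrum(hscs):
          """hscs: list of (a, n) pairs -- splitting constant a (mT) and the
          number n of equivalent spin-1/2 nuclei. Returns (positions, weights)."""
          pos, wt = np.array([0.0]), np.array([1.0])
          for a, n in hscs:
              for _ in range(n):     # each nucleus splits every line in two
                  pos = np.concatenate([pos - a / 2, pos + a / 2])
                  wt = np.concatenate([wt, wt]) / 2
          return pos, wt

      def spectrum(field, hscs, lw):
          # Sum of Lorentzians centred on each stick position.
          pos, wt = stick_spectrum(hscs)
          return sum(w * lw**2 / ((field - p)**2 + lw**2) for p, w in zip(pos, wt))

      B = np.linspace(-1.5, 1.5, 2000)                # field offset, mT
      y = spectrum(B, [(0.5, 2), (0.2, 2)], lw=0.02)  # two pairs of equivalent protons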

  7. Computer aided nonlinear electrical networks analysis

    Science.gov (United States)

    Slapnicar, P.

    1977-01-01

    Techniques used in simulating an electrical circuit with nonlinear elements for use in computer-aided circuit analysis programs are described. Elements of the circuit include capacitors, resistors, inductors, transistors, diodes, and voltage and current sources (constant or time varying). Simulation features are discussed for dc, ac, and/or transient circuit analysis. Calculations are based on the model approach of formulating the circuit equations. A particular solution of transient analysis for nonlinear storage elements is described.
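
    The usual core of such dc/transient simulation is a Newton-Raphson iteration on the nodal equations. As a hedged, self-contained illustration (not the program described above), the sketch below solves the dc operating point of a single node where a resistor fed from a source drives a diode; all component values and the diode model parameters are assumptions.

      import math

      VS, R = 5.0, 1000.0          # source voltage (V) and series resistance (ohm)
      IS, VT = 1e-12, 0.02585      # diode saturation current (A), thermal voltage (V)

      v = 0.6                      # initial guess for the diode node voltage
      for _ in range(50):
          i_d = IS * (math.exp(v / VT) - 1.0)         # diode current
          f = (VS - v) / R - i_d                      # KCL residual at the node
          df = -1.0 / R - IS / VT * math.exp(v / VT)  # its derivative w.r.t. v
          step = -f / df                              # Newton update
          v += max(min(step, 0.1), -0.1)              # damped for robustness
          if abs(step) < 1e-12:
              break
      print(f"operating point: v = {v:.4f} V, i = {(VS - v) / R * 1e3:.3f} mA")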

  8. Computer graphics in reactor safety analysis

    International Nuclear Information System (INIS)

    This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the modelling of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. The graphics output used in actual safety analysis are used to illustrate the capabilities of each code. 5 refs., 10 figs

  9. Interactive computer analysis of nuclear backscattering spectra

    International Nuclear Information System (INIS)

    A review is made of a computer-based interactive nuclear backscattering analysis system. Users without computer experience can develop moderate competence with the system after only brief instruction because of the menu-driven organization. Publishable-quality figures can be obtained without any computer expertise. Among the quantities which can be displayed over the data are depth scales for any element, element identification, relative concentrations and theoretical spectra. Captions and titling can be made from a selection of 30 font styles. Lettering is put on the graphs under joystick control such that placement is exact without needing complicated commands. (orig.)

  10. Computer Language Efficiency via Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Andrea Ellero

    2011-01-01

    Full Text Available The selection of the computer language to adopt is usually driven by intuition and expertise, since it is very difficult to compare languages taking into account all their characteristics. In this paper, we analyze the efficiency of programming languages through Data Envelopment Analysis. We collected the input data from The Computer Language Benchmarks Game: we consider a large set of languages in terms of computational time, memory usage, and source code size. Various benchmark problems are tackled. We analyze the results first of all considering programming languages individually. Then, we evaluate families of them sharing some characteristics, for example, being compiled or interpreted.
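
    For readers unfamiliar with the method, the CCR efficiency score of one decision-making unit (here, a language) comes from a small linear program: choose nonnegative input/output weights that maximize the unit's weighted output while its weighted input is normalized to one and no unit scores above one. A minimal sketch with toy data (not the paper's benchmark data) follows.

      import numpy as np
      from scipy.optimize import linprog

      X = np.array([[2.0, 4.0], [3.0, 2.0], [4.0, 5.0]])  # inputs: time, memory
      Y = np.array([[1.0], [1.0], [1.0]])                 # output: benchmarks solved

      def ccr_efficiency(o):
          m, s = X.shape[1], Y.shape[1]
          # Decision variables: output weights u (s of them), input weights v (m).
          c = np.concatenate([-Y[o], np.zeros(m)])        # maximize u . y_o
          A_ub = np.hstack([Y, -X])                       # u.y_j - v.x_j <= 0 for all j
          b_ub = np.zeros(len(X))
          A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]   # v . x_o = 1
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                        bounds=[(0, None)] * (s + m))
          return -res.fun                                  # efficiency in (0, 1]

      for o in range(len(X)):
          print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")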

  11. Computational structural analysis and finite element methods

    CERN Document Server

    Kaveh, A

    2014-01-01

    Graph theory gained initial prominence in science and engineering through its strong links with matrix algebra and computer science. Moreover, the structure of the mathematics is well suited to that of engineering problems in analysis and design. The methods of analysis in this book employ matrix algebra, graph theory and meta-heuristic algorithms, which are ideally suited for modern computational mechanics. Efficient methods are presented that lead to highly sparse and banded structural matrices. The main features of the book include: application of graph theory for efficient analysis; extension of the force method to finite element analysis; application of meta-heuristic algorithms to ordering and decomposition (sparse matrix technology); efficient use of symmetry and regularity in the force method; and simultaneous analysis and design of structures.

  12. CAI in New York City: Report on the First Year's Operations

    Science.gov (United States)

    Butler, Cornelius F.

    1969-01-01

    "The nation's largest CAI operation in a public school system concluded its first full year of operation in June, 1969. The results indicate a very definite success for education's most closely watched use of technology. Three major criteria for success of such a project are 1) acceptance of CAI by the schools and their pupils, 2) per pupil costs…

  13. Structural basis of Na+-independent and cooperative substrate/product antiport in CaiT

    NARCIS (Netherlands)

    Schulze, Sabrina; Köster, Stefan; Geldmacher, Ulrike; Terwisscha van Scheltinga, Anke C.; Kühlbrandt, Werner

    2010-01-01

    Transport of solutes across biological membranes is performed by specialized secondary transport proteins in the lipid bilayer, and is essential for life. Here we report the structures of the sodium-independent carnitine/butyrobetaine antiporter CaiT from Proteus mirabilis (PmCaiT) at 2.3-Å and from

  14. Brief Introduction to the Foundation of CAI Shidong Award for Plasma Physics

    Institute of Scientific and Technical Information of China (English)

    SHENG Zhengming

    2010-01-01

    The late Academician Professor CAI Shidong was an outstanding plasma physicist who made seminal contributions in both fundamental plasma theory and controlled thermonuclear fusion energy research. Professor CAI was also one of the pioneers of China's plasma physics research. In 1973, Professor CAI decided to leave the U.S. and return to China in order to help push forward plasma physics research in China. Professor CAI formed a research group consisting of young scientists and carried out high-level work in this important physics discipline. He worked tirelessly, set examples by his own deeds, and made outstanding contributions to plasma physics research, to educating younger generations of plasma physicists, and to establishing collaborations with plasma scientists in other Asian-African developing nations. In short, Professor CAI devoted the best years of his life to China's plasma physics research.

  15. Calcium-aluminum-rich inclusions with fractionation and unknown nuclear effects (FUN CAIs)

    DEFF Research Database (Denmark)

    Krot, Alexander N.; Nagashima, Kazuhide; Wasserburg, Gerald J.;

    2014-01-01

    We present a detailed characterization of the mineralogy, petrology, and oxygen isotopic compositions of twelve FUN CAIs, including C1 and EK1-4-1 from Allende (CV), that were previously shown to have large isotopic fractionation patterns for magnesium and oxygen, and large isotopic anomalies of...... several elements. The other samples show more modest patterns of isotopic fractionation and have smaller but significant isotopic anomalies. All FUN CAIs studied are coarse-grained igneous inclusions: Type B, forsterite-bearing Type B, compact Type A, and hibonite-rich. Some inclusions consist of two...... mineralogically distinct lithologies, forsterite-rich and forsterite-free/poor. All the CV FUN CAIs experienced postcrystallization open-system iron-alkali-halogen metasomatic alteration resulting in the formation of secondary minerals commonly observed in non-FUN CAIs from CV chondrites. The CR FUN CAI GG#3...

  16. Safety analysis of control rod drive computers

    International Nuclear Information System (INIS)

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety relevant tasks. The experience gained with the control rod positioning processor confirms that computers are not less reliable than conventional instrumentation and control systems for comparable tasks. The examination and evaluation of computers for safety relevant tasks can be done with programme analysis or statistical evaluation of the operating experience. Programme analysis is recommended for seldom used and well structured programmes. For programmes with a long cumulated operating time a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process controlling computers or microprocessors can be qualified for safety relevant tasks without undue effort. (orig./HP)

  17. Computation of Regularized Linear Discriminant Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Valenta, Zdeněk; Duintjer Tebbens, Jurjen

    ISI, 2014. s. 8-8. [COMPSTAT 2014. International Conference on Computational Statistics /21./. 19.08.2014-22.08.2014, Geneva] Institutional support: RVO:67985807 Keywords : classification analysis * regularization * Matrix decomposition * shrinkage eigenvalues * high-dimensional data Subject RIV: BB - Applied Statistics, Operational Research
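
    The record names regularized linear discriminant analysis with shrinkage eigenvalues for high-dimensional data. As a general illustration of that technique (not the authors' specific method), the sketch below uses scikit-learn's shrinkage LDA, which replaces the singular sample covariance (features outnumber observations) with a well-conditioned Ledoit-Wolf estimate; the synthetic data set is an assumption.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      n, p = 40, 200                       # far more features than observations
      X = rng.normal(size=(n, p))
      y = np.repeat([0, 1], n // 2)
      X[y == 1, :10] += 1.0                # class signal in the first 10 features

      # 'lsqr' with shrinkage='auto' uses the Ledoit-Wolf regularized covariance,
      # which stays invertible even when p >> n.
      clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(X, y)
      print("training accuracy:", clf.score(X, y))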

  18. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  19. Risk analysis enhancement via computer applications

    International Nuclear Information System (INIS)

    Since the development of Reliability Centered Maintenance (RCM) by the airline industry, there have been various alternative approaches to applying this methodology to the nuclear power industry. Some of the alternatives were developed in order to shift the focus of analyses onto plant-specific concerns, but the great majority of alternatives were developed in an attempt to reduce the effort required to conduct an RCM analysis on as large a scale as a nuclear power station. Computer applications have not only reduced the amount of analysis time but have also produced more consistent results, provided an effective working RCM analysis tool, and made it possible to automate a Living Program. During the development of an RCM Program at South Carolina Electric and Gas' V.C. Summer Nuclear Station (VCSNS), computer applications were developed. 6 figs, 1 tab

  20. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...... on information obtained from software profiling and the resulting design is validated through cosimulation. The achieved speed-up is estimated based on an analysis of profiling information from different sets of input data and various architectural options....

  1. Computation system for nuclear reactor core analysis

    International Nuclear Information System (INIS)

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code, treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals

  2. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...... on information obtained from software profiling and the resulting design is validated through cosimulation. The achieved speed-up is estimated based on an analysis of profiling information from different sets of input data and various architectural options....

  3. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  4. The impact of computer-based interactive instruction (CBII) in improving the teaching-learning process in introductory college physics

    Science.gov (United States)

    Jawad, Afif A.

    Institutes are incorporating computer-assisted instruction (CAI) into their classrooms in an effort to enhance learning. The implementation of computers into the classroom is parallel with education's role of keeping abreast of societal demands. The number of microcomputers in schools has increased tremendously. Computer Based Interactive Instruction (CBII) software is available for the language arts, mathematics, science, social studies, etc. Traditional instruction supplemented with CAI seems to be more effective than traditional instruction alone. Although there is a large quantity of research regarding specific aspects of learning through computers, there seems to be a lack of information regarding the impact of computers upon student success. The goal of this study is to determine to what extent CAI is implemented in higher education in the USA. Instructors from 38 states were surveyed to compare the institutes that use Computer Based Interactive Instruction with the ones that still apply the traditional delivery method. Based on the analysis of the data gathered during this study, it is concluded that the majority of instructors now use computers in one form or another. This study has determined that the computer is a major component in the teaching of introductory physics, and therefore may be a suitable substitute for the traditional delivery system. Computers as an instructional delivery system are an alternative that may result in a higher level of student learning for many higher education courses.

  5. Probabilistic structural analysis computer code (NESSUS)

    Science.gov (United States)

    Shiao, Michael C.

    1988-01-01

    Probabilistic structural analysis has been developed to analyze the effects of fluctuating loads, variable material properties, and uncertain analytical models, especially for high-performance structures such as SSME turbopump blades. The computer code NESSUS (Numerical Evaluation of Stochastic Structure Under Stress) was developed to serve as a primary computation tool for the characterization of the probabilistic structural response due to stochastic environments by statistical description. The code consists of three major modules: NESSUS/PRE, NESSUS/FEM, and NESSUS/FPI. NESSUS/PRE is a preprocessor which decomposes the spatially correlated random variables into a set of uncorrelated random variables using a modal analysis method. NESSUS/FEM is a finite element module which provides structural sensitivities to all the random variables considered. NESSUS/FPI is the Fast Probability Integration module, by which a cumulative distribution function or a probability density function is calculated.
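
    FPI itself is a specialized fast integration scheme; as a simpler stand-in that conveys what NESSUS characterizes, the sketch below estimates the probabilistic response of a structure by plain Monte Carlo, using the cantilever tip deflection delta = P L^3 / (3 E I). The load and modulus distributions and the 20 mm limit are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(1)
      N = 100_000
      P = rng.normal(10e3, 1.5e3, N)             # tip load (N), random
      E = rng.lognormal(np.log(200e9), 0.05, N)  # Young's modulus (Pa), random
      L, I = 2.0, 8.0e-6                         # length (m), second moment of area (m^4)

      delta = P * L**3 / (3 * E * I)             # tip deflection for each realization

      limit = 0.020                              # 20 mm serviceability limit
      print("P(deflection > 20 mm) ~", np.mean(delta > limit))
      print("99th percentile deflection (mm):", 1e3 * np.quantile(delta, 0.99))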

  6. Al-Mg systematics of CAIs, POI, and ferromagnesian chondrules from Ningqiang

    OpenAIRE

    Hsu, Weibiao; Huss, Gary R.; Wasserburg, G. J.

    2003-01-01

    We have made aluminum-magnesium isotopic measurements on 4 melilite-bearing calcium-aluminum-rich inclusions (CAIs), 1 plagioclase-olivine inclusion (POI), and 2 ferromagnesian chondrules from the Ningqiang carbonaceous chondrite. All of the CAIs measured contain clear evidence for radiogenic ^(26)Mg^* from the decay of ^(26)Al (τ = 1.05 Ma). Although the low Al/Mg ratios of the melilites introduce large uncertainties, the inferred initial ^(26)Al/^(27)Al ratios for the CAIs are generally con...

  7. DC operating point analysis using evolutionary computing

    OpenAIRE

    Crutchley, DA; Zwolinski, M.

    2004-01-01

    This paper discusses and evaluates a new approach to operating point analysis based on evolutionary computing (EC). EC can find multiple solutions to a problem by using a parallel search through a population. At the operating point(s) of a circuit the overall error has a minimum value. Therefore, we use an Evolutionary Algorithm (EA) to search the solution space to find these minima. Various evolutionary algorithms are described. Several such algorithms have been implemented in a full circuit...
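
    A minimal sketch of the idea, not the authors' implementation: treat the squared Kirchhoff-current-law residual as the fitness to minimize and let a population search the voltage axis, so several operating points of a multi-solution circuit can be found in one run. The N-shaped device curve and all parameters below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(2)
      VS, R = 5.0, 10.0                               # source voltage (V), resistor (ohm)

      def residual(v):
          i_dev = 10.0 * (v**3 - 1.5 * v**2 + 0.6 * v)   # N-shaped device curve (A)
          return (VS - v) / R - i_dev                    # KCL error at the single node

      pop = rng.uniform(0.0, 1.5, size=60)               # initial node-voltage guesses
      for _ in range(300):
          children = pop + rng.normal(0.0, 0.02, size=pop.size)  # Gaussian mutation
          both = np.concatenate([pop, children])
          pop = both[np.argsort(residual(both) ** 2)][:60]       # (mu + lambda) selection

      # This circuit has three operating points (near 0.11 V, 0.54 V and 0.86 V).
      good = pop[np.abs(residual(pop)) < 1e-2]
      print("operating points found (V):", sorted(set(np.round(good, 2))))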

  8. Computed tomographic analysis of urinary calculi

    Energy Technology Data Exchange (ETDEWEB)

    Newhouse, J.H.; Prien, E.L.; Amis, E.S. Jr.; Dretler, S.P.; Pfister, R.C.

    1984-03-01

    Excised urinary calculi were subjected to computed tomographic (CT) scanning in an attempt to determine whether CT attenuation values would allow accurate analysis of stone composition. The mean, maximum, and modal pixel densities of the calculi were recorded and compared; the resulting values reflected considerable heterogeneity in stone density. Although uric acid and cystine calculi could be identified by their discrete ranges on one or more of these criteria, calcium-containing stones of various compositions, including struvite, could not be distinguished reliably. CT analysis of stone density is not likely to be more accurate than standard radiography in characterizing stone composition in vivo.

  9. Computer analysis of HIV epitope sequences

    Energy Technology Data Exchange (ETDEWEB)

    Gupta, G.; Myers, G.

    1990-01-01

    Phylogenetic tree analysis provides us with important general information regarding the extent and rate of HIV variation. Currently we are attempting to extend computer analysis and modeling to the V3 loop of the type 2 virus and its simian homologues, especially in light of the prominent role the latter will play in animal model studies. Moreover, it might be possible to attack the slightly similar V4 loop by this approach. However, the strategy relies very heavily upon "natural" information and constraints; thus there exist severe limitations upon the general applicability, in addition to uncertainties with regard to long-range residue interactions. 5 refs., 3 figs.

  10. The use of CAI courseware in veterinary parasitology teaching

    Institute of Scientific and Technical Information of China (English)

    王建民; 姚龙泉; 刘明春; 何剑斌; 葛云侠

    2012-01-01

    Materials were collected through a variety of channels to prepare a computer-assisted instruction (CAI) courseware for Veterinary Parasitology suited to undergraduate students of veterinary medicine, turning what had been a dry lecture course into a vivid, visual one. The courseware lays a good foundation for the students' future diagnosis and classification of parasitic diseases.

  11. CAD/CAM/CAI Application for High-Precision Machining of Internal Combustion Engine Pistons

    Directory of Open Access Journals (Sweden)

    V. V. Postnov

    2014-07-01

    Full Text Available CAD/CAM/CAI application solutions for the machining of internal combustion engine pistons were analyzed. A low-volume production technology for internal combustion engine pistons was proposed, and a fixture for a CNC turning center was designed.

  12. 77 FR 9625 - Presentation of Final Conventional Conformance Test Criteria and Common Air Interface (CAI...

    Science.gov (United States)

    2012-02-17

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF COMMERCE National Institute of Standards and Technology Presentation of Final Conventional Conformance Test Criteria and Common Air Interface (CAI) Features/Functionalities Under Test in the Project 25...

  13. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works, as well as how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author's website and SpringerLink.

  14. Strong Calcite-Like Spectra Cathodoluminescence Emission from Allende Meteorite Cai Phases

    OpenAIRE

    García Guinea, Javier; Tornos Arroyo, Fernando; Azumendi García, Oscar; Ruiz Pérez, Javier; Correcher Delgado, Virgilio

    2011-01-01

    Calcium–aluminum-rich inclusions (CAIs) of the Allende CV3 chondrite were studied by Environmental Scanning Electron Microscopy (ESEM), Energy Dispersive Spectrometry (EDS), Backscattering (BS), and Spectra Cathodoluminescence (CL). CAI minerals show spectra CL curves exceeding 450,000 a.u., with large homogeneity across the white inclusions. The CL curve features fit perfectly with terrestrial patterns of stressed specimens of weathered marble and limestone in which hydroxyl gr...

  15. Design of CAI Courseware Based on Virtual Reality Mechanism

    Institute of Scientific and Technical Information of China (English)

    管群

    2001-01-01

    In this paper, the application features and significance of VR technology in the educational field are summarized. In particular, the design mechanism of CAI courseware for individualized instruction is studied, and a virtual reality mechanism is used to give the user a learning-while-doing environment in CAI courseware for computer application subjects. The design principles, the technical approach, some of the algorithm flowcharts, and the operation-exercise interface are given.

  16. Analysis of a Model for Computer Virus Transmission

    OpenAIRE

    Peng Qin

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the addition of new computers to the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. The computer virus model is established. Through analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our t...
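
    The paper's model is summarized only in outline here, so the sketch below is an assumed SIR-style variant with inflow of new machines and retirement of old ones, illustrating how the virus-free and endemic equilibria and the threshold quantity R0 arise; all rate constants are invented for illustration.

      from scipy.integrate import solve_ivp

      b, d, beta, r = 5.0, 0.01, 0.0005, 0.05   # inflow, retirement, contact, cure rates

      def rhs(t, y):
          S, I = y
          return [b - beta * S * I - d * S + r * I,   # susceptible machines
                  beta * S * I - (d + r) * I]         # infected machines

      # Virus-free equilibrium has S* = b/d; the basic reproduction number is
      # R0 = beta * S* / (d + r). Here R0 > 1, so the infection persists.
      print("R0 =", beta * (b / d) / (d + r))

      sol = solve_ivp(rhs, (0, 1000), [b / d, 1.0])
      print("state after t = 1000:", sol.y[:, -1])    # approaches about (120, 380)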

  17. Computed image analysis of neutron radiographs

    International Nuclear Information System (INIS)

    Similar to X-radiography, but using neutrons as the penetrating particles, there is in practice a nondestructive technique named neutron radiology. When the information is registered on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β, γ) creating a latent image, the technique is named neutron radiography. A radiographic industrial film that contains the image of the internal structure of an object, obtained by neutron radiography, must be subsequently analyzed to obtain qualitative and quantitative information about the structural integrity of that object. It is possible to do a computed analysis of a film using a facility with the following main components: an illuminator for film, a CCD video camera and a computer (PC) with suitable software. The qualitative analysis is intended to reveal possible anomalies of the structure due to manufacturing processes or induced by working processes (for example, the irradiation activity in the case of nuclear fuel). The quantitative determination is based on measurements of some image parameters: dimensions, optical densities. The illuminator has been built specially to perform this application but can be used for simple visual observation. The illuminated area is 9x40 cm. The frame of the system is an Abbe comparator (Carl Zeiss Jena), which has been adapted to this application. The video camera captures the image, which is stored and processed by the computer. A special program, SIMAG-NG, has been developed at INR Pitesti which, together with the program SMTV II of the acquisition module SM 5010, can analyze the images of a film. The major application of the system was the quantitative analysis of a film containing the images of some nuclear fuel pins beside a dimensional standard. The system was used to measure the length of the pellets of the TRIGA nuclear fuel. (authors)

  18. Computation of Regularized Linear Discriminant Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Valenta, Zdeněk; Duintjer Tebbens, Jurjen

    Geneva: Centre International de Conferences, 2014 - (Gilli, M.; Nieto-Reyes, A.; González-Rodríguez, G.), s. 1-8 ISBN 978-2-8399-1347-8. [COMPSTAT 2014. International Conference on Computational Statistics /21./. Geneva (CH), 19.08.2014-22.08.2014] R&D Projects: GA ČR GA13-06684S Institutional support: RVO:67985807 Keywords : classification analysis * regularization * Matrix decomposition * shrinkage eigenvalues * high-dimensional data Subject RIV: BB - Applied Statistics, Operational Research

  19. Computer modelling for LOCA analysis in PHWRs

    International Nuclear Information System (INIS)

    A computer code, THYNAC, developed for the analysis of thermal-hydraulic transient phenomena during a LOCA in a PHWR-type reactor and its primary coolant system is described. The code predicts the coolant voiding rate in the core, the coolant discharge rate from the break, the primary system depressurization history, and the temperature history of both fuel and fuel clad. The reactor system is modelled as a set of connected fluid segments which represent piping, feeders, coolant channels, etc. A finite-difference method is used in the code. The modelling in the code of various specific phenomena, e.g. two-phase pressure drop, slip flow, and pumps, is described. (M.G.B.)

  20. Progress in computer vision and image analysis

    CERN Document Server

    Bunke, Horst; Sánchez, Gemma; Otazu, Xavier

    2009-01-01

    This book is a collection of scientific papers published during the last five years, showing a broad spectrum of actual research topics and techniques used to solve challenging problems in the areas of computer vision and image analysis. The book will appeal to researchers, technicians and graduate students. Contents: An Appearance-Based Method for Parametric Video Registration (X Orriols et al.); Relevance of Multifractal Textures in Static Images (A Turiel); Potential Fields as an External

  1. FORTRAN computer program for seismic risk analysis

    Science.gov (United States)

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
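
    The hazard computation such a program performs can be sketched as one integral: the annual rate at which ground motion A exceeds a level a is the event rate times the probability of exceedance averaged over the magnitude distribution. The toy attenuation law, source geometry, and all coefficients below are assumptions, not the program's published attenuation functions.

      import numpy as np
      from scipy.stats import norm

      nu = 0.2                     # events per year with M >= m_min on the source
      m_min, m_max, b_val = 5.0, 8.0, 1.0
      R = 30.0                     # source-to-site distance (km)
      sigma_lnA = 0.6              # aleatory scatter of the attenuation law

      def median_lnA(m, r):        # toy attenuation law: median ln A, A in g
          return -3.5 + 0.9 * m - 1.2 * np.log(r + 10.0)

      def exceedance_rate(a):
          m = np.linspace(m_min, m_max, 400)
          beta = b_val * np.log(10.0)      # truncated Gutenberg-Richter density
          fm = beta * np.exp(-beta * (m - m_min)) / (1 - np.exp(-beta * (m_max - m_min)))
          p = norm.sf((np.log(a) - median_lnA(m, R)) / sigma_lnA)   # P[A > a | m]
          return nu * np.sum(p * fm) * (m[1] - m[0])

      for a in (0.05, 0.1, 0.2, 0.4):
          lam = exceedance_rate(a)
          print(f"a = {a:.2f} g: {lam:.2e}/yr, 50-yr exceedance P = {1 - np.exp(-50 * lam):.3f}")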

  2. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classification in the context of the social sciences. It also covers various real-life examples such as t

  3. Microstructure and effective behavior - analysis and computation

    International Nuclear Information System (INIS)

    Material behavior is determined by features on a number of length scales between the atomistic and the macroscopic. As full direct resolution of all scales is out of reach, there is intense research on analytical and computational tools that can bridge different scales, and a number of different schemes have been proposed. One key issue is to identify which information on the finer scale is needed to determine the behavior on the coarser scale. To shed some light on this issue we will focus on a number of case studies to understand the passage from smaller scales, where the material is described by a multi-well non-convex energy, to macroscopic behavior. Examples include shape-memory materials, new giant magnetostrictive materials and nematic elastomers. Similar ideas have been used by others and by us to understand dislocation arrangements, blistering of thin films and magnetic microstructures. We will discuss three algorithmic approaches to analyze effective behavior: purely analytical, hybrid analytical-computational, and computation inspired by analysis. Refs. 5 (author)

  4. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  5. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any such system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where the interfacing of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
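
    As a small concrete taste of method (iii), the sketch below uses SymPy in place of MAPLE to derive second-order stochastic perturbation estimates of the mean and variance of a response u(b) of one random parameter b; the response function is an assumption.

      import sympy as sp

      b, b0, sigma = sp.symbols("b b0 sigma", positive=True)
      u = 1 / b                         # e.g. a deflection ~ 1/stiffness

      # Second-order perturbation about the mean b0 of the random input:
      #   E[u]   ~ u(b0) + u''(b0) * sigma**2 / 2
      #   Var[u] ~ (u'(b0))**2 * sigma**2      (first-order term)
      du = sp.diff(u, b)
      d2u = sp.diff(u, b, 2)
      mean_u = u.subs(b, b0) + sp.Rational(1, 2) * d2u.subs(b, b0) * sigma**2
      var_u = du.subs(b, b0) ** 2 * sigma**2

      print("E[u]   ~", sp.simplify(mean_u))   # 1/b0 + sigma**2/b0**3
      print("Var[u] ~", sp.simplify(var_u))    # sigma**2/b0**4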

  6. Experimental analysis of computer system dependability

    Science.gov (United States)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.

  7. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at the IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce the computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and the KENO-IV code shows conservative results when the generalized geometry option is not used. (author)

  8. Good relationships between computational image analysis and radiological physics

    International Nuclear Information System (INIS)

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics

  9. Good relationships between computational image analysis and radiological physics

    Science.gov (United States)

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-01

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  10. Good relationships between computational image analysis and radiological physics

    Energy Technology Data Exchange (ETDEWEB)

    Arimura, Hidetaka, E-mail: arimurah@med.kyushu-u.ac.jp [Division of Medical Quantum Science, Department of Health Sciences, Faculty of Medical Sciences, Kyushu University (Japan); Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen [Division of Medical Quantum Science, Department of Health Sciences, Graduate School of Medical Sciences, Kyushu University (Japan)

    2015-09-30

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  11. The Anatomy and Bulk Composition of CAI Rims in the Vigarano (CV3) Chondrite

    Science.gov (United States)

    Ruzicka, A.; Boynton, W. V.

    1993-07-01

    A striking feature of Ca,Al-rich inclusions (CAIs) in chondrites is the presence of mineralogical layers that typically form rim sequences up to 50 micrometers thick [1]. Many ideas regarding the origin of CAI rims have been proposed, but none are entirely satisfactory. The detailed mineralogy and bulk compositions of relatively unaltered CAI rims in the Vigarano (CV3) chondrite described here provide constraints on hypotheses of rim formation. Rim Mineralogy: CAIs in Vigarano consist of melilite (mel)- and spinel (sp)- rich varieties, both of which are rimmed [2]. Around mel-rich objects, the layer sequence is CAI interior --> sp-rich layer (sometimes absent) --> mel/anorthite (anor) layer --> Ti-Al-rich clinopyroxene (Tpx) layer --> Al- diopside (Al-diop) layer --> olivine (ol) +/- Al-diop layer --> host matrix. The sequence around sp-rich objects differs from this in that the mel/anor layer is absent. Both the sp-rich layer around mel-cored CAIs and the cores of sp-rich CAIs in Vigarano are largely comprised of a fine-grained (anor layer is sometimes monomineralic, consisting of mel alone, or bimineralic, consisting of both mel and anor. Where bimineralic, anor typically occurs in the outer part of the layer. In places, anor (An(sub)99-100) has partially altered to nepheline and voids. Rim mel is systematically less gehlenitic than mel in the CAI interiors, especially compared to mel in the interior adjacent to the rims. The Tpx layer (>2 and up to 15 wt% TiO2) and Al-diop layer ( sp + fo --> sp + fo + anor or mel or Tpx) that does not correspond to observed rim sequences. It thus appears that (1) the rim region did not form through crystallization of molten CAIs; and (2) rim layers did not originate solely by the crystallization of a melt layer present on a solid CAI core [4,5]. References: [1] Wark D. A. and Lovering J. F. (1977) Proc. LSC 8th, 95-112. [2] Ruzicka A. and Boynton W. V. (1991) Meteoritics, 26, 390-391. [3] Stolper E. (1982) GCA, 46, 2159

  12. Framework for Computer Assisted Instruction Courseware: A Case Study.

    Science.gov (United States)

    Betlach, Judith A.

    1987-01-01

    Systematically investigates, defines, and organizes variables related to production of internally designed and implemented computer assisted instruction (CAI) courseware: special needs of users; costs; identification and definition of realistic training needs; CAI definition and design methodology; hardware and software requirements; and general…

  13. Computational analysis of PARAMETR facility experiments

    International Nuclear Information System (INIS)

    Full text of publication follows: Results of calculations of PARAMETR experiments are given in the paper. The PARAMETR facility is designed to research the phenomena relevant to typical LOCA scenarios (including severe accidents) of VVER-type reactors. The investigations at the PARAMETR facility are directed at experimental research on fuel rod and core material behavior, hydrogen generation processes, and the melting and interaction of core materials during severe accidents. The main facility components are a rod bundle of 1250 mm heated length (up to 37 rods can be used), an electrical power source, steam and water supply systems, and instrumentation. The bundle is a mix of fresh fuel rods and electrically heated rods with uranium pellets and a tungsten heater inside. The main objectives of the calculations are analysis of computer code capability, in particular of RELAP/SCDAPSIM, to model severe accidents, and identification of the impact of major parameters on calculation results, and thus accident analysis improvements. RELAP/SCDAPSIM calculations were used to choose key parameters of the experiments. An analysis of the influence of thermal insulation properties, uncertainties of heater geometry, and insulation thermal conductivity was done. Conditions and parameters needed for an intensive zirconium reaction to take hold were investigated. As a whole, calculation results showed good agreement with experiments. Some key points were observed, such as the essential impact of the preheating phase and the importance of thermal insulation material properties. Proper modeling of particular processes during the preheating phase was very important since this phase defined the bundle temperature level in the heating phase. There were some difficulties here. For instance, overestimation of temperatures had been observed until axial profiling of thermal conductivity was introduced. Some more appropriate models were used to reach better agreement with experiments. The work done can be used in safety analysis of VVER type reactors and allow improving of

  14. Computer-aided Fault Tree Analysis

    International Nuclear Information System (INIS)

    A computer-oriented methodology for deriving minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense
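
    FTAP's own algorithms are more elaborate; as an illustration of the underlying idea, the sketch below derives minimal cut sets of a small gate tree by top-down expansion in the spirit of the classic MOCUS procedure. The example tree is an assumption.

      from itertools import product

      # Gates: ('AND', children) or ('OR', children); leaves are basic event names.
      tree = ('AND', [('OR', ['pump_A_fails', 'valve_A_stuck']),
                      ('OR', ['pump_B_fails', 'valve_B_stuck', 'power_loss'])])

      def cut_sets(node):
          if isinstance(node, str):
              return [frozenset([node])]
          kind, children = node
          child_sets = [cut_sets(c) for c in children]
          if kind == 'OR':                     # union of the children's cut sets
              return [s for sets in child_sets for s in sets]
          # AND: every combination of one cut set per child, merged together.
          return [frozenset().union(*combo) for combo in product(*child_sets)]

      def minimize(sets):
          # Keep only sets with no strict subset also present (minimal cut sets).
          return [s for s in sets if not any(t < s for t in sets)]

      for cs in minimize(cut_sets(tree)):
          print(sorted(cs))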

  15. Computed tomographic analysis of renal calculi

    Energy Technology Data Exchange (ETDEWEB)

    Hillman, B.J.; Drach, G.W.; Tracey, P.; Gaines, J.A.

    1984-03-01

    An in vitro study sought to determine the feasibility of using computed tomography (CT) to analyze the chemical composition of renal calculi and thus aid in selecting the best treatment method. Sixty-three coded calculi were scanned in a water bath. Region-of-interest measurements provided the mean, standard deviation, and minimum and maximum pixel values for each stone. These parameters were correlated with aspects of the stones' chemical composition. A multivariate analysis showed that the mean and standard deviation of the stones' pixel values were the best CT parameters for differentiating types of renal calculi. By using computerized mapping techniques, uric acid calculi could be perfectly differentiated from struvite and calcium oxalate calculi. The latter two types also were differentiable, but to a lesser extent. CT has a potential role as an adjunct to clinical and laboratory methods for determining the chemical composition of renal calculi in an effort to select optimal treatment.

  16. Review of Computational Stirling Analysis Methods

    Science.gov (United States)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal-to-electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in its current designs could be better understood. However, they are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

  17. Computational Models for Analysis of Illicit Activities

    DEFF Research Database (Denmark)

    Nizamani, Sarwat

    devise policies to minimize them. These activities include cybercrimes, terrorist attacks or violent actions in response to certain world issues. Besides such activities, there are several other related activities worth analyzing, for which computational models have been presented in this thesis.... These models include a model for analyzing the evolution of terrorist networks; a text classification model for detecting suspicious text and identifying the suspected authors of anonymous emails; and a semantic analysis model for news reports, which may help analyze the illicit activities in certain area... with location and temporal information. For the network evolution, the hierarchical agglomerative clustering approach has been applied to terrorist networks as case studies. The networks' evolutions show how individual actors who are initially isolated from each other are converted into small groups, which...

  18. Computed tomographic analysis of renal calculi

    International Nuclear Information System (INIS)

    An in vitro study sought to determine the feasibility of using computed tomography (CT) to analyze the chemical composition of renal calculi and thus aid in selecting the best treatment method. Sixty-three coded calculi were scanned in a water bath. Region-of-interest measurements provided the mean, standard deviation, and minimum and maximum pixel values for each stone. These parameters were correlated with aspects of the stones' chemical composition. A multivariate analysis showed that the mean and standard deviation of the stones' pixel values were the best CT parameters for differentiating types of renal calculi. By using computerized mapping techniques, uric acid calculi could be perfectly differentiated from struvite and calcium oxalate calculi. The latter two types also were differentiable, but to a lesser extent. CT has a potential role as an adjunct to clinical and laboratory methods for determining the chemical composition of renal calculi in an effort to select optimal treatment

  19. Computational based functional analysis of Bacillus phytases.

    Science.gov (United States)

    Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti

    2016-02-01

    Phytase is an enzyme which catalyzes the total hydrolysis of phytate to less phosphorylated myo-inositol derivatives and inorganic phosphate, digesting the otherwise indigestible phytate present in seeds and grains and thereby providing digestible phosphorus, calcium and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals so that the bioavailability of phytic acid-bound phosphate increases, ultimately enhancing the nutritional value of diets. Bacillus phytase is very suitable for use in animal feed because of its optimum pH and excellent thermal stability. The present study aims to perform an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens to explore their physico-chemical properties using various bio-computational tools. All the proteins are acidic and thermostable and can be used as suitable candidates in the feed industry. PMID:26672917
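
    The record names the bio-computational tools only generically; one widely used option for the physico-chemical part (molecular weight, pI, stability indices) is Biopython's ProtParam module, sketched here on a placeholder sequence:

    ```python
    # Minimal sketch of in silico physico-chemical characterization with
    # Biopython's ProtParam; the sequence below is a placeholder fragment,
    # not a real Bacillus phytase.
    from Bio.SeqUtils.ProtParam import ProteinAnalysis

    seq = "MKKLLLASAVAGLVLAGCSNDKQ"  # hypothetical fragment
    prot = ProteinAnalysis(seq)
    print("MW:", prot.molecular_weight())
    print("pI:", prot.isoelectric_point())           # acidic if < 7
    print("Instability:", prot.instability_index())  # < 40 suggests stable
    print("Aromaticity:", prot.aromaticity())
    ```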

  20. Mineralogy and Petrology of EK-459-5-1, A Type B1 CAI from Allende

    Science.gov (United States)

    Jeffcoat, C. R.; Kerekgyarto, A. G.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2015-01-01

    Calcium-aluminum-rich inclusions (CAIs) are a type of coarse-grained clast composed of Ca-, Al-, and Mg-rich silicates and oxides found in chondrite meteorites. Type B CAIs are found exclusively in the CV chondrites and are the most thoroughly studied type of inclusion in chondritic meteorites. Type B1 CAIs are distinguished by a nearly monomineralic rim of melilite that surrounds an interior predominantly composed of melilite, fassaite (Ti- and Al-rich clinopyroxene), anorthite, and spinel, with varying amounts of other minor primary and secondary phases. The formation of Type B CAIs has received considerable attention in the course of CAI research, yet quantitative models, experimental results, and observations from Type B inclusions remain largely in disagreement. Recent experimental results and quantitative models have shown that the formation of B1 mantles could have occurred by the evaporative loss of Si and Mg during the crystallization of these objects. However, comparative studies suggest that the lower bulk SiO2 compositions of B1s result in more extensive melilite crystallization before the onset of fassaite and anorthite crystallization, leading to the formation of thick melilite-rich rims in B1 inclusions. Detailed petrographic and cosmochemical studies of these inclusions will further our understanding of these complex objects.

  1. Incremental ALARA cost/benefit computer analysis

    International Nuclear Information System (INIS)

    Commonwealth Edison Company has developed and is testing an enhanced Fortran computer program to be used for cost/benefit analysis of radiation reduction projects at its six nuclear power facilities and corporate technical support groups. This paper describes a macro-driven IBM mainframe program comprising two different types of analyses: an abbreviated program with fixed costs and base values, and an extended engineering version for a detailed, more thorough, and time-consuming approach. The extended engineering version breaks radiation exposure costs down into two components, health-related costs and replacement labor costs. According to user input, the program automatically adjusts these two cost components and applies the derivation to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment cost. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include the flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations
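
    The Commonwealth Edison program itself is not reproduced in the record; at its core, any such comparison reduces to a net-benefit figure per dose-reduction option, as in this toy sketch with assumed monetization parameters:

    ```python
    # Toy ALARA cost/benefit comparison (assumed parameters, not the
    # Commonwealth Edison program): net benefit = monetized dose averted
    # minus total project cost.
    def net_benefit(person_rem_averted, dollars_per_person_rem,
                    project_cost, replacement_labor_cost=0.0):
        """Positive result favors implementing the dose-reduction project."""
        health_related = person_rem_averted * dollars_per_person_rem
        return health_related - (project_cost + replacement_labor_cost)

    options = {"shielding": net_benefit(12.0, 5000.0, 40000.0),
               "flush":     net_benefit(8.0, 5000.0, 15000.0, 2000.0)}
    print(max(options, key=options.get))  # option with the largest net benefit
    ```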

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
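
    The authors' simulation framework is not shown in the record; the skeleton of such a discrete event simulation, with Poisson request arrivals contending for a fixed pool of virtual servers, might look like this (all parameters are invented):

    ```python
    # Bare-bones discrete event simulation of service requests contending
    # for a fixed resource pool (invented parameters, not the authors' model).
    import heapq, random

    random.seed(1)
    SERVERS, t_end = 4, 1000.0
    busy = served = dropped = 0
    events = [(random.expovariate(0.5), "arrival")]  # (time, event kind)
    while events:
        t, kind = heapq.heappop(events)
        if t > t_end:
            break
        if kind == "arrival":
            # schedule the next arrival (Poisson process, rate 0.5 per unit time)
            heapq.heappush(events, (t + random.expovariate(0.5), "arrival"))
            if busy < SERVERS:
                busy += 1  # service time ~ Exp(mean 4 time units)
                heapq.heappush(events, (t + random.expovariate(0.25), "departure"))
            else:
                dropped += 1  # toy model: no waiting queue
        else:
            busy -= 1
            served += 1
    print(f"served={served} dropped={dropped}")
    ```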

  3. Computer Assisted Laboratory Instructions: Learning Outcomes Analysis

    OpenAIRE

    Abdulrasool, Salah Mahdi; Mishra, Rakesh

    2006-01-01

    For this study, students in the mechanical engineering subject area were exposed to computer assisted instructions to satisfy the following learning outcomes in a computer aided design/computer aided manufacturing module: (i) creation of drawings and designs using computer aided design; (ii) using the data exchange format (DXF) to create a numerical control file; (iii) final setup check of the computerised numerical control (CNC) machine; (iv) final manufacturing of the product using CNC; (v) quality evaluation. The t…

  4. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Science.gov (United States)

    Tomas, Ignacio

    This dissertation presents some developments in the numerical analysis of partial differential equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the micropolar model proposed by R.E. Rosensweig. The micropolar Navier-Stokes equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point for this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable, and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a…

  5. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  6. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements in big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and their applications in handling real-life problems. The applications are mostly drawn from real-life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques, as well as some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  7. Computational Analysis of Pharmacokinetic Behavior of Ampicillin

    Directory of Open Access Journals (Sweden)

    Mária Ďurišová

    2016-07-01

    The objective of this study was to perform a computational analysis of the pharmacokinetic behavior of ampicillin, using data from the literature. A method based on the theory of dynamic systems was used for modeling purposes. The method has been introduced to pharmacokinetics with the aim of contributing to the knowledge base in pharmacokinetics by providing a modeling method which enables researchers to develop mathematical models of various pharmacokinetic processes in an identical way, using identical model structures. A few examples of successful use of the modeling method considered here can be found in full-text articles available free of charge at the website of the author, and in the example given in this study. The modeling method employed in this study can be used to develop a mathematical model of the pharmacokinetic behavior of any drug, under the condition that the pharmacokinetic behavior of the drug under study can be at least partially approximated using linear models.
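
    The dynamic-systems method is only described at a high level here; for orientation, the simplest of the linear models mentioned at the end of the abstract, a one-compartment IV-bolus model, can be integrated as below (rate constant, volume, and dose are illustrative, not fitted ampicillin values):

    ```python
    # Illustrative linear one-compartment pharmacokinetic model,
    # dC/dt = -k_e * C after an IV bolus; k_e, V, and dose are made up.
    import numpy as np
    from scipy.integrate import odeint

    k_e, V, dose = 0.6, 15.0, 500.0        # 1/h, L, mg (illustrative)
    C0 = dose / V                          # initial plasma concentration, mg/L
    t = np.linspace(0.0, 8.0, 33)          # hours
    C = odeint(lambda c, t: -k_e * c, C0, t)[:, 0]
    print(f"C(4 h) = {C[16]:.2f} mg/L")    # matches C0 * exp(-k_e * 4)
    ```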

  8. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    A prominent feature of Parkinson's disease (PD) is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment of PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.

  9. [Computational genome analysis of three marine algoviruses].

    Science.gov (United States)

    Stepanova, O A; Boĭko, A L; Shcherbatenko, I S

    2013-01-01

    Computational analysis of the genomic sequences of three new marine algoviruses, Tetraselmis viridis virus (strains TvV-S20 and TvV-SI1) and Dunaliella viridis virus (strain DvV-SI2), was conducted. Both considerable similarity and essential distinctions between the studied strains and the most studied marine algoviruses of the Phycodnaviridae family were revealed. Our data show that the tested strains are new viruses with the following features: they alone were isolated from the marine eukaryotic microalgae T. viridis and D. viridis; the coding sequences (CDSs) of their genomes are localized mainly on one of the DNA strands and form several clusters with short intergenic spaces; there are considerable variations in genome structure among the viruses and their strains; the viral genomic DNA has a high GC content (55.5-67.4%); their genes contain no well-known optimal contexts of translation start codons, nor contexts for terminal codon read-through; and the vast majority of viral genes and proteins have no matches in gene banks. PMID:24479317
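
    Of the genome features reported, GC content is the easiest to reproduce; a minimal calculation looks like this (the sequence is a stand-in, not one of the TvV or DvV genomes):

    ```python
    # Minimal GC-content calculation, as used to report values such as
    # the 55.5-67.4% range above; the sequence here is a stand-in.
    def gc_content(seq: str) -> float:
        seq = seq.upper()
        return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

    print(f"{gc_content('ATGCGCGGCCTA'):.1f}% GC")
    ```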

  10. The CMS computing, software and analysis challenge

    International Nuclear Information System (INIS)

    The CMS experiment has performed a comprehensive challenge during May 2008 to test the full scope of offline data handling and analysis activities needed for data taking during the first few weeks of LHC collider operations. It constitutes the first full-scale challenge with large statistics under the conditions expected at the start-up of the LHC, including the expected initial mis-alignments and mis-calibrations for each sub-detector, and event signatures and rates typical for low instantaneous luminosity. Particular emphasis has been given to the prompt reconstruction workflows, and to the procedures for the alignment and calibration of each sub-detector. The latter were performed with restricted latency using the same computing infrastructure that will be used for real data, and the resulting calibration and alignment constants were used to re-reconstruct the data at Tier-1 centres. The paper addresses the goals and practical experience from the challenge, as well as the lessons learned in view of LHC data taking.

  11. The Use of Modular Computer-Based Lessons in a Modification of the Classical Introductory Course in Organic Chemistry.

    Science.gov (United States)

    Stotter, Philip L.; Culp, George H.

    An experimental course in organic chemistry utilized computer-assisted instructional (CAI) techniques. The CAI lessons provided tutorial drill and practice and simulated experiments and reactions. The Conversational Language for Instruction and Computing was used, along with a CDC 6400-6600 system; students scheduled and completed the lessons at…

  12. Gender Role, Gender Identity and Sexual Orientation in CAIS ("XY-Women") Compared With Subfertile and Infertile 46,XX Women.

    Science.gov (United States)

    Brunner, Franziska; Fliegner, Maike; Krupp, Kerstin; Rall, Katharina; Brucker, Sara; Richter-Appelt, Hertha

    2016-01-01

    The perception of gender development of individuals with complete androgen insensitivity syndrome (CAIS) as unambiguously female has recently been challenged in both qualitative data and case reports of male gender identity. The aim of the mixed-method study presented was to examine the self-perception of CAIS individuals regarding different aspects of gender and to identify commonalities and differences in comparison with subfertile and infertile XX-chromosomal women with diagnoses of Mayer-Rokitansky-Küster-Hauser syndrome (MRKHS) and polycystic ovary syndrome (PCOS). The study sample comprised 11 participants with CAIS, 49 with MRKHS, and 55 with PCOS. Gender identity was assessed by means of a multidimensional instrument, which showed significant differences between the CAIS group and the XX-chromosomal women. Other-than-female gender roles and neither-female-nor-male sexes/genders were reported only by individuals with CAIS. The percentage with a not exclusively androphile sexual orientation was unexceptionally high in the CAIS group compared to the prevalence in "normative" women and the clinical groups. The findings support the assumption made by Meyer-Bahlburg ( 2010 ) that gender outcome in people with CAIS is more variable than generally stated. Parents and professionals should thus be open to courses of gender development other than typically female in individuals with CAIS. PMID:26133743

  13. Stable Magnesium Isotope Variation in Melilite Mantle of Allende Type B1 CAI EK 459-5-1

    Science.gov (United States)

    Kerekgyarto, A. G.; Jeffcoat, C. R.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2014-01-01

    Ca-Al-rich inclusions (CAIs) are the earliest formed crystalline material in our solar system and they record early Solar System processes. Here we present petrographic and delta Mg-25 data of melilite mantles in a Type B1 CAI that records early solar nebular processes.

  14. Simplified computer codes for cask impact analysis

    International Nuclear Information System (INIS)

    In regard to the evaluation of cask acceleration and deformation, simplified computer codes make analyses economical and reduce input preparation and calculation time. The results obtained with the simplified codes are sufficiently accurate for practical use. (J.P.N.)

  15. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  16. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous as they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. Like only a few other researchers in the field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On the occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell…

  17. Transonic wing analysis using advanced computational methods

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  18. Introduction to numerical analysis and scientific computing

    CERN Document Server

    Nassif, Nabil

    2013-01-01

    Contents include: Computer Number Systems and Floating Point Arithmetic (introduction; conversion from base 10 to base 2; conversion from base 2 to base 10; normalized floating point systems; floating point operations; computing in a floating point system); Finding Roots of Real Single-Valued Functions (introduction; how to locate the roots of a function; the bisection method; Newton's method; the secant method); Solving Systems of Linear Equations by Gaussian Elimination (mathematical preliminaries; computer storage for matrices and data structures; back substitution for upper triangular systems; Gauss reduction; LU decomposition); Polynomia…
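
    As a small worked instance of one of the listed root-finding topics, the bisection method repeatedly halves a sign-change interval until the root is pinned down to a tolerance:

    ```python
    # Bisection method, one of the root-finding topics listed above.
    def bisect(f, a, b, tol=1e-10):
        """Find a root of f in [a, b], assuming f(a) and f(b) differ in sign."""
        fa = f(a)
        while b - a > tol:
            m = 0.5 * (a + b)
            if fa * f(m) <= 0:
                b = m
            else:
                a, fa = m, f(m)
        return 0.5 * (a + b)

    print(bisect(lambda x: x * x - 2.0, 0.0, 2.0))  # ~1.41421356 (sqrt 2)
    ```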

  19. Performance Analysis of Various New Technologies for Computing

    OpenAIRE

    M Nagaraju; Anitha, B

    2012-01-01

    The discipline of computing is the systematic study of algorithmic processes that describe and transform information, along with their theory, analysis, design, efficiency, implementation, and application. Application software, also known as an "application" or an "app", is computer software designed to help the user perform specific tasks. Recent interest and demand in computing have made new technologies emerge, of which cloud computing is one. Cloud Computing has become another buzzword a…

  20. A Comparative Analysis of Computer Literacy Education for Nurses

    OpenAIRE

    Hardin, Richard C.; Skiba, Diane J.

    1982-01-01

    Despite recent advances by nursing in the computer field, computer literacy is a rarity among nursing professionals. Our analysis of existing educational models in nursing (baccalaureate, staff development, continuing education, and vendor) shows that no single educational strategy is likely to be effective in achieving computer literacy for all nurses. A refinement of the computer literacy concept is proposed which divides the educational needs of nurses into specific objectives based on desi…

  1. Modern Computational Techniques for the HMMER Sequence Analysis

    OpenAIRE

    Xiandong Meng; Yanqing Ji

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of the sequence analysis, such as data and c…
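
    The record compares platforms rather than showing code; the kernel being accelerated in HMM sequence analysis is the forward/Viterbi recursion. A dense toy forward pass, with an invented two-state model, looks like this:

    ```python
    # Toy dense HMM forward algorithm, the kind of recursion HMMER-style
    # tools accelerate; all model parameters are invented.
    import numpy as np

    A = np.array([[0.9, 0.1], [0.2, 0.8]])  # state transition matrix
    B = np.array([[0.7, 0.3], [0.1, 0.9]])  # emission probs (state x symbol)
    pi = np.array([0.5, 0.5])               # initial state distribution
    obs = [0, 1, 1, 0]                      # observed symbol indices

    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]       # forward recursion step
    print("P(observations) =", alpha.sum())
    ```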

  2. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    OpenAIRE

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verificat...

  3. ANACROM - A computer code for chromatogram analysis

    International Nuclear Information System (INIS)

    The computer code was developed for the automatic search for peaks and the evaluation of chromatogram parameters such as center, height, area, full width at half maximum (FWHM), and the ratio FWHM/center for each peak. (Author)
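
    The ANACROM code itself is not included in the record; the parameters it reports can, for instance, be recovered by fitting a Gaussian to each detected peak (the data below are synthetic):

    ```python
    # Estimating chromatogram peak parameters (center, height, area, FWHM)
    # from a Gaussian fit; synthetic data stand in for a real chromatogram.
    import numpy as np
    from scipy.optimize import curve_fit

    def gauss(x, h, c, s):
        return h * np.exp(-0.5 * ((x - c) / s) ** 2)

    x = np.linspace(0, 10, 200)
    y = gauss(x, 5.0, 4.2, 0.3) + np.random.default_rng(0).normal(0, 0.05, x.size)

    (h, c, s), _ = curve_fit(gauss, x, y, p0=[4.0, 4.0, 0.5])
    fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * s
    area = h * s * np.sqrt(2.0 * np.pi)
    print(f"center={c:.2f} height={h:.2f} area={area:.2f} "
          f"FWHM={fwhm:.2f} FWHM/center={fwhm / c:.3f}")
    ```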

  4. Behavior computing modeling, analysis, mining and decision

    CERN Document Server

    2012-01-01

    Includes six case studies on behavior applications; presents new techniques for capturing behavior characteristics in social media; the first dedicated source of references for the theory and applications of behavior informatics and behavior computing

  5. Towed Water Turbine Computational Fluid Dynamics Analysis

    OpenAIRE

    Maughan, Robert G.

    2013-01-01

    Computational fluid dynamics can be used to predict operating conditions of towed water turbines which are used in long distance sailing applications to meet electrical demands. The design consists of a turbine fastened to a shaft which is attached to a generator by a rope. The turbine is pulled in water behind a sailboat and torque is transmitted through the rope to turn the onboard generator and produce power. Torque curves from an alternator, generator, and from computational fluid dynamic...

  6. Schottky signal analysis: tune and chromaticity computation

    CERN Document Server

    Chanon, Ondine

    2016-01-01

    Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, after explaining the different components of a Schottky signal, an algorithm to compute the betatron tune is presented, followed by some ideas to compute machine chromaticity. The tests have been performed with offline and/or online LHC data.
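
    The report's algorithm is not reproduced in the record; at its simplest, the fractional betatron tune appears as the dominant line in the spectrum of transverse beam data, which can be located as follows (the signal is synthetic, not LHC Schottky data):

    ```python
    # Locating a fractional betatron tune as the dominant spectral line in
    # turn-by-turn data; the signal here is synthetic.
    import numpy as np

    n, q = 4096, 0.31                  # number of turns, true fractional tune
    turns = np.arange(n)
    rng = np.random.default_rng(1)
    signal = np.cos(2 * np.pi * q * turns) + 0.5 * rng.normal(size=n)

    spectrum = np.abs(np.fft.rfft(signal * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0)  # in units of 1/turn
    print("estimated tune:", freqs[spectrum[1:].argmax() + 1])  # skip DC bin
    ```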

  7. Current status of uncertainty analysis methods for computer models

    International Nuclear Information System (INIS)

    This report surveys several existing uncertainty analysis methods for estimating computer output uncertainty caused by input uncertainties, illustrating application examples of those methods to three computer models, MARCH/CORRAL II, TERFOC and SPARC. Merits and limitations of the methods are assessed in the application, and recommendation for selecting uncertainty analysis methods is provided. (author)

  8. Granular computing analysis and design of intelligent systems

    CERN Document Server

    Pedrycz, Witold

    2013-01-01

    Information granules, as encountered in natural language, are implicit in nature. To make them fully operational so they can be effectively used to analyze and design intelligent systems, information granules need to be made explicit. An emerging discipline, granular computing focuses on formalizing information granules and unifying them to create a coherent methodological and developmental environment for intelligent system design and analysis. Granular Computing: Analysis and Design of Intelligent Systems presents the unified principles of granular computing along with its comprehensive algo

  9. A Comparison of Computer-Assisted Instruction and Tutorials in Hematology and Oncology.

    Science.gov (United States)

    Garrett, T. J.; And Others

    1987-01-01

    A study comparing the effectiveness of computer-assisted instruction (CAI) and small group instruction found no significant difference in medical student achievement in oncology but higher achievement through small-group instruction in hematology. Students did not view CAI as more effective, but saw it as a supplement to traditional methods. (MSE)

  10. The Effectiveness of Computer-Assisted Instruction in Teaching Introductory Statistics

    Science.gov (United States)

    Basturk, Ramazan

    2005-01-01

    The focus of this study is to demonstrate and discuss the educational advantages of Computer Assisted Instruction (CAI). A quasi-experimental design compared learning outcomes of participants in an introductory statistics course that integrated CAI to participants in a Lecture-only introductory statistics course. Reviews of participants' identical…

  11. Critical Thinking Outcomes of Computer-Assisted Instruction versus Written Nursing Process.

    Science.gov (United States)

    Saucier, Bonnie L.; Stevens, Kathleen R.; Williams, Gail B.

    2000-01-01

    Nursing students (n=43) who used clinical case studies via computer-assisted instruction (CAI) were compared with 37 who used the written nursing process (WNP). California Critical Thinking Skills Test results did not show significant increases in critical thinking. The WNP method was more time consuming; the CAI group was more satisfied. Use of…

  12. The Effects of Trait Anxiety and Dogmatism on State Anxiety During Computer-Assisted Learning.

    Science.gov (United States)

    Rappaport, Edward

    In this study of the interaction between anxiety trait (A-trait), anxiety state (A-state), and dogmatism in computer-assisted instruction (CAI), subjects were selected on the basis of extreme scores on a measure of anxiety and on a measure of dogmatism. The subjects were presented with a CAI task consisting of difficult mathematical problems. The…

  13. Secondary School Students' Attitudes towards Mathematics Computer--Assisted Instruction Environment in Kenya

    Science.gov (United States)

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.

    2012-01-01

    This paper reports the results of research conducted in six classes (Form IV) with 205 students with a sample of 94 respondents. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) effectiveness of CAI in Mathematics instruction. The results indicated…

  14. The Evolution of Instructional Design Principles for Intelligent Computer-Assisted Instruction.

    Science.gov (United States)

    Dede, Christopher; Swigger, Kathleen

    1988-01-01

    Discusses and compares the design and development of computer assisted instruction (CAI) and intelligent computer assisted instruction (ICAI). Topics discussed include instructional systems design (ISD), artificial intelligence, authoring languages, intelligent tutoring systems (ITS), qualitative models, and emerging issues in instructional…

  15. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    The development of a general purpose computer system, THESEUS, is described; its initial use has been magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and the problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  16. Analysis of airways in computed tomography

    DEFF Research Database (Denmark)

    Petersen, Jens

    Chronic Obstructive Pulmonary Disease (COPD) is a major cause of death and disability world-wide. It affects lung function through destruction of lung tissue, known as emphysema, and inflammation of airways, leading to thickened airway walls and narrowed airway lumen. Computed Tomography (CT) imaging…

  17. Practical Research on CAI in Earth Science Education (II): Teaching Examples in Junior and Senior High Schools

    OpenAIRE

    Hayashi, Takehiro; Tanaka, Masaki; Arita, Masashi; Suzuki, Morihisa

    1993-01-01

    Regarding CAI in earth science education, the authors hold the principle that students' interest in natural materials and phenomena must be promoted by the effective use of the computer. Based upon this principle, BASIC programs for drawing three-dimensional topographic maps of western Hiroshima Prefecture were developed. Using the programs, earth science instruction was given in junior and senior high schools. Through the instruction, the students became more interested in the topography o…

  18. Cloud Computing for Rigorous Coupled-Wave Analysis

    Directory of Open Access Journals (Sweden)

    N. L. Kazanskiy

    2012-01-01

    Design and analysis of complex nanophotonic and nanoelectronic structures require significant computing resources. Cloud computing infrastructure allows distributed parallel applications to achieve greater scalability and fault tolerance. The problems of effective use of high-performance computing systems for modeling and simulation of subwavelength diffraction gratings are considered. Rigorous coupled-wave analysis (RCWA) is adapted to the cloud computing environment. In order to accomplish this, the data flow of the RCWA is analyzed and CPU-intensive operations are converted to data-intensive operations. The generated data sets are structured in accordance with the requirements of MapReduce technology.

  19. Wing analysis using a transonic potential flow computational method

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    The ability of the method to compute wing transonic performance was determined by comparing computed results with both experimental data and results computed by other theoretical procedures. Both pressure distributions and aerodynamic forces were evaluated. Comparisons indicated that the method is a significant improvement in transonic wing analysis capability. In particular, the computational method generally calculated the correct development of three-dimensional pressure distributions from subcritical to transonic conditions. Complicated, multiple shocked flows observed experimentally were reproduced computationally. The ability to identify the effects of design modifications was demonstrated both in terms of pressure distributions and shock drag characteristics.

  20. Computational thermo-fluid analysis of a disk brake

    Science.gov (United States)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kuraishi, Takashi; Tabata, Shinichiro; Takagi, Hirokazu

    2016-06-01

    We present computational thermo-fluid analysis of a disk brake, including thermo-fluid analysis of the flow around the brake and heat conduction analysis of the disk. The computational challenges include proper representation of the small-scale thermo-fluid behavior, high-resolution representation of the thermo-fluid boundary layers near the spinning solid surfaces, and bringing the heat transfer coefficient (HTC) calculated in the thermo-fluid analysis of the flow to the heat conduction analysis of the spinning disk. The disk brake model used in the analysis closely represents the actual configuration, and this adds to the computational challenges. The components of the method we have developed for computational analysis of this class of problems include the Space-Time Variational Multiscale method for coupled incompressible flow and thermal transport, the ST Slip Interface method for high-resolution representation of the thermo-fluid boundary layers near spinning solid surfaces, and a set of projection methods for bringing the HTC calculated in the thermo-fluid analysis to different parts of the disk. With the HTC coming from the thermo-fluid analysis of the flow around the brake, we do the heat conduction analysis of the disk, from the start of braking until the disk stops spinning, demonstrating how the method developed works in computational analysis of this complex and challenging problem.

  1. Computational analysis of ozonation in bubble columns

    International Nuclear Information System (INIS)

    This paper presents a new computational ozonation model, based on the principles of computational fluid dynamics along with the kinetics of ozone decay and microbial inactivation, to predict the performance of ozone disinfection in fine bubble columns. The model can be represented using a mixture two-phase flow model to simulate the hydrodynamics of the water flow and two transport equations to track the concentration profiles of ozone and microorganisms along the height of the column, respectively. The applicability of this model was then demonstrated by comparing the simulated ozone concentrations with experimental measurements obtained from a pilot scale fine bubble column. One distinct advantage of this approach is that it does not require prerequisite assumptions such as plug flow conditions, perfect mixing, tanks-in-series, or uniform radial or longitudinal dispersion, and it can predict the performance of disinfection contactors without expensive and tedious tracer studies. (author)
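
    The paper's CFD model is far too large for a snippet; purely for orientation, the two transport equations collapse in a deliberately crude steady one-dimensional caricature (with invented decay and inactivation constants) to the coupled ODEs below. Note that this is exactly the kind of simplification the full model avoids relying on:

    ```python
    # Deliberately simplified 1-D steady-state tracking of ozone and
    # microorganism concentrations along column height (illustrative
    # constants; the paper's CFD model is far more detailed).
    import numpy as np
    from scipy.integrate import odeint

    k_d, k_i, u = 0.05, 1.2, 0.1  # ozone decay 1/s, inactivation L/(mg*s), m/s

    def rhs(y, z):
        c_o3, n = y               # ozone mg/L, organisms/L
        return [-k_d * c_o3 / u, -k_i * c_o3 * n / u]

    z = np.linspace(0.0, 3.0, 31)  # column height, m
    c0 = [2.0, 1.0e5]
    sol = odeint(rhs, c0, z)
    print(f"log inactivation: {np.log10(c0[1] / sol[-1, 1]):.2f}")
    ```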

  2. Advances in Computer-Based Autoantibodies Analysis

    Science.gov (United States)

    Soda, Paolo; Iannello, Giulio

    Indirect Immunofluorescence (IIF) imaging is the recommended method to detect autoantibodies in patient serum, whose common markers are antinuclear autoantibodies (ANA) and autoantibodies directed against double strand DNA (anti-dsDNA). Since the availability of accurately performed and correctly reported laboratory determinations is crucial for clinicians, an evident medical demand is the development of Computer Aided Diagnosis (CAD) tools supporting physicians' decisions.

  3. A computer program for PV systems analysis

    International Nuclear Information System (INIS)

    A computer program for analyzing solar cells and photovoltaic (PV) systems is described. The program, called PVC, was written in Visual Basic for Windows and is intended as a tool for studying individual cell characteristics as well as the PV system as a whole. This paper describes the mathematical models used in the program, gives an overview of the program, and presents its application in analyzing a BP275 PV system. (author)
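
    The record does not list PVC's equations; a common starting point for studying individual cell characteristics is the single-diode I-V relation, sketched here with illustrative parameters:

    ```python
    # Single-diode solar cell I-V characteristic, a standard model for
    # individual cell analysis (parameter values are illustrative).
    import numpy as np

    I_ph, I_0, n, V_t = 3.0, 1e-9, 1.3, 0.02585  # A, A, ideality, thermal volts

    def cell_current(v):
        return I_ph - I_0 * (np.exp(v / (n * V_t)) - 1.0)

    for vi in np.linspace(0.0, 0.6, 7):
        print(f"V={vi:.2f} V  I={cell_current(vi):6.3f} A")
    ```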

  4. Computational Music Structure Analysis (Dagstuhl Seminar 16092)

    OpenAIRE

    Müller, Meinard; Chew, Elaine; Bello, Juan Pablo

    2016-01-01

    Music is a ubiquitous and vital part of the lives of billions of people worldwide. Musical creations and performances are among the most complex and intricate of our cultural artifacts, and the emotional power of music can touch us in surprising and profound ways. In view of the rapid and sustained growth of digital music sharing and distribution, the development of computational methods to help users find and organize music information has become an important field of research in both indust...

  5. Analysis of computed tomography of ovarian tumor

    Energy Technology Data Exchange (ETDEWEB)

    Omura, Makoto; Taniike, Keiko; Nishiguchi, Hiroyasu

    1987-07-01

    One hundred and twenty-six patients with ovarian masses were studied with computed tomography (CT) and classified into five groups according to the margin and inner structure of the mass. The incidence of malignancy was low for cystic ovarian masses with smooth margins and high for solid ovarian masses with irregular margins. Three cases (6.7%) of malignant ovarian tumor demonstrated a completely cystic pattern. Ovarian teratomas contained a well-defined component of fat density.

  6. Computer-aided Analysis of Physiological Systems

    OpenAIRE

    Balázs Benyó

    2007-01-01

    This paper presents the recent biomedical engineering research activity of the Medical Informatics Laboratory at the Budapest University of Technology and Economics. The research projects are carried out in the fields as follows: Computer aided identification of physiological systems; Diabetic management and blood glucose control; Remote patient monitoring and diagnostic system; Automated system for analyzing cardiac ultrasound images; Single-channel hybrid ECG segmentation; Event recognition and …

  7. Affect and Learning: a computational analysis

    OpenAIRE

    Broekens, Douwe Joost

    2007-01-01

    In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation to an artificial agent in a reinforcement learning setting. We have done a range of different experiments to study the effect of affect on learning, including the effect on learning if affect is us...

  8. Parameterized complexity analysis in computational biology.

    Science.gov (United States)

    Bodlaender, H L; Downey, R G; Fellows, M R; Hallett, M T; Wareham, H T

    1995-02-01

    Many computational problems in biology involve parameters for which a small range of values cover important applications. We argue that for many problems in this setting, parameterized computational complexity rather than NP-completeness is the appropriate tool for studying apparent intractability. At issue in the theory of parameterized complexity is whether a problem can be solved in time O(n^α) for each fixed parameter value, where α is a constant independent of the parameter. In addition to surveying this complexity framework, we describe a new result for the Longest Common Subsequence problem. In particular, we show that the problem is hard for W[t] for all t when parameterized by the number of strings and the size of the alphabet. Lower bounds on the complexity of this basic combinatorial problem imply lower bounds on more general sequence alignment and consensus discovery problems. We also describe a number of open problems pertaining to the parameterized complexity of problems in computational biology where small parameter values are important. PMID:7796275
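
    As a concrete reference point for the problem discussed, the two-string special case of Longest Common Subsequence is solvable by the classic dynamic program below; the hardness result above indicates that, under standard complexity assumptions, no analogous algorithm with an exponent independent of the number of strings is expected for the general problem.

    ```python
    # Classic two-string LCS dynamic program, O(len(a) * len(b)) time.
    def lcs_length(a: str, b: str) -> int:
        dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
        for i, ca in enumerate(a, 1):
            for j, cb in enumerate(b, 1):
                dp[i][j] = dp[i-1][j-1] + 1 if ca == cb else max(dp[i-1][j], dp[i][j-1])
        return dp[len(a)][len(b)]

    print(lcs_length("AGGTAB", "GXTXAYB"))  # -> 4 ("GTAB")
    ```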

  9. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  10. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
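
    The correlation-based detection idea in the closing sentences can be illustrated with nothing beyond a correlation coefficient matrix: score a traffic window by how far its matrix drifts from a baseline. A sketch with invented feature data:

    ```python
    # Sketch of correlation-matrix-based anomaly scoring for network
    # traffic windows (feature data are invented).
    import numpy as np

    rng = np.random.default_rng(0)
    baseline = rng.normal(size=(200, 4))  # 4 traffic features per sample
    window = rng.normal(size=(200, 4))
    window[:, 1] = window[:, 0] * 5 + rng.normal(size=200)  # injected correlation

    score = np.abs(np.corrcoef(window.T) - np.corrcoef(baseline.T)).sum()
    print(f"anomaly score: {score:.2f}")  # larger -> more structural change
    ```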

  11. Computational Neural Networks: A New Paradigm for Spatial Analysis

    OpenAIRE

    Fischer, M.M.

    1996-01-01

    In this paper a systematic introduction to computational neural network models is given in order to help spatial analysts learn about this exciting new field. The power of computational neural networks vis-à-vis conventional modelling is illustrated for an application field with noisy data of limited record length: spatial interaction modelling of telecommunication data in Austria. The computational appeal of neural networks for solving some fundamental spatial analysis problems is summarized…

  12. Alan Turing and the foundations of computable analysis

    OpenAIRE

    Gherardi, Guido

    2011-01-01

    We investigate Turing's contributions to computability theory for real numbers and real functions presented in [22, 24, 26]. In particular, it is shown how two fundamental approaches to computable analysis, the so-called 'Type-2 Theory of Effectivity' (TTE) and the 'real RAM machine' model, have their foundations in Turing's work, in spite of the two incompatible notions of computability they involve. It is also shown, by contrast, how the modern conceptual tools provided by the…

  13. Computational morphology a computational geometric approach to the analysis of form

    CERN Document Server

    Toussaint, GT

    1988-01-01

    Computational Geometry is a new discipline of computer science that deals with the design and analysis of algorithms for solving geometric problems. There are many areas of study in different disciplines which, while being of a geometric nature, have as their main component the extraction of a description of the shape or form of the input data. This notion is more imprecise and subjective than pure geometry. Such fields include cluster analysis in statistics, computer vision and pattern recognition, and the measurement of form and form-change in such areas as stereology and developmental biolo

  14. The Reliability of Content Analysis of Computer Conference Communication

    Science.gov (United States)

    Rattleff, Pernille

    2007-01-01

    The focus of this article is the reliability of content analysis of students' computer conference communication. Content analysis is often used when researching the relationship between learning and the use of information and communications technology in educational settings. A number of studies where content analysis is used and classification…

  15. The Effect of Computer Assisted Instruction on Elementary Reading and Writing Achievement

    Directory of Open Access Journals (Sweden)

    H. Gülhan ORHAN KARSAK

    2014-01-01

    The research investigated the effect of computer assisted instruction (CAI) on elementary reading and writing achievement (ERWA). The sample consisted of 64 first graders (32 in the experimental group and 32 in the control group) in the 2006-2007 academic year. This quasi-experimental study had a posttest-only control group design and was conducted during the first semester. The experimental group was taught by CAI and the control group was taught by traditional instruction. Data were gathered through a ‘Parent Questionnaire’, ‘Reading Concepts Scale’, ‘Achievement Test’, and ‘Reading and Handwriting Observation Form’ and analyzed by chi-square, frequency and t tests using SPSS 12.0. The main findings of the study were as follows: (1) CAI affected first graders' handwriting, reading fluency and punctuation; (2) CAI did not affect their writing and reading comprehension; (3) CAI affected the ERWA of those who did not have a computer at home.

  16. Analysis of service-oriented computing systems

    OpenAIRE

    Ivanovic, Dragan

    2013-01-01

    Service-Oriented Computing (SOC) established itself as a widely accepted paradigm for the development of flexible, distributed, and adaptable software systems, in which service compositions perform the more complex or higher-level tasks, frequently inter-organizational tasks, using atomic services or other service compositions. In such systems, the Quality of Service (QoS) properties, co…

  17. Computer-aided Analysis of Physiological Systems

    Directory of Open Access Journals (Sweden)

    Balázs Benyó

    2007-12-01

    This paper presents the recent biomedical engineering research activity of the Medical Informatics Laboratory at the Budapest University of Technology and Economics. The research projects are carried out in the fields as follows: Computer aided identification of physiological systems; Diabetic management and blood glucose control; Remote patient monitoring and diagnostic system; Automated system for analyzing cardiac ultrasound images; Single-channel hybrid ECG segmentation; Event recognition and state classification to detect brain ischemia by means of EEG signal processing; Detection of breathing disorders like apnea and hypopnea; Molecular biology studies with DNA-chips; Evaluation of the cry of normal hearing and hard of hearing infants.

  18. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  19. The symbolic computation and automatic analysis of trajectories

    Science.gov (United States)

    Grossman, Robert

    1991-01-01

    Research was generally done on computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using data base computing.

  20. Atomic physics: computer calculations and theoretical analysis

    OpenAIRE

    Drukarev, E. G.

    2004-01-01

    It is demonstrated how theoretical analysis preceding the numerical calculations helps to calculate the ground-state energy of the helium atom and makes it possible to avoid qualitative errors in calculations of the characteristics of double photoionization.

  1. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  2. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    International Nuclear Information System (INIS)

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project

  3. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  4. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K.; Riha, David S.; Thacker, Ben H.

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  5. Computer System Analysis for Decommissioning Management of Nuclear Reactor

    International Nuclear Information System (INIS)

    Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully. A computer-based system needs to be developed to support nuclear reactor decommissioning. Some computer systems for the management of nuclear power reactors have been studied. The software systems COSMARD and DEXUS, developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, calculation modules for the stages of the decommissioning phases, and spatial data system development for virtual reality. (author)

  6. VLF radio propagation conditions. Computational analysis techniques

    International Nuclear Information System (INIS)

    Very low frequency (VLF) radio waves propagate within the Earth-ionosphere waveguide with very little attenuation. Modifications of the waveguide geometry affect the propagation conditions and, hence, the attenuation. Changes in the ionosphere, such as the presence of the D-region during the day or the precipitation of energetic particles, are the main causes of this modification. Using narrowband receivers monitoring VLF transmitters, the amplitude and phase of these signals are recorded. Multivariate data analysis techniques, namely Principal Component Analysis (PCA) and Singular Spectrum Analysis (SSA), are applied to the data in order to determine parameters, such as seasonal and diurnal changes, affecting the variation of these signals. Transient effects may then be easier to detect.
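
    The record names the techniques but not an implementation; the PCA step can be sketched directly on an amplitude matrix (days by time-of-day) with plain numpy. The data below are synthetic stand-ins for narrowband recordings:

    ```python
    # PCA of a (days x time-of-day) amplitude matrix via SVD, to pull out
    # dominant diurnal/seasonal modes; data are synthetic stand-ins.
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0, 2 * np.pi, 144)  # 10-minute resolution over a day
    days = np.array([np.sin(t + 0.01 * d) + 0.1 * rng.normal(size=t.size)
                     for d in range(60)])  # 60 days of recordings

    X = days - days.mean(axis=0)        # remove the mean daily signal
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    explained = s**2 / (s**2).sum()
    print("variance in first two components:", explained[:2].sum())
    ```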

  7. THE EFFECTIVENESS OF COMPUTER ASSISTED INSTRUCTION IN TEACHING PHYSICAL SCIENCES IN SECONDARY SCHOOL OF THE RURAL AREA OF BURDWAN DISTRICT, WEST BENGAL.

    Directory of Open Access Journals (Sweden)

    Basudeb Roy Chaudhury

    2014-04-01

    This experimental study compared the academic performance of students in class X (ten) of a Bengali-medium school in a rural area of Burdwan District, West Bengal, India, between traditional instruction and computer assisted instruction (CAI) with simultaneous discussion. The design used in this study was a pre-test/post-test control group and experimental group design. Fifty students of class X were selected and two groups were formed; students were assigned to each group randomly. Statistical methods were used in the data analysis. A significant difference was found in the post-test scores of students receiving the traditional method and CAI with simultaneous discussion. This reveals that CAI with simultaneous discussion is more effective than the traditional method.
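
    The record reports significance without the test details; for a two-group post-test comparison such as this, an independent-samples t test on the groups' scores is the standard choice. A sketch with made-up scores:

    ```python
    # Independent-samples t test on made-up post-test scores, the usual
    # analysis for a two-group experimental design like the one above.
    from scipy import stats

    cai_group = [18, 21, 19, 22, 24, 20, 23, 19, 21, 22]
    traditional = [15, 17, 16, 18, 14, 16, 17, 15, 18, 16]
    t, p = stats.ttest_ind(cai_group, traditional)
    print(f"t = {t:.2f}, p = {p:.4f}")  # p < 0.05 -> significant difference
    ```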

  8. Computer modeling for neutron activation analysis methods

    International Nuclear Information System (INIS)

    The INP AS RU develops databases for neutron-activation analysis: ND INAA [1] and ELEMENT [2]. Based on these databases, an automated complex is under construction, aimed at modeling methods for the analysis of natural and technogenic materials. It is well known that there is a variety of analysis objects with wide spectra and different compositions and concentrations of elements, which makes it impossible to develop universal methods applicable to every analytical task. The modeling is based on an algorithm that computes the irradiation time in the nuclear reactor needed to provide the sample's total-absorption and activity analytical peak areas within given errors. The analytical complex was tested for low-element analysis (determination of Fe and Zn in vegetation samples, and of Cu, Ag and Au in technological objects). At present, the complex is applied to multielement analysis of sediment samples. In this work, modern achievements in analytical chemistry (measurement facilities, high-resolution detectors, IAEA and IUPAC databases) and information technology (Java software, database management systems (DBMS), internet technologies) are applied. References: 1. Tillaev T., Umaraliev A., Gurvich L.G., Yuldasheva K., Kadirova J. Specialized database for instrumental neutron activation analysis - ND INAA 1.0. The 3rd Eurasian Conference on Nuclear Science and its Applications, 2004, pp. 270-271. 2. Gurvich L.G., Tillaev T., Umaraliev A. The information-analytical database on the element contents of natural objects. The 4th International Conference on Modern Problems of Nuclear Physics, Samarkand, 2003, p. 337. (authors)

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  10. Limited subsolidus diffusion in type B1 CAI: Evidence from Ti distribution in spinel

    Science.gov (United States)

    Meeker, G. P.; Quick, J. E.; Paque, Julie M.

    1993-01-01

    Most models of calcium aluminum-rich inclusions (CAI) have focused on early stages of formation by equilibrium crystallization of a homogeneous liquid. Less is known about the subsolidus cooling history of CAI. Chemical and isotopic heterogeneities on a scale of tens to hundreds of micrometers (e.g. MacPherson et al. (1989) and Podosek et al. (1991)) suggest fairly rapid cooling with a minimum of subsolidus diffusion. However, transmission electron microscopy indicates that solid state diffusion may have been an important process at a smaller scale (Barber et al. 1984). If so, chemical evidence for diffusion could provide constraints on cooling times and temperatures. With this in mind, we have begun an investigation of the Ti distribution in spinels from two type B1 CAIs from Allende to determine if post-crystallization diffusion was a significant process. The type B1 CAIs 3529Z and 5241 have been described by Podosek et al. (1991) and by El Goresy et al. (1985) and MacPherson et al. (1989). We have analyzed spinels in these inclusions using the electron microprobe. These spinels are generally euhedral, range in size from less than 10 to 15 microns, and are poikilitically enclosed by millimeter-sized pyroxene, melilite, and anorthite. Analyses were obtained from both the mantles and cores of the inclusions. Compositions of pyroxene in the vicinity of individual spinel grains were obtained by analyzing at least two points on opposite sides of the spinel and averaging the compositions. The pyroxene analyses were obtained within 15 microns of the spinel-pyroxene interface. No compositional gradients were observed within single spinel crystals. Ti concentrations in spinels included within pyroxene, melilite, and anorthite are presented.

  11. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science treats conference publications as a primary venue. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss papers with their peers. Previous work on knowledge mapping focused on the map of all sciences, or of a particular domain, based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and workshop proceedings, which results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...

  12. COMPUTER DATA ANALYSIS AND MODELING: COMPLEX STOCHASTIC DATA AND SYSTEMS

    OpenAIRE

    2010-01-01

    This collection of papers includes proceedings of the Ninth International Conference “Computer Data Analysis and Modeling: Complex Stochastic Data and Systems” organized by the Belarusian State University and held in September 2010 in Minsk. The papers are devoted to the topical problems: robust and nonparametric data analysis; statistical analysis of time series and forecasting; multivariate data analysis; design of experiments; statistical signal and image processing...

  13. Computational analysis of thresholds for magnetophosphenes

    International Nuclear Information System (INIS)

    In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes: visual sensations of flashing light not caused by light entering the eye. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way of determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of the computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m⁻² (−20% to +30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (−20% to +10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of

  14. HIV-1 Capsid Assembly Inhibitor (CAI) Peptide: Structural Preferences and Delivery into Human Embryonic Lung Cells and Lymphocytes

    OpenAIRE

    Braun, Klaus; Frank, Martin; Pipkorn, Rüdiger; Reed, Jennifer; Spring, Herbert; Debus, Jürgen; Didinger, Bernd; von der Lieth, Claus-Wilhelm; Wiessler, Manfred; Waldeck, Waldemar

    2008-01-01

    The Human immunodeficiency virus 1 derived capsid assembly inhibitor peptide (HIV-1 CAI-peptide) is a promising lead candidate for anti-HIV drug development. Its drawback, however, is that it cannot permeate cells directly. Here we report the transport of the pharmacologically active CAI-peptide into human lymphocytes and Human Embryonic Lung cells (HEL) using the BioShuttle platform. Generally, the transfer of pharmacologically active substances across membranes, demonstrated by confocal las...

  16. Hunting and use of terrestrial fauna used by Caiçaras from the Atlantic Forest coast (Brazil)

    OpenAIRE

    Alves Rômulo RN; Hanazaki Natalia; Begossi Alpina

    2009-01-01

    Abstract Background The Brazilian Atlantic Forest is considered one of the hotspots for conservation, comprising remnants of rain forest along the eastern Brazilian coast. Its native inhabitants on the Southeastern coast include the Caiçaras (descendants of Amerindians and European colonizers), who have deep knowledge of the natural resources used for their livelihood. Methods We studied the use of the terrestrial fauna in three Caiçara communities, through open-ended interviews with 116 nati...

  17. Adaptive computational methods for aerothermal heating analysis

    Science.gov (United States)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  18. Local spatial frequency analysis for computer vision

    Science.gov (United States)

    Krumm, John; Shafer, Steven A.

    1990-01-01

    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.
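
    As a concrete illustration of the combined space/frequency representation, the sketch below computes a windowed FFT centred on every pixel of a one-dimensional image slice, i.e. a spatial spectrogram. The window length, Hann taper and chirp-like test texture are assumptions for illustration, not the authors' algorithm.

    ```python
    # Hedged sketch: a space/frequency representation for one image row,
    # computed as a windowed FFT at every pixel.
    import numpy as np

    def space_frequency(row, win=32):
        """Return |FFT| of a Hann-windowed patch centred on each pixel."""
        half = win // 2
        padded = np.pad(row, half, mode="reflect")
        window = np.hanning(win)
        spectra = [np.abs(np.fft.rfft(padded[i:i + win] * window))
                   for i in range(len(row))]
        return np.array(spectra)        # shape: (pixels, win//2 + 1)

    # A texture whose spatial frequency increases to the right (a chirp),
    # as might arise from a receding textured surface.
    x = np.linspace(0, 1, 512)
    row = np.sin(2 * np.pi * (5 + 40 * x) * x)
    rep = space_frequency(row)
    print(rep.shape)   # (512, 17)
    ```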

  19. Interactive Computer Lessons for Introductory Economics: Guided Inquiry-From Supply and Demand to Women in the Economy.

    Science.gov (United States)

    Miller, John; Weil, Gordon

    1986-01-01

    The interactive feature of computers is used to incorporate a guided inquiry method of learning introductory economics, extending the Computer Assisted Instruction (CAI) method beyond drills. (Author/JDH)

  20. Computer-Assisted Education System for Psychopharmacology.

    Science.gov (United States)

    McDougall, William Donald

    An approach to the use of computer assisted instruction (CAI) for teaching psychopharmacology is presented. A project is described in which, using the TUTOR programing language on the PLATO IV computer system, several computer programs were developed to demonstrate the concepts of aminergic transmitters in the central nervous system. Response…

  1. Crystal structures of hydrates of simple inorganic salts. II. Water-rich calcium bromide and iodide hydrates: CaBr2 · 9H2O, CaI2 · 8H2O, CaI2 · 7H2O and CaI2 · 6.5H2O.

    Science.gov (United States)

    Hennings, Erik; Schmidt, Horst; Voigt, Wolfgang

    2014-09-01

    Single crystals of calcium bromide enneahydrate, CaBr2 · 9H2O, calcium iodide octahydrate, CaI2 · 8H2O, calcium iodide heptahydrate, CaI2 · 7H2O, and calcium iodide 6.5-hydrate, CaI2 · 6.5H2O, were grown from their aqueous solutions at and below room temperature according to the solid-liquid phase diagram. The crystal structure of CaI2 · 6.5H2O was redetermined. All four structures are built up from distorted Ca(H2O)8 antiprisms. The antiprisms of the iodide hydrate structures are connected either via trigonal-plane-sharing or edge-sharing, forming dimeric units. The antiprisms in calcium bromide enneahydrate are monomeric. PMID:25186361

  3. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data is a very high priority. Following a loss-of-coolant and system depressurization incident, air will enter the core of the High Temperature Gas-Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate, with additional heat generated by the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics models developed in this study will improve our understanding of these phenomena. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.

  4. Computer-aided safety analysis of computer-controlled systems : a case example

    OpenAIRE

    Biegert, Uwe

    2000-01-01

    Computer-controlled systems consist of a complex interaction between technical process, human task and software. For the development of safety-critical systems, new methods are required that do not consider just one of these parts of a computer-controlled system in isolation. In this paper a qualitative modeling method is presented. The method is called SQMA, Situation-based Qualitative Modeling and Analysis, and its origins go back to Qualitative Reasoning. First, all parts of a system are modeled separated a...

  5. Oxygen isotopes in the early protoplanetary disk inferred from pyroxene in a classical type B CAI

    Science.gov (United States)

    Aléon, Jérôme

    2016-04-01

    A major unanswered question in solar system formation is the origin of the oxygen isotopic dichotomy between the Sun and the planets. Individual Calcium-Aluminum-rich inclusions (CAIs) from CV chondrites exhibit almost the full isotopic range, but how their composition evolved is still unclear, which prevents robust astrochemical conclusions. A key issue is notably the still-unsolved origin of the 16O-rich isotopic composition of pyroxene in type B CAIs. Here, I report an in-situ oxygen isotope study of the archetypal type B CAI USNM-3529-Z from Allende, with emphasis on the isotopic composition of pyroxene and its isotopic and petrographic relationships with other major minerals. The O isotopic composition of pyroxene is correlated with indicators of magmatic growth, indicating that the pyroxene evolved from a 16O-poor composition and became progressively enriched in 16O during its crystallization, contrary to the long-held assumption that pyroxene was initially 16O-rich. This variation is well explained by isotopic exchange between a 16O-poor partial melt having the isotopic composition of melilite and a 16O-rich gas having the isotopic composition of spinel, during pyroxene crystallization. The isotopic evolution of 3529-Z is consistent with formation in an initially 16O-rich environment where spinel and gehlenitic melilite crystallized, followed by a 16O-depletion associated with melilite partial melting and recrystallization, and finally a return to the initial 16O-rich environment before pyroxene crystallization. This strongly suggests that the environment of CAI formation was globally 16O-rich, with local 16O-depletions systematically associated with high temperature events. The Al/Mg isotopic systematics of 3529-Z further indicates that this suite of isotopic changes occurred in the first 150 000 yr of the solar system, during the main CAI formation period. A new astrophysical setting is proposed, where the 16O-depletion occurs in an optically thin surface

  6. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  7. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    Science.gov (United States)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a "cluster" architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the SLURM software resource manager and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs, able to reach a computing power of 300 gigaflops (300×10⁹ floating-point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage memory in UFS configuration, plus 6 TB for the users' area. AVES was designed and built to solve the growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB), which increases every year. The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload on the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained by a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of on-line data storing. The AVES software package consists of about 50 specific programs. Thus the whole computing time, compared to that provided by a personal computer with a single processor, has been enhanced by up to a factor of 70.
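
    A minimal sketch of the job-splitting idea described above: divide one analysis into N independent jobs and run one per core. The per-observation function and input names are hypothetical stand-ins for the OSA-based programs the record mentions.

    ```python
    # Hedged sketch: divide an analysis over N independent jobs, one per
    # core. analyse() is a placeholder, not the AVES/OSA software itself.
    from concurrent.futures import ProcessPoolExecutor
    import os

    def analyse(observation_id: str) -> str:
        # Placeholder for launching one OSA-style analysis on one input.
        return f"{observation_id}: done (pid {os.getpid()})"

    # Hypothetical observation identifiers.
    observations = [f"scw_{i:04d}" for i in range(16)]

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=4) as pool:
            for result in pool.map(analyse, observations):
                print(result)
    ```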

  8. A Computational Discriminability Analysis on Twin Fingerprints

    Science.gov (United States)

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical twins and fraternal twins) recently collected from a twin population, using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of fingerprints of twins is significantly different from that between genuine prints of the same finger. Twins' fingerprints are discriminable, with a 1.5%~1.7% higher EER than non-twins'. And identical twins can be distinguished by examining fingerprints, with a slightly higher error rate than fraternal twins.

  9. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge (CSA06) last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and events that are harder to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference, where a large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  11. Computer programs for analysis of geophysical data

    International Nuclear Information System (INIS)

    This project is oriented toward the application of the mobile seismic array data analysis technique to seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the medium are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremor and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches to seismic investigation of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, continuous seismic oscillations, like endogenous microseisms, coda and scattered waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in the stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the medium or images of the noisy objects. Thus, in contrast to classic seismology, where narrow windows are used to get the best time resolution of seismic signals, our model requires long record lengths for the best spatial resolution.
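
    In its simplest form, the passive "antenna scanning" step amounts to delay-and-sum beamforming: align the array records for a trial source position, measure the coherent power of the stack, and repeat over a grid of trial positions. The sketch below illustrates this under an assumed geometry, medium velocity and synthetic noise traces; it is not the authors' implementation.

    ```python
    # Hedged sketch: delay-and-sum scan of trial source positions.
    # Geometry, velocity and the synthetic data are assumptions.
    import numpy as np

    fs = 100.0                         # sampling rate, Hz
    velocity = 3000.0                  # assumed medium velocity, m/s
    sensors = np.array([[0, 0], [500, 0], [0, 500], [500, 500]], float)

    rng = np.random.default_rng(1)
    traces = rng.normal(size=(len(sensors), 2048))   # noise-only placeholder

    def beam_power(source_xy):
        """Coherent power of the stack for one trial source position."""
        dists = np.linalg.norm(sensors - source_xy, axis=1)
        delays = np.round((dists - dists.min()) / velocity * fs).astype(int)
        n = traces.shape[1] - delays.max()
        stack = sum(traces[i, d:d + n] for i, d in enumerate(delays))
        return np.mean((stack / len(sensors)) ** 2)

    # Scan a coarse grid of trial positions; peaks mark coherent sources.
    grid = [(x, y) for x in range(0, 600, 100) for y in range(0, 600, 100)]
    powers = [beam_power(np.array(p, float)) for p in grid]
    print(max(powers))
    ```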

  12. Structural Analysis Using Computer Based Methods

    Science.gov (United States)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located and recommended design changes were made to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  13. Computer aided information system for a PWR

    International Nuclear Information System (INIS)

    The computer aided information system (CAIS) is designed with a view to improving the performance of the operator. CAIS assists the plant operator in an advisory and support role, thereby reducing the workload level and potential human errors. The CAIS as explained here has been designed for the PWR type KLT-40 used in Floating Nuclear Power Stations (FNPS). However, the underlying philosophy evolved in designing the CAIS can be suitably adapted for other types of nuclear power plants too (BWR, PHWR). Operator information is divided into three broad categories: a) continuously available information, b) automatically available information and c) on-demand information. Two touch screens are provided on the main control panel. One is earmarked for continuously available information and the other is dedicated to automatically available information. Both screens can be used at the operator's discretion for on-demand information. The automatically available information screen overrides the on-demand information screens. In addition to the above, CAIS has the features of event sequence recording, disturbance recording and information documentation. The CAIS design ensures that the operator is not overburdened with excess and unnecessary information, but at the same time adequate and well-formatted information is available. (author). 5 refs., 4 figs

  14. Sensitivity analysis for computational models of biochemical systems

    OpenAIRE

    Maj,

    2014-01-01

    Systems biology is an integrated area of science which aims at the analysis of biochemical systems from a holistic perspective. In this context, sensitivity analysis, a technique studying how the output variation of a computational model can be attributed to its inputs, plays a pivotal role. The thesis describes how to properly apply the different sensitivity analysis techniques according to the specific case study (i.e., continuous deterministic rather than discrete stochastic...

  15. Benefits of Computer Based Content Analysis to Foresight

    OpenAIRE

    Kováříková, Ludmila; Grosová, Stanislava

    2014-01-01

    Purpose of the article: The present manuscript summarizes the benefits of using computer-based content analysis in the generation phase of foresight initiatives. Possible advantages, disadvantages and limitations of content analysis for foresight projects are discussed as well. Methodology/methods: In order to specify the benefits and identify the limitations of content analysis within foresight, results of the generation phase of a particular foresight project perf...

  16. Analysis of the Naval Postgraduate School computer network architecture

    OpenAIRE

    Wiedenhoeft, Paul Eric

    1994-01-01

    The computer network on the Naval Postgraduate School campus has become an integral part of the operations of the Naval Postgraduate School organization. An analysis of the network architecture will help formulate strategic plans that will support the network and the Naval Postgraduate School to the end of the century. This study describes the Naval Postgraduate School computer network architecture, driving forces, limitations, and possible measures of network benefits. It considers network al...

  17. Computer-based modelling and analysis in engineering geology

    OpenAIRE

    Giles, David

    2014-01-01

    This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...

  18. Strategic Analysis of Autodesk and the Move to Cloud Computing

    OpenAIRE

    Kewley, Kathleen

    2012-01-01

    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  19. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  20. Parallel computation of seismic analysis of high arch dam

    Institute of Scientific and Technical Information of China (English)

    Chen Houqun; Ma Huaifa; Tu Jin; Cheng Guangqing; Tang Juzhen

    2008-01-01

    Parallel computation programs are developed for three-dimensional meso-mechanics analysis of fully-graded dam concrete and seismic response analysis of high arch dams (ADs), based on the Parallel Finite Element Program Generator (PFEPG). The computational algorithms of the numerical simulation of the meso-structure of concrete specimens were studied. Taking into account damage evolution, static preload, strain rate effect, and the heterogeneity of the meso-structure of dam concrete, the fracture processes of damage evolution and configuration of the cracks can be directly simulated. In the seismic response analysis of ADs, all the following factors are involved, such as the nonlinear contact due to the opening and slipping of the contraction joints, energy dispersion of the far-field foundation, dynamic interactions of the dam-foundation-reservoir system, and the combining effects of seismic action with all static loads. The correctness, reliability and efficiency of the two parallel computational programs are verified with practical illustrations.

  1. OXYGEN ISOTOPIC COMPOSITIONS OF THE ALLENDE TYPE C CAIs: EVIDENCE FOR ISOTOPIC EXCHANGE DURING NEBULAR MELTING AND ASTEROIDAL THERMAL METAMORPHISM

    Energy Technology Data Exchange (ETDEWEB)

    Krot, A N; Chaussidon, M; Yurimoto, H; Sakamoto, N; Nagashima, K; Hutcheon, I D; MacPherson, G J

    2008-02-21

    Based on the mineralogy and petrography, coarse-grained, igneous, anorthite-rich (Type C) calcium-aluminum-rich inclusions (CAIs) in the CV3 carbonaceous chondrite Allende have been recently divided into three groups: (i) CAIs with melilite and Al,Ti-diopside of massive and lacy textures (coarse grains with numerous rounded inclusions of anorthite) in a fine-grained anorthite groundmass (6-1-72, 100, 160), (ii) CAI CG5 with massive melilite, Al,Ti-diopside and anorthite, and (iii) CAIs associated with chondrule material: either containing chondrule fragments in their peripheries (ABC, TS26) or surrounded by chondrule-like, igneous rims (93) (Krot et al., 2007a,b). Here, we report in situ oxygen isotopic measurements of primary (melilite, spinel, Al,Ti-diopside, anorthite) and secondary (grossular, monticellite, forsterite) minerals in these CAIs. Spinel (Δ17O = −25‰ to −20‰), massive and lacy Al,Ti-diopside (Δ17O = −20‰ to −5‰) and fine-grained anorthite (Δ17O = −15‰ to −2‰) in 100, 160 and 6-1-72 are 16O-enriched relative to spinel and coarse-grained Al,Ti-diopside and anorthite in ABC, 93 and TS26 (Δ17O ranges from −20‰ to −15‰, from −15‰ to −5‰, and from −5‰ to 0‰, respectively). In 6-1-72, massive and lacy Al,Ti-diopside grains are 16O-depleted (Δ17O ≈ −13‰) relative to spinel (Δ17O = −23‰). Melilite is the most 16O-depleted mineral in all Allende Type C CAIs. In CAI 100, melilite and secondary grossular, monticellite and forsterite (minerals replacing melilite) are similarly 16O-depleted, whereas grossular in CAI 160 is 16O-enriched (Δ17O = −10‰ to −6‰) relative to melilite (Δ17O = −5‰ to −3‰). We infer

  2. Computer aided plant engineering: An analysis and suggestions for computer use

    International Nuclear Information System (INIS)

    To derive guidelines and boundary conditions for computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for integration of CAD subsystems in plant engineering should be a central database, which is described by characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer-aided plant engineering are briefly illustrated by two examples. The analysis leads to the conclusion that an interactive graphic system for manipulation of net-like structured data, usable for various subtasks, should be the basis for computer-aided plant engineering. (orig.)

  3. Computer-Assisted Learning Design for Reflective Practice Supporting Multiple Learning Styles for Education and Training in Pre-Hospital Emergency Care.

    Science.gov (United States)

    Jones, Indra; Cookson, John

    2001-01-01

    Students in paramedic education used a model combining computer-assisted instruction (CAI), reflective practice, and learning styles. Although reflective practice normally requires teacher-student interaction, CAI with reflective practice embedded enabled students to develop learning style competencies and achieve curricular outcomes. (SK)

  4. Computer vision approaches to medical image analysis. Revised papers

    International Nuclear Information System (INIS)

    This book constitutes the thoroughly refereed post-proceedings of the international workshop Computer Vision Approaches to Medical Image Analysis, CVAMIA 2006, held in Graz, Austria in May 2006 as a satellite event of the 9th European Conference on Computer Vision, ECCV 2006. The 10 revised full papers and 11 revised poster papers presented together with 1 invited talk were carefully reviewed and selected from 38 submissions. The papers are organized in topical sections on clinical applications, image registration, image segmentation and analysis, and the poster session. (orig.)

  5. Computational Fluid Dynamics Analysis of Thoracic Aortic Dissection

    Science.gov (United States)

    Tang, Yik; Fan, Yi; Cheng, Stephen; Chow, Kwok

    2011-11-01

    Thoracic Aortic Dissection (TAD) is a cardiovascular disease with high mortality. An aortic dissection is formed when blood infiltrates the layers of the vascular wall, creating a new artificial channel, the false lumen. The expansion of the blood vessel due to the weakened wall increases the risk of rupture. Computational fluid dynamics analysis is performed to study the hemodynamics of this pathological condition. Both idealized geometry and realistic patient configurations from computed tomography (CT) images are investigated. Physiological boundary conditions from in vivo measurements are employed. Flow configuration and biomechanical forces are studied. Quantitative analysis allows clinicians to assess the risk of rupture when making decisions regarding surgical intervention.

  6. Computational Analysis of the SRS Phase III Salt Disposition Alternatives

    International Nuclear Information System (INIS)

    Completion of the Phase III evaluation and comparison of salt disposition alternatives was supported with enhanced computer models and analysis for each case on the "short list" of four options. SPEEDUP(TM) models and special-purpose models describing mass and energy balances and flow rates were developed and used to predict performance and production characteristics for each of the options. Results from the computational analysis were a key part of the input used to select a primary and an alternate salt disposition alternative

  7. Computer analysis of failures in nuclear power station

    International Nuclear Information System (INIS)

    Computer analysis of minor failures at nuclear power plants has been carried out at the Institute for Electrical Power Research (VEIKI) since 1976. The research work was mainly directed at the computer-based application of methods to be used at the Paks Nuclear Power Plant; the proposed procedures, however, can also be used at traditional power plants. The paper describes the general aims and main steps of failure analysis and summarizes the state of the art and perspectives of R and D in Hungary. (N.I.)

  8. Advances in computational design and analysis of airbreathing propulsion systems

    Science.gov (United States)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  9. Superfast robust digital image correlation analysis with parallel computing

    Science.gov (United States)

    Pan, Bing; Tian, Long

    2015-03-01

    Existing digital image correlation (DIC) using the robust reliability-guided displacement tracking (RGDT) strategy for full-field displacement measurement is a path-dependent process that can only be executed sequentially. This path-dependent tracking strategy not only limits the potential of DIC for further improvement of its computational efficiency but also wastes the parallel computing power of modern computers with multicore processors. To maintain the robustness of the existing RGDT strategy and to overcome its deficiency, an improved RGDT strategy using a two-section tracking scheme is proposed. In the improved RGDT strategy, the calculated points with correlation coefficients higher than a preset threshold are all taken as reliably computed points and given the same priority to extend the correlation analysis to their neighbors. Thus, DIC calculation is first executed in parallel at multiple points by separate independent threads. Then for the few calculated points with correlation coefficients smaller than the threshold, DIC analysis using existing RGDT strategy is adopted. Benefiting from the improved RGDT strategy and the multithread computing, superfast DIC analysis can be accomplished without sacrificing its robustness and accuracy. Experimental results show that the presented parallel DIC method performed on a common eight-core laptop can achieve about a 7 times speedup.
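
    To make the two-section scheme concrete, here is a minimal sketch: points whose correlation coefficient exceeds a preset threshold are matched independently in parallel, and the remainder are deferred to the sequential reliability-guided pass. The matching function, threshold and point grid are placeholders, not the authors' DIC engine.

    ```python
    # Hedged sketch of the two-section tracking scheme: high-correlation
    # points are matched independently in parallel; low-correlation points
    # are deferred to the sequential reliability-guided (RGDT) pass.
    # match_point is a placeholder for a real subset-matching routine.
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def match_point(point):
        """Placeholder subset match: (point, displacement, correlation)."""
        rng = np.random.default_rng(abs(hash(point)))
        return point, rng.normal(size=2), rng.uniform(0.6, 1.0)

    points = [(i, j) for i in range(0, 100, 10) for j in range(0, 100, 10)]
    THRESHOLD = 0.8   # preset correlation-coefficient threshold

    with ThreadPoolExecutor() as pool:
        results = list(pool.map(match_point, points))

    reliable = [r for r in results if r[2] >= THRESHOLD]
    deferred = [r[0] for r in results if r[2] < THRESHOLD]
    print(f"{len(reliable)} points matched in parallel, "
          f"{len(deferred)} left for the sequential RGDT pass")
    ```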

  10. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  11. Computer analysis of thermal hydraulics for nuclear reactor safety

    International Nuclear Information System (INIS)

    This paper gives an overview of ANSTO's capability and recent research and development activities in thermal-hydraulic modelling for nuclear reactor safety analysis, particularly for our research reactor, HIFAR (High Flux Australian Reactor), and its intended replacement, the Replacement Research Reactor (RRR). Several tools contribute to ANSTO's capability in thermal-hydraulic modelling, including RELAP (developed in the US), a code for reactor system thermal-hydraulic analysis; CFS (developed in the UK), a general computational fluid dynamics code, which was used for thermal-hydraulic analysis in reactor fuel elements; and HIZAPP (developed at ANSTO), for coupling neutronics with thermal-hydraulics for reactor transient analysis

  12. Analysis of the computed tomography in the acute abdomen

    International Nuclear Information System (INIS)

    Introduction: This study aims to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal and prospective study, in which patients with a diagnosis of acute abdomen were analyzed. There were 105 cases of acute abdomen, and after the application of the exclusion criteria, 28 patients were included in the study. Results: Computed tomography changed the diagnostic hypothesis of the physicians in 50% of the cases (p 0.05), where 78.57% of the patients had surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). An accurate diagnosis by computed tomography, when compared to the anatomopathologic examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was done dividing the patients into surgical and nonsurgical groups, an accuracy of 89.28% was obtained (p 0.0001). A difference of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen managed without computed tomography. Conclusion: Computed tomography correlates with the anatomopathology and has great accuracy in the surgical indication, together with the capacity to increase physicians' confidence; it reduces hospitalization time, reduces the number of surgeries and is cost-effective. (author)

  13. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution-set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  14. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Science.gov (United States)

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with an amazing reduction in the size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, the basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen-processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes the capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and in-process data apparently are available. PMID:24274405
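
    As a concrete illustration of what such systems compute, the sketch below links sperm-head centroids across frames by greedy nearest neighbour and reports curvilinear velocity (VCL), a standard CASA motility parameter. The detections, frame rate and microscope calibration are illustrative assumptions, not any vendor's algorithm.

    ```python
    # Hedged sketch of CASA-style tracking: link detections across frames,
    # then compute VCL. Calibration values below are assumptions.
    import numpy as np

    FPS = 60.0          # frames per second, as cited in the record
    UM_PER_PX = 0.5     # assumed microscope calibration (um per pixel)

    def link(prev_pts, next_pts, max_jump=10.0):
        """Greedy nearest-neighbour assignment between consecutive frames."""
        links = {}
        for i, p in enumerate(prev_pts):
            d = np.linalg.norm(next_pts - p, axis=1)
            j = int(np.argmin(d))
            if d[j] <= max_jump:
                links[i] = j
        return links

    def vcl(track_px):
        """Curvilinear velocity: total path length / elapsed time (um/s)."""
        steps = np.diff(track_px, axis=0)
        path_um = np.sum(np.linalg.norm(steps, axis=1)) * UM_PER_PX
        return path_um * FPS / (len(track_px) - 1)

    frame1 = np.array([[10.0, 10.0], [50.0, 50.0]])
    frame2 = np.array([[49.0, 52.0], [13.0, 12.0]])
    print(link(frame1, frame2))                    # {0: 1, 1: 0}

    track = np.array([[10, 10], [13, 12], [17, 13], [20, 16]], float)
    print(f"VCL = {vcl(track):.1f} um/s")
    ```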

  15. Brain Computer Interface Enhancement by Independent Component Analysis

    Czech Academy of Sciences Publication Activity Database

    Bobrov, P.; Frolov, A. A.; Húsek, Dušan

    Heidelberg: Springer, 2013 - (Kudělka, M.; Pokorný, J.; Snášel, V.; Abraham, A.), s. 51-60. (Advances in Intelligent Systems and Computing. 179). ISBN 978-3-642-31602-9. ISSN 2194-5357. [IHCI 2011. International Conference on Intelligent Human Computer Interaction /3./. Prague (CZ), 29.08.2011-31.08.2011] R&D Projects: GA ČR GAP202/10/0262; GA ČR GA205/09/1079 Grant ostatní: GA MŠk(CZ) ED1.1.00/02.0070 Institutional research plan: CEZ:AV0Z10300504 Keywords: brain computer interface * EEG patterns classification * independent component analysis * classification accuracy * mu-rhythm identification Subject RIV: IN - Informatics, Computer Science
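
    The record's keywords point at ICA-based separation of EEG rhythms for BCI classification. A minimal sketch of that step follows, using FastICA on simulated two-channel EEG; the synthetic mu-band source, mixing matrix and sampling rate are assumptions for illustration, not the authors' data or pipeline.

    ```python
    # Hedged sketch: unmix simulated EEG channels with ICA before BCI
    # classification. Sources, mixing and sampling rate are assumptions.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(3)
    t = np.arange(0, 4, 1 / 250.0)                 # 4 s at 250 Hz

    sources = np.c_[np.sin(2 * np.pi * 10 * t),    # 10 Hz mu-rhythm-like source
                    rng.normal(size=t.size)]       # broadband artefact
    mixing = np.array([[1.0, 0.5],
                       [0.4, 1.0]])
    eeg = sources @ mixing.T                       # simulated 2-channel EEG

    ica = FastICA(n_components=2, random_state=0)
    unmixed = ica.fit_transform(eeg)               # estimated components

    # The component with the strongest ~10 Hz line is the mu-rhythm candidate.
    spectra = np.abs(np.fft.rfft(unmixed, axis=0))
    freqs = np.fft.rfftfreq(unmixed.shape[0], d=1 / 250.0)
    print("peak frequency of each component:", freqs[np.argmax(spectra, axis=0)])
    ```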

  16. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white. This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  17. Problem Solving Process Research of Everyone Involved in Innovation Based on CAI Technology

    Science.gov (United States)

    Chen, Tao; Shao, Yunfei; Tang, Xiaowo

    It is very important that non-technical department personnel, especially bottom-line employees, serve as innovators under the requirement of everyone involved in innovation. In the view of this paper, it is feasible and necessary to build an everyone-involved-in-innovation problem-solving process under Total Innovation Management (TIM), based on the Theory of Inventive Problem Solving (TRIZ). The tools under CAI technology, namely the How-To mode and the science-effects database, could be very useful for all employees, especially non-technical departments and bottom-line staff, in innovation. The problem-solving process put forward in this paper focuses on non-technical department personnel, especially bottom-line employees, for innovation.

  18. The Clinical Experiences of Dr.CAI Gan in Treating Chronic Constipation

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zheng-li; ZHU Mei-ping; LIU Qun; LEI Yun-xia

    2009-01-01

    Prof. CAI Gan (蔡淦) is an academic leader in TCM treatment of spleen and stomach diseases. He holds that liver depression, spleen deficiency and poor nourishment of the intestines are the core of the pathogenesis of chronic constipation. Therefore he often treats the disease by strengthening the spleen, relieving the depressed liver, nourishing yin and moistening the intestines. Meanwhile he attaches great importance to syndrome differentiation and comprehensive regulation and treatment. As a result, good therapeutic effects are often achieved. The authors summarized his ways of treating chronic constipation in the following 10 methods, which are introduced below.

  19. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    A third-generation system for integrating computer programs for engineering analysis and design has been developed: the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used as a repository for the design data communicated between analysis programs, as a dictionary that describes these design data, as a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  20. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  1. Practical computer analysis of switch mode power supplies

    CERN Document Server

    Bennett, Johnny C

    2006-01-01

    When designing switch-mode power supplies (SMPSs), engineers need much more than simple "recipes" for analysis. Such plug-and-go instructions are not at all helpful for simulating larger and more complex circuits and systems. Offering more than merely a "cookbook," Practical Computer Analysis of Switch Mode Power Supplies provides a thorough understanding of the essential requirements for analyzing SMPS performance characteristics. It demonstrates the power of the circuit averaging technique when used with powerful computer circuit simulation programs. The book begins with SMPS fundamentals and the basics of circuit averaging models, reviewing most basic topologies and explaining all of their various modes of operation and control. The author then discusses the general analysis requirements of power supplies and how to develop the general types of SMPS models, demonstrating the use of SPICE for analysis. He examines the basic first-order analyses generally associated with SMPS performance along with more pra...
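
A minimal sketch of the circuit-averaging idea on the simplest possible case, an ideal buck converter in continuous conduction: the switching network is replaced by its duty-cycle average, so the averaged inductor voltage becomes d*Vin - vC. Component values, time step, and the absence of parasitics are illustrative assumptions; the book goes far beyond this.

```python
# Averaged state equations of an ideal buck converter, integrated by Euler:
#   L diL/dt = d*Vin - vC        C dvC/dt = iL - vC/R
Vin, d = 12.0, 0.5               # input voltage (V) and duty cycle
L, C, R = 100e-6, 100e-6, 5.0    # inductance (H), capacitance (F), load (ohm)
dt, t_end = 1e-6, 20e-3          # time step (s) and end time (s)

iL, vC = 0.0, 0.0
for _ in range(int(t_end / dt)):
    iL += (d * Vin - vC) / L * dt
    vC += (iL - vC / R) / C * dt

print(f"averaged output ~= {vC:.2f} V (ideal steady state d*Vin = {d * Vin:.2f} V)")
```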

  2. Computer codes for safety analysis of Indian PHWRs

    International Nuclear Information System (INIS)

    Computer codes for safety analysis of PHWRs have been developed in India over the years. Some of the codes developed at NPC are discussed in this paper. The computer codes THYNAC and ATMIKA have been developed at NPC for the analysis of LOCA scenarios. Both codes are based on the UVET model, using three equations and slip correlations. The computer code ATMIKA is an improved version of THYNAC with regard to numerics and flexibility in modelling. Apart from the thermal-hydraulic model, these codes also include a point neutron kinetics model. The codes COOLTMP and RCOMP are used to estimate heat-up of the primary coolant and core components, respectively, under off-normal shutdown conditions such as may exist during special maintenance jobs or postulated failures. Code validations have been performed either against experiments, against the published results of experiments performed elsewhere, or through international benchmark exercises sponsored by the IAEA. The paper discusses these codes, their validations and salient applications
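
For readers unfamiliar with the point neutron kinetics model mentioned above, the following is a minimal sketch of one-delayed-group point kinetics under a small reactivity step. The parameter values are illustrative assumptions, not values from THYNAC or ATMIKA.

```python
# One-delayed-group point reactor kinetics, integrated by forward Euler:
#   dn/dt = ((rho - beta)/Lambda) * n + lam * C
#   dC/dt = (beta/Lambda) * n - lam * C
beta, lam, Lambda = 0.0065, 0.08, 1e-3  # delayed fraction, precursor decay (1/s), generation time (s)
rho = 0.001                             # step reactivity (dk/k), well below prompt critical

n, C = 1.0, beta / (lam * Lambda)       # equilibrium precursor level for n = 1
dt, t_end = 1e-4, 5.0
for _ in range(int(t_end / dt)):
    dn = ((rho - beta) / Lambda) * n + lam * C
    dC = (beta / Lambda) * n - lam * C
    n, C = n + dn * dt, C + dC * dt

print(f"relative power after {t_end:.0f} s: {n:.2f}")  # prompt jump, then slow rise
```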

  3. Finite element dynamic analysis on CDC STAR-100 computer

    Science.gov (United States)

    Noor, A. K.; Lambiotte, J. J., Jr.

    1978-01-01

    Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.
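
The central difference explicit scheme named above is compact enough to sketch. The following toy example advances a 2-DOF undamped system M u'' + K u = f; the matrices, load, and step size are illustrative assumptions, far smaller than the 80-degree-of-freedom shell elements of the paper.

```python
# Central difference stepping: u_{t+dt} = 2 u_t - u_{t-dt} + dt^2 M^-1 (f - K u_t).
# Stability requires dt < 2/omega_max (here omega_max = sqrt(3), so dt = 0.01 is safe).
import numpy as np

M = np.diag([1.0, 1.0])                 # lumped mass matrix
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])            # stiffness matrix
f = np.array([0.0, 1.0])                # constant load
Minv = np.linalg.inv(M)

dt = 0.01
u_prev = np.zeros(2)                    # u at t - dt (system starts at rest)
u = np.zeros(2)                         # u at t
for _ in range(10_000):
    u_next = 2 * u - u_prev + dt**2 * (Minv @ (f - K @ u))
    u_prev, u = u, u_next

print("displacements after 10,000 steps:", u)  # oscillates about K^-1 f = [1/3, 2/3]
```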

  4. Interactive computer code for dynamic and soil structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mulliken, J.S.

    1995-12-01

    A new interactive computer code is presented in this paper for dynamic and soil-structure interaction (SSI) analyses. The computer program FETA (Finite Element Transient Analysis) is a self-contained interactive graphics environment for IBM PCs that is used for the development of structural and soil models as well as post-processing dynamic analysis output. Full 3-D isometric views of the soil-structure system, animation of displacements, frequency and time domain responses at nodes, and response spectra are all graphically available simply by pointing and clicking with a mouse. FETA's finite element solver performs 2-D and 3-D frequency and time domain soil-structure interaction analyses. The solver can be directly accessed from the graphical interface on a PC, or run on a number of other computer platforms.

  5. Qualitative Research and Computer Analysis: New Challenges and Opportunities

    OpenAIRE

    Yuen, AHK

    2000-01-01

    The use of computers for Qualitative Data Analysis (QDA) in qualitative research has been growing rapidly in the last decade. QDA programs are software packages developed explicitly for the purpose of analyzing qualitative data. A range of different kinds of program is available for the handling and analysis of qualitative data, such as Atlas/ti, HyperRESEARCH, and NUD*IST. With the development of new technologies, the QDA software has advanced from the efficient code-and-retrieve ability to ...

  6. Computer automated movement detection for the analysis of behavior

    OpenAIRE

    Ramazani, Roseanna B.; Harish R Krishnan; BERGESON, SUSAN E.; Atkinson, Nigel S.

    2007-01-01

    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtractio...
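
The digital-subtraction step the abstract refers to reduces to differencing frames and counting changed pixels. A minimal sketch on synthetic frames follows; it stands in for, and is not taken from, the authors' dvgrab/ImageMagick/Perl pipeline.

```python
# Movement detection by frame differencing: score = fraction of pixels whose
# grayscale intensity changed by more than a threshold between two frames.
import numpy as np

def movement_score(frame_a, frame_b, threshold=25):
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return float(np.mean(diff > threshold))

# Toy 8-bit frames: dark background, one bright "fly" that moves between frames.
frame1 = np.zeros((100, 100), dtype=np.uint8)
frame2 = np.zeros((100, 100), dtype=np.uint8)
frame1[10:14, 10:14] = 200
frame2[40:44, 40:44] = 200

print(f"movement score: {movement_score(frame1, frame2):.4f}")  # > 0 means motion
```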

  7. Computational Methods for the Analysis of Array Comparative Genomic Hybridization

    Directory of Open Access Journals (Sweden)

    Raj Chari

    2006-01-01

    Full Text Available Array comparative genomic hybridization (array CGH) is a technique for assaying the copy number status of cancer genomes. The widespread use of this technology has led to a rapid accumulation of high throughput data, which in turn has prompted the development of computational strategies for the analysis of array CGH data. Here we explain the principles behind array image processing, data visualization and genomic profile analysis, review currently available software packages, and raise considerations for future software development.
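
As a concrete illustration of the profile-analysis step, the sketch below smooths simulated log2(test/reference) ratios along a chromosome and flags gains and losses. The window size and cutoffs are assumptions for illustration, not values recommended by the review.

```python
# Moving-average smoothing of array CGH log2 ratios, then simple thresholding.
import numpy as np

rng = np.random.default_rng(0)
log2_ratio = rng.normal(0.0, 0.15, 300)  # simulated probes along a chromosome
log2_ratio[120:180] += 0.58              # simulated single-copy gain

window = 15
smoothed = np.convolve(log2_ratio, np.ones(window) / window, mode="same")

gain = smoothed > 0.3                    # illustrative copy-number cutoffs
loss = smoothed < -0.3
print(f"probes flagged as gained: {gain.sum()}, lost: {loss.sum()}")
```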

  8. Computational models for the nonlinear analysis of reinforced concrete plates

    Science.gov (United States)

    Hinton, E.; Rahman, H. H. A.; Huq, M. M.

    1980-01-01

    A finite element computational model for the nonlinear analysis of reinforced concrete solid, stiffened and cellular plates is briefly outlined. Typically, Mindlin elements are used to model the plates whereas eccentric Timoshenko elements are adopted to represent the beams. The layering technique, common in the analysis of reinforced concrete flexural systems, is incorporated in the model. The proposed model provides an inexpensive and reasonably accurate approach which can be extended for use with voided plates.

  9. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or c
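
In the spirit of the book's Monte Carlo coverage (its own exercises use MATLAB), a minimal hit-or-miss sketch estimating pi:

```python
# Estimate pi by sampling points uniformly in the unit square and counting
# how many land inside the quarter circle of radius 1.
import random

random.seed(42)
n = 1_000_000
inside = sum(random.random() ** 2 + random.random() ** 2 <= 1.0 for _ in range(n))
print(f"pi ~= {4 * inside / n:.4f}")
```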

  10. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    A mini-computer based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system

  11. Analysis of Computer-Mediated Communication: Using Formal Concept Analysis as a Visualizing Methodology.

    Science.gov (United States)

    Hara, Noriko

    2002-01-01

    Introduces the use of Formal Concept Analysis (FCA) as a methodology to visualize the data in computer-mediated communication. Bases FCA on a mathematical lattice theory and offers visual maps (graphs) with conceptual hierarchies, and proposes use of FCA combined with content analysis to analyze computer-mediated communication. (Author/LRW)
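
To make the FCA machinery concrete, here is a minimal sketch that enumerates the formal concepts of a small object-attribute context. The messages and attributes are invented stand-ins for coded communication data, not examples from the article.

```python
# Enumerate all formal concepts (extent, intent) of a binary context by
# closing every subset of attributes; suitable for small contexts only.
from itertools import combinations

objects = ["msg1", "msg2", "msg3", "msg4"]
attributes = ["question", "reply", "social", "technical"]
incidence = {
    "msg1": {"question", "technical"},
    "msg2": {"reply", "technical"},
    "msg3": {"reply", "social"},
    "msg4": {"question", "social"},
}

def extent(attrs):  # objects possessing all given attributes
    return {o for o in objects if attrs <= incidence[o]}

def intent(objs):   # attributes shared by all given objects
    return set.intersection(*(incidence[o] for o in objs)) if objs else set(attributes)

concepts = set()
for r in range(len(attributes) + 1):
    for combo in combinations(attributes, r):
        e = extent(set(combo))
        concepts.add((frozenset(e), frozenset(intent(e))))

for e, i in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(e), "<->", sorted(i))  # nodes of the concept lattice
```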

  12. Computer-aided design and analysis of mechanisms

    Science.gov (United States)

    Knight, F. L.

    1982-01-01

    An introduction to the computer programs developed to assist in the design and analysis of mechanisms is presented. A survey of the various types of programs which are available is given, and the most widely used programs are compared. The way in which the programs are used is discussed, and demonstrated with an example.

  13. Computing support for advanced medical data analysis and imaging

    CERN Document Server

    Wiślicki, W; Białas, P; Czerwiński, E; Kapłon, Ł; Kochanowski, A; Korcyl, G; Kowal, J; Kowalski, P; Kozik, T; Krzemień, W; Molenda, M; Moskal, P; Niedźwiecki, S; Pałka, M; Pawlik, M; Raczyński, L; Rudy, Z; Salabura, P; Sharma, N G; Silarski, M; Słomski, A; Smyrski, J; Strzelecki, A; Wieczorek, A; Zieliński, M; Zoń, N

    2014-01-01

    We discuss computing issues for data analysis and image reconstruction of PET-TOF medical scanner or other medical scanning devices producing large volumes of data. Service architecture based on the grid and cloud concepts for distributed processing is proposed and critically discussed.

  14. Componential analysis of kinship terminology a computational perspective

    CERN Document Server

    Pericliev, Vladimir

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  15. The NASA NASTRAN structural analysis computer program - New content

    Science.gov (United States)

    Weidman, D. J.

    1978-01-01

    Capabilities of a NASA-developed structural analysis computer program, NASTRAN, are evaluated with reference to finite-element modelling. Applications include the automotive industry as well as aerospace. It is noted that the range of sub-programs within NASTRAN has expanded, while keeping user cost low.

  16. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  17. Interactive computer system for analysis of dynamic renal studies

    International Nuclear Information System (INIS)

    An interactive computer system is described for a small minicomputer to be used in the evaluation of radionuclide scintiscanning studies of renal transplants and other dynamic kidney function studies. The package consists of programs for data acquisition, analysis, and report generation. As an added feature, the program dissociates the kidney view into total kidney, cortical, and medullar components

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. HIV-1 Capsid Assembly Inhibitor (CAI) Peptide: Structural Preferences and Delivery into Human Embryonic Lung Cells and Lymphocytes

    Directory of Open Access Journals (Sweden)

    Klaus Braun, Martin Frank, Rüdiger Pipkorn, Jennifer Reed, Herbert Spring, Jürgen Debus, Bernd Didinger, Claus-Wilhelm von der Lieth, Manfred Wiessler, Waldemar Waldeck

    2008-01-01

    Full Text Available The Human immunodeficiency virus 1 derived capsid assembly inhibitor peptide (HIV-1 CAI-peptide) is a promising lead candidate for anti-HIV drug development. Its drawback, however, is that it cannot permeate cells directly. Here we report the transport of the pharmacologically active CAI-peptide into human lymphocytes and Human Embryonic Lung cells (HEL) using the BioShuttle platform. Generally, the transfer of pharmacologically active substances across membranes, demonstrated by confocal laser scanning microscopy (CLSM), could lead to a loss of function by changing the molecule's structure. Molecular dynamics (MD) simulations and circular dichroism (CD) studies suggest that the CAI-peptide has an intrinsic capacity to form a helical structure, which seems to be critical for the pharmacological effect as revealed by intensive docking calculations and comparison with control peptides. This coupling of the CAI-peptide to a BioShuttle-molecule additionally improved its solubility. Under the conditions described, the HIV-1 CAI peptide was transported into living cells and could be localized in the vicinity of the mitochondria.

  20. HIV-1 Capsid Assembly Inhibitor (CAI) Peptide: Structural Preferences and Delivery into Human Embryonic Lung Cells and Lymphocytes

    Science.gov (United States)

    Braun, Klaus; Frank, Martin; Pipkorn, Rüdiger; Reed, Jennifer; Spring, Herbert; Debus, Jürgen; Didinger, Bernd; von der Lieth, Claus-Wilhelm; Wiessler, Manfred; Waldeck, Waldemar

    2008-01-01

    The Human immunodeficiency virus 1 derived capsid assembly inhibitor peptide (HIV-1 CAI-peptide) is a promising lead candidate for anti-HIV drug development. Its drawback, however, is that it cannot permeate cells directly. Here we report the transport of the pharmacologically active CAI-peptide into human lymphocytes and Human Embryonic Lung cells (HEL) using the BioShuttle platform. Generally, the transfer of pharmacologically active substances across membranes, demonstrated by confocal laser scanning microscopy (CLSM), could lead to a loss of function by changing the molecule's structure. Molecular dynamics (MD) simulations and circular dichroism (CD) studies suggest that the CAI-peptide has an intrinsic capacity to form a helical structure, which seems to be critical for the pharmacological effect as revealed by intensive docking calculations and comparison with control peptides. This coupling of the CAI-peptide to a BioShuttle-molecule additionally improved its solubility. Under the conditions described, the HIV-1 CAI peptide was transported into living cells and could be localized in the vicinity of the mitochondria. PMID:18695744

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  2. Building a Prototype of LHC Analysis Oriented Computing Centers

    Science.gov (United States)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype Analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well established concept with years of running experience, sites specialized for end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for exhaustive monitoring of user processes at the site, and for an efficient support system in case of problems. We report the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  4. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  5. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Science.gov (United States)

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298

  6. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Directory of Open Access Journals (Sweden)

    Seyhan Yazar

    Full Text Available A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  7. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    Science.gov (United States)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.
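
One of the neighborhood-characterizing primitives described above can be sketched directly: a focal (moving-window) mean over a gridded map. The grid values below are illustrative, not data from the roundwood-supply model.

```python
# Focal mean: each cell is replaced by the mean of its (2r+1) x (2r+1)
# neighborhood; edges are handled by replicate padding.
import numpy as np

def focal_mean(grid, radius=1):
    padded = np.pad(grid, radius, mode="edge")
    out = np.zeros_like(grid, dtype=float)
    n = 2 * radius + 1
    for dr in range(n):            # sum the n*n shifted windows
        for dc in range(n):
            out += padded[dr:dr + grid.shape[0], dc:dc + grid.shape[1]]
    return out / (n * n)

elevation = np.array([[10, 12, 13],
                      [11, 15, 14],
                      [10, 11, 12]], dtype=float)
print(focal_mean(elevation))       # a smoothed, neighborhood-characterized map
```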

  8. CAR: A MATLAB Package to Compute Correspondence Analysis with Rotations

    Directory of Open Access Journals (Sweden)

    Urbano Lorenzo-Seva Rovira

    2009-09-01

    Full Text Available Correspondence analysis (CA is a popular method that can be used to analyse relationships between categorical variables. Like principal component analysis, CA solutions can be rotated both orthogonally and obliquely to simple structure without affecting the total amount of explained inertia. We describe a MATLAB package for computing CA. The package includes orthogonal and oblique rotation of axes. It is designed not only for advanced users of MATLAB but also for beginners. Analysis can be done using a user-friendly interface, or by using command lines. We illustrate the use of CAR with one example.
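
The core CA computation, before any rotation is applied, is a singular value decomposition of the standardized residuals of a contingency table. A minimal sketch in Python follows (CAR itself is a MATLAB package); the table is an invented example.

```python
# Correspondence analysis: SVD of standardized residuals
#   S = D_r^-1/2 (P - r c^T) D_c^-1/2
import numpy as np

N = np.array([[20.0, 10.0,  5.0],
              [ 5.0, 15.0, 10.0],
              [10.0,  5.0, 20.0]])       # two-way contingency table

P = N / N.sum()                          # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)      # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))

U, sv, Vt = np.linalg.svd(S, full_matrices=False)
row_coords = (U * sv) / np.sqrt(r)[:, None]   # principal row coordinates
inertia = sv ** 2

print("share of inertia per axis:", inertia / inertia.sum())
print("row coordinates (first two axes):\n", row_coords[:, :2])
```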

  9. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    A method of accounting for fluid-to-fluid shear between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
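
For orientation, the sketch below shows the standard wall-drag ingredients that an equivalent hydraulic diameter would be substituted into: hydraulic diameter, Reynolds number, a friction-factor correlation, and drag per unit volume. The profile-to-equivalent-diameter mapping itself is the paper's contribution and is not reproduced here; all numbers are illustrative assumptions.

```python
# Conventional per-cell wall drag from the Darcy-Weisbach form with the
# Blasius friction factor; the paper's method would replace D_h with a
# profile-derived equivalent value for each cell.
rho, mu = 750.0, 9e-5                    # coolant density (kg/m3), viscosity (Pa*s)
u = 4.0                                  # cell mean velocity (m/s)
area, wetted_perimeter = 1.0e-4, 4.0e-2  # cell flow area (m2) and wetted perimeter (m)

D_h = 4.0 * area / wetted_perimeter      # hydraulic diameter: 4A/P
Re = rho * u * D_h / mu
f = 0.316 * Re ** -0.25                  # Blasius correlation (turbulent flow)
drag_per_volume = f * rho * u ** 2 / (2.0 * D_h)

print(f"D_h = {D_h * 1e3:.1f} mm, Re = {Re:.3e}, wall drag = {drag_per_volume:.0f} N/m3")
```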

  10. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  11. On the dependence of HCDA- analysis results on computing systems

    International Nuclear Information System (INIS)

    Hypothetical core disruptive accident (HCDA) analysis of Liquid Metal Fast Breeder Reactors (LMFBRs) involves solving reactor kinetics equations simultaneously with energy balance equations, an equation of state providing internal pressure, hydrodynamic equations providing material displacement, and reactivity feedback (Doppler and displacement) equations. The VENUS-II computer code developed at ANL is one of the codes that solves such equations and provides results for the time history of power, energy release, pressures and temperature buildups during the disassembly of the reactor core. A study was undertaken to ascertain whether the results of the code VENUS-II are computing-system dependent. (author). 2 refs., 3 tabs

  12. Computer Analysis Of ILO Standard Chest Radiographs Of Pneumoconiosis

    Science.gov (United States)

    Li, C. C.; Shu, David B. C.; Tai, H. T.; Hou, W.; Kunkle, G. A.; Wang, Y.; Hoy, R. J.

    1982-11-01

    This paper presents a study of computer analysis of the 1980 ILO standard chest radiographs of pneumoconiosis. Algorithms developed for detection of individual small rounded and irregular opacities have been tested and evaluated on these standard radiographs. The density, shape, and size distribution of the detected objects in the lung field, in spite of false positives, can be used as indicators of the onset of pneumoconiosis. This approach is potentially useful in computer-assisted screening and early detection, where the annual chest radiograph of each worker is compared with his or her own normal radiograph obtained previously.

  13. Computation system for nuclear reactor core analysis. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  14. Supra-canonical ²⁶Al/²⁷Al and the residence time of CAIs in the solar protoplanetary disk.

    Science.gov (United States)

    Young, Edward D; Simon, Justin I; Galy, Albert; Russell, Sara S; Tonui, Eric; Lovera, Oscar

    2005-04-01

    The canonical initial ²⁶Al/²⁷Al ratio of 4.5 × 10⁻⁵ has been a fiducial marker for the beginning of the solar system. Laser ablation and whole-rock multiple-collector inductively coupled plasma-source mass spectrometry magnesium isotope analyses of calcium- and aluminum-rich inclusions (CAIs) from CV3 meteorites demonstrate that some CAIs had initial ²⁶Al/²⁷Al values at least 25% greater than canonical and that the canonical initial ²⁶Al/²⁷Al cannot mark the beginning of solar system formation. Using rates of Mg diffusion in minerals, we find that the canonical initial ²⁶Al/²⁷Al is instead the culmination of thousands of brief high-temperature events incurred by CAIs during a 10⁵-year residence time in the solar protoplanetary disk. PMID:15746387
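
The decay arithmetic behind the abstract's residence-time argument can be checked directly: with a ²⁶Al half-life of about 0.717 Myr (a standard literature value, not stated in the record), a supra-canonical ratio 25% above canonical decays to the canonical value in roughly 2 × 10⁵ years, consistent with the quoted 10⁵-year residence time.

```python
# Time for a supra-canonical 26Al/27Al ratio (25% above canonical) to decay
# down to the canonical value, using the 26Al half-life of ~0.717 Myr.
import math

half_life_myr = 0.717
lam = math.log(2) / half_life_myr        # decay constant (1/Myr)

canonical = 4.5e-5
supra = 1.25 * canonical                 # "at least 25% greater than canonical"

dt_myr = math.log(supra / canonical) / lam
print(f"decay interval ~= {dt_myr * 1e6:.0f} years")  # about 2.3e5 yr
```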

  15. A braça da rede, uma técnica caiçara de medir

    OpenAIRE

    Gilberto Chieus Jr.

    2009-01-01

    This article describes how the caiçaras of the city of Ubatuba, on the northern coast of the state of São Paulo, measure their fishing nets. Before analyzing their measuring technique, we give a brief overview of caiçara culture and its transformations. Next we present some historical moments in the construction of the metre. We then show how the caiçaras measure their nets, the problem that occurred in Brazil during the implantation of the decimal metric system, and the resistance of certain civilizations that use other...

  16. [Demotion and promotion of CAI Jing and the Medicine School's establishment and abolition three times in the North Song dynasty].

    Science.gov (United States)

    Li, Yu-Qing

    2011-03-01

    CAI Jing was appointed prime minister in the Chongning period of Song Hui-tsung. The Medical School was moved to the Imperial College from Taichang Temple. It used a 3-year education system and divided graduates into three grades. Preferential policies promised top students official positions of the 8th or 9th rank, which attracted many intellectuals into the field of medicine. CAI Jing was demoted three times, in the fifth year of the Chongning period, the third year of the Daguan period, and the second year of the Zhenghe period, and was promoted again after each demotion. Influenced by the changes in CAI Jing's position and the related policies, the Medical School was likewise established and abolished three times. PMID:21624269

  17. Characterizing Pyroxene Reaction Space in Calcium-Aluminum Rich Inclusions: Oxidation During CAI Rim Formation

    Science.gov (United States)

    Dyl, K. A.; Young, E. D.

    2009-12-01

    We define the reaction space that controls changes in pyroxene composition in CAIs and Wark-Lovering (WL) rims in an oxidizing solar nebula. Ti-rich pyroxenes in CAIs record a sub-solar oxygen fugacity (Ti3+/Ti4+ ~ 1.5). WL rim pyroxenes in the CAI Leoville 144A have a distinctly lower oxidation state. This difference supports WL rim condensation in an environment of increasing O2(g) and Mg(g) (Simon et al. 2005). We used the following phase components to identify four linearly independent reactions (Thompson 1982): diopside, CaTs (Al2Mg-1Si-1), T3 (Ti3+AlMg-1Si-1), T4 (Ti4+Al2Mg-1Si-2), En (MgCa-1), perovskite, O(g), Mg(g), SiO(g), and Ca(g). Compositional variation in this system is dominated by two reactions. The first is oxidation of Ti3+ via reaction with O and Mg in the gas phase: 1.5 O(g) + Mg(g) → 1/4 Di + [Ti4+Mg3/4Ti3+-1Ca-1/4Si-1/2] (1). Pyroxene is produced and En is introduced. The second reaction (2) is perovskite formation. It is observed in the WL rim of Leoville 144A, and experiments confirm that an elevated Ti component converts pyroxene to perovskite (Gupta et al. 1973). MgCa-1 is the third linearly independent reaction (3). They combine to give: 1/2 Di + x Ca(g) → x Mg(g) + Pv + [Mg1/2-xSiTi4+-1Ca-1/2+x] (2,3). Unlike (1), pyroxene is consumed in this reaction. The parameter x defines the extent of Mg-Ca exchange. When x > 0.5, WL rim formation occurs in an environment where Mg is volatile and Ca condenses. The reaction space defined by reactions (1) and (2,3) describes the transition from CAI interior to WL rims. WL rim pyroxene Ti contents, [CaTs], and Ca < 1 pfu are all explained in this space. The fourth linearly independent reaction involves SiO(g): 1/8 Di + 1/4 Mg(g) → 3/4 SiO(g) + [Mg3/8Ca1/8Ti4+Ti3+-1Si-1/2] (4). Silica reduction forms Ti4+, releasing SiO(g). Reaction (4) does not describe the oxidation of Ti3+ in WL rim pyroxene, but the combination (1) - (4) results in En formation directly from the gas phase. This may explain WL rim analyses that have Si contents in excess

  18. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from: P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  19. Ca-Fe and Alkali-Halide Alteration of an Allende Type B CAI: Aqueous Alteration in Nebular or Asteroidal Settings

    Science.gov (United States)

    Ross, D. K.; Simon, J. I.; Simon, S. B.; Grossman, L.

    2012-01-01

    Ca-Fe and alkali-halide alteration of CAIs is often attributed to aqueous alteration by fluids circulating on asteroidal parent bodies after the various chondritic components have been assembled, although debate continues about the roles of asteroidal vs. nebular modification processes [1-7]. Here we report detailed observations of alteration products in a large Type B2 CAI, TS4 from Allende, one of the oxidized subgroup of CV3s, and propose a speculative model for aqueous alteration of CAIs in a nebular setting. Ca-Fe alteration in this CAI consists predominantly of end-member hedenbergite, end-member andradite, and compositionally variable, magnesian high-Ca pyroxene. These phases are strongly concentrated in an unusual "nodule" enclosed within the interior of the CAI (Fig. 1). The Ca, Fe-rich nodule superficially resembles a clast that pre-dated and was engulfed by the CAI, but closer inspection shows that relic spinel grains are enclosed in the nodule, and corroded CAI primary phases interfinger with the Fe-rich phases at the nodule's margins. This CAI also contains abundant sodalite and nepheline (alkali-halide) alteration that occurs around the rims of the CAI, but also penetrates more deeply into the CAI. The two types of alteration (Ca-Fe and alkali-halide) are adjacent, and very fine-grained Fe-rich phases are associated with sodalite-rich regions. Both types of alteration appear to be replacive; if that is true, it would require substantial introduction of Fe, transport of elements (Ti, Al and Mg) out of the nodule, and introduction of Na and Cl into alkali-halide rich zones. Parts of the CAI have been extensively metasomatized.

  20. Practical Use of Computationally Frugal Model Analysis Methods.

    Science.gov (United States)

    Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2016-03-01

    Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics make it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333

  1. Brain-Computer Interface: Common Tensor Discriminant Analysis Classifier Evaluation

    Czech Academy of Sciences Publication Activity Database

    Frolov, A.; Húsek, Dušan; Bobrov, P.

    Piscataway: IEEE, 2011 - (Abraham, A.; Corchado, E.; Berwick, R.; de Carvalho, A.; Zomaya, A.; Yager, R.), s. 614-620 ISBN 978-1-4577-1122-0. [NaBIC 2011. World Congress on Nature and Biologically Inspired Computing /3./. Salamanca (ES), 19.10.2011-21.10.2011] R&D Projects: GA ČR GAP202/10/0262; GA ČR GA205/09/1079; GA MŠk(CZ) 1M0567 Grant ostatní: GA MŠk(CZ) ED1.1.00/02.0070 Institutional research plan: CEZ:AV0Z10300504 Keywords : human computer interface * motor imagery * EEG signal classification * Bayesian classification * Common Spatial Patterns * Common Tensor Discriminant Analysis Subject RIV: IN - Informatics, Computer Science

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  3. Calcium and Titanium Isotope Fractionation in CAIs: Tracers of Condensation and Inheritance in the Early Solar Protoplanetary Disk

    Science.gov (United States)

    Simon, J. I.; Jordan, M. K.; Tappa, M. J.; Kohl, I. E.; Young, E. D.

    2016-01-01

    The chemical and isotopic compositions of calcium-aluminum-rich inclusions (CAIs) can be used to understand the conditions present in the protoplanetary disk where they formed. The isotopic compositions of these early-formed nebular materials are largely controlled by chemical volatility. The isotopic effects of evaporation/sublimation, which are well explained by both theory and experimental work, lead to enrichments of the heavy isotopes that are often exhibited by the moderately refractory elements Mg and Si. Less well understood are the isotopic effects of condensation, which limits our ability to determine whether a CAI is a primary condensate and/or retains any evidence of its primordial formation history.

  4. Emerging Trends and Statistical Analysis in Computational Modeling in Agriculture

    Directory of Open Access Journals (Sweden)

    Sunil Kumar

    2015-03-01

    Full Text Available In this paper the authors describe emerging trends in computational modelling used in the sphere of agriculture. Agricultural computational modelling, which uses intelligence techniques to compute agricultural output from minimal input data, is gaining momentum because it saves time by cutting down multi-locational field trials and reduces labour and other inputs. Development of locally suitable integrated farming systems (IFS) is the utmost need of the day, particularly in India, where about 95% of farms are of small or marginal holding size. Optimizing the size and number of the various enterprises in an IFS model for a particular agro-climate is an essential component of research to sustain agricultural productivity, not only to feed the burgeoning population of the country but also to enhance nutritional security, farm returns, and quality of life. A review of the literature on emerging trends in computational modelling applied to agriculture is presented below to clarify its mechanisms, behaviour, and applications. Computational modelling is increasingly effective for the design and analysis of systems, and it is an important tool for analysing the effects of different climate and management scenarios on farming systems and the interactions among them. The authors also highlight applications of computational modelling in integrated farming systems, crops, weather, soil, climate, horticulture, and the statistical methods used in agriculture, which can show agricultural researchers and the rural farming community a path toward replacing some traditional techniques.

  5. Computational methods for viscoplastic dynamic fracture mechanics analysis

    International Nuclear Information System (INIS)

    The role of nonlinear rate-dependent effects in the interpretation of crack run-arrest events in ductile materials is being investigated by the Heavy-Section Steel Technology (HSST) program through development and applications of viscoplastic-dynamic finite element analysis techniques. This paper describes a portion of these studies wherein various viscoplastic constitutive models and several proposed nonlinear fracture criteria are being installed in general purpose (ADINA) and special purpose (VISCRK) finite element computer programs. The constitutive models implemented in these computer programs include the Bodner-Partom and the Perzyna viscoplastic formulations; the proposed fracture criteria include three parameters that are based on energy principles. The predictive capabilities of the nonlinear techniques are evaluated through applications to a series of HSST wide-plate crack-arrest tests. To assess the impact of including viscoplastic effects in the computational models, values of fracture parameters calculated in elastodynamic and viscoplastic-dynamic analyses are compared for a large wide-plate test. Finally, plans are reviewed for additional computational and experimental studies to assess the utility of viscoplastic analysis techniques in constructing a dynamic inelastic fracture mechanics model for ductile steels. 34 refs., 14 figs

  6. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing the modeling uncertainty found in current plant risk models.

  7. Towards Advanced Data Analysis by Combining Soft Computing and Statistics

    CERN Document Server

    Gil, María; Sousa, João; Verleysen, Michel

    2013-01-01

    Soft computing, as an engineering science, and statistics, as a classical branch of mathematics, emphasize different aspects of data analysis. Soft computing focuses on obtaining working solutions quickly, accepting approximations and unconventional approaches. Its strength lies in its flexibility to create models that suit the needs arising in applications. In addition, it emphasizes the need for intuitive and interpretable models, which are tolerant to imprecision and uncertainty. Statistics is more rigorous and focuses on establishing objective conclusions based on experimental data by analyzing the possible situations and their (relative) likelihood. It emphasizes the need for mathematical methods and tools to assess solutions and guarantee performance. Combining the two fields enhances the robustness and generalizability of data analysis methods, while preserving the flexibility to solve real-world problems efficiently and intuitively.

  8. Computational methods for efficient structural reliability and reliability sensitivity analysis

    Science.gov (United States)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
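
The failure probability the AIS method targets can be illustrated with plain, non-adaptive importance sampling: sample from a normal density recentred near the failure region and reweight by the density ratio. The limit state and shift below are illustrative assumptions, not the paper's adaptive scheme.

```python
# Importance sampling of P(X >= 4) for X ~ N(0,1): sample from N(4,1) and
# weight each sample by f(y)/h(y); crude Monte Carlo would need ~10^7 samples.
import numpy as np

rng = np.random.default_rng(1)
shift, n = 4.0, 20_000
y = rng.normal(shift, 1.0, n)                # samples from the importance density h

log_w = -0.5 * y**2 + 0.5 * (y - shift)**2   # log of f(y)/h(y)
indicator = y >= 4.0                         # failure domain
estimate = np.mean(indicator * np.exp(log_w))

print(f"P_f ~= {estimate:.3e} (exact value 3.167e-05)")
```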

  9. Wind Energy Conversion System Analysis Model (WECSAM) computer program documentation

    Science.gov (United States)

    Downey, W. T.; Hendrick, P. L.

    1982-07-01

    Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation.

  10. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  11. Reactor safety analysis computer program features that enhance user productivity

    International Nuclear Information System (INIS)

    This paper describes several design features of the MARY computer program that increase user productivity. The MARY program was used to analyze behavior of the Savannah River Site (SRS) K Reactor during postulated nuclear and thermal-hydraulic transients, such as overpower and underflow events, before K Reactor was placed in cold standby in 1993. These analyses provide the bases for portions of the accident chapter of the K-Reactor Safety Analysis Report

  12. MIRAGE, a Computable General Equilibrium Model for Trade Policy Analysis

    OpenAIRE

    Jean, Sébastien; Guérin, Jean-Louis; DECREUX, Yvan; Bchir, Mohamed Hedi

    2002-01-01

    MIRAGE is a multi-region, multi-sector computable general equilibrium model, devoted to trade policy analysis. It incorporates imperfect competition, product differentiation by variety and by quality, and foreign direct investment, in a sequential dynamic set-up where installed capital is assumed to be immobile. Adjustment inertia is linked to capital stock reallocation and to market structure changes. MIRAGE draws upon a very detailed measure of trade barriers and of their evolution under gi...

  13. Computers in activation analysis and gamma-ray spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Carpenter, B. S.; D'Agostino, M. D.; Yule, H. P. [eds.]

    1979-01-01

    Seventy-three papers are included under the following session headings: analytical and mathematical methods for data analysis; software systems for gamma-ray and x-ray spectrometry; gamma-ray spectra treatment, peak evaluation; least squares; IAEA intercomparison of methods for processing spectra; computer and calculator utilization in spectrometer systems; and applications in safeguards, fuel scanning, and environmental monitoring. Separate abstracts were prepared for 72 of those papers. (DLC)

  14. CAVASS: A Computer-Assisted Visualization and Analysis Software System

    OpenAIRE

    Grevera, George; Udupa, Jayaram; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Iwanaga, Tad; Mishra, Shipra

    2007-01-01

    The Medical Image Processing Group at the University of Pennsylvania has been developing (and distributing with source code) medical image analysis and visualization software systems for a long period of time. Our most recent system, 3DVIEWNIX, was first released in 1993. Since that time, a number of significant advancements have taken place with regard to computer platforms and operating systems, networking capability, the rise of parallel processing standards, and the development of open-so...

  15. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    OpenAIRE

    Pavithra Sampath; Gang Qian; Ellen Campana; Todd Ingalls; Jodi James; Stjepan Rajko; Jessica Mumford; Harvey Thornburg; Dilip Swaminathan; Bo Peng

    2009-01-01

    Laban movement analysis (LMA) is a systematic framework for describing all forms of human movement and has been widely applied across animation, biomedicine, dance, and kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world...

  16. Ontology-based metrics computation for business process analysis

    OpenAIRE

    Carlos Pedrinaci; John Domingue

    2009-01-01

    Business Process Management (BPM) aims to support the whole life-cycle necessary to deploy and maintain business processes in organisations. Crucial within the BPM life-cycle is the analysis of deployed processes. Analysing business processes requires computing metrics that can help determine the health of business activities and thus of the whole enterprise. However, the degree of automation currently achieved cannot support the level of reactivity and adaptation demanded by businesses. In thi...

  17. Computational Methods for Failure Analysis and Life Prediction

    Science.gov (United States)

    Noor, Ahmed K. (Compiler); Harris, Charles E. (Compiler); Housner, Jerrold M. (Compiler); Hopkins, Dale A. (Compiler)

    1993-01-01

    This conference publication contains the presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Failure Analysis and Life Prediction held at NASA Langley Research Center 14-15 Oct. 1992. The presentations focused on damage failure and life predictions of polymer-matrix composite structures. They covered some of the research activities at NASA Langley, NASA Lewis, Southwest Research Institute, industry, and universities. Both airframes and propulsion systems were considered.

  18. Computer programs for the analysis of DTA and TG curves

    International Nuclear Information System (INIS)

    Two computer programs suitable for obtaining kinetic parameters from differential temperature vs. temperature (differential thermal analysis) as well as weight change vs. temperature (thermogravimetric) data have been written in FORTRAN. The programs allow the calculation of kinetic parameters by several methods, using a least-squares fit of the data to a straight line. Criteria introduced to smooth the numerically calculated differential thermogravimetric (DTG) curves improve the results obtained by the method of Freeman and Carroll. (auth.)
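
    The programs themselves are not reproduced in this record. As a hedged illustration of the straight-line least-squares approach described above, the sketch below fits an Arrhenius-type rate law to synthetic first-order TG data; the rate law, temperatures and all constants are illustrative assumptions, not values from the paper.

    ```python
    # Hedged sketch: recover an activation energy E and pre-exponential A by a
    # least-squares straight-line fit, in the spirit of the methods described.
    # Synthetic first-order TG data; all constants are illustrative assumptions.
    import numpy as np

    R = 8.314                                    # gas constant, J/(mol K)
    E_true, A_true = 120e3, 1e10                 # assumed "unknowns"
    T = np.linspace(500.0, 700.0, 50)            # temperatures, K
    alpha = np.linspace(0.05, 0.60, 50)          # conversions (synthetic)
    rate = A_true * np.exp(-E_true / (R * T)) * (1.0 - alpha)  # da/dt, 1st order

    # Linearize: ln(rate / (1 - alpha)) = ln A - (E/R) * (1/T); fit a line.
    y = np.log(rate / (1.0 - alpha))
    slope, intercept = np.polyfit(1.0 / T, y, 1)
    print(f"E = {-slope * R / 1e3:.1f} kJ/mol, A = {np.exp(intercept):.2e} 1/s")
    ```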

  19. Modeling and analysis of the spread of computer virus

    Science.gov (United States)

    Zhu, Qingyi; Yang, Xiaofan; Ren, Jianguo

    2012-12-01

    Based on a set of reasonable assumptions, we propose a novel dynamical model describing the spread of computer virus. Through qualitative analysis, we give a threshold and prove that (1) the infection-free equilibrium is globally asymptotically stable if the threshold is less than one, implying that the virus would eventually die out, and (2) the infection equilibrium is globally asymptotically stable if the threshold is greater than one. Two numerical examples are presented to demonstrate the analytical results.
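
    The paper's exact equations are not given in this record. The sketch below integrates a generic SIS-style compartment model, an assumption standing in for the proposed model, and reproduces the threshold behaviour described: the infection dies out when the threshold (here R0 = beta/gamma) is below one and settles at an endemic level when it exceeds one.

    ```python
    # Hedged SIS-style sketch of computer-virus spread with a threshold
    # R0 = beta/gamma: infection dies out for R0 < 1 and approaches the
    # endemic level 1 - 1/R0 for R0 > 1. All values are illustrative.

    def infected_fraction(beta, gamma, i0=0.01, dt=0.01, steps=200_000):
        i = i0                                   # infected fraction
        for _ in range(steps):
            i += (beta * (1.0 - i) * i - gamma * i) * dt
        return i

    for beta in (0.1, 0.5):                      # recovery rate gamma = 0.25
        r0 = beta / 0.25
        print(f"R0={r0:.1f}: long-run infected fraction = "
              f"{infected_fraction(beta, 0.25):.4f}")
    ```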

  20. Variance analysis. Part II, The use of computers.

    Science.gov (United States)

    Finkler, S A

    1991-09-01

    This is the second in a two-part series on variance analysis. In the first article (JONA, July/August 1991), the author discussed flexible budgeting, including the calculation of price, quantity, volume, and acuity variances. In this second article, the author focuses on the use of computers by nurse managers to aid in the process of calculating, understanding, and justifying variances. PMID:1919788
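
    As a concrete companion to the flexible-budget variances discussed in the series, the sketch below computes the standard price and quantity decomposition; the formulas are the textbook definitions and the numbers are illustrative assumptions, not figures from the article.

    ```python
    # Hedged sketch of the standard flexible-budget variance decomposition:
    # total variance = price component + quantity component. The numbers are
    # illustrative assumptions, not data from the article.
    def variances(actual_price, budget_price, actual_qty, budget_qty):
        price_var = (actual_price - budget_price) * actual_qty
        qty_var = (actual_qty - budget_qty) * budget_price
        total_var = actual_price * actual_qty - budget_price * budget_qty
        return price_var, qty_var, total_var

    p_var, q_var, total = variances(actual_price=26.0, budget_price=25.0,
                                    actual_qty=1100.0, budget_qty=1000.0)
    print(f"price {p_var:+.0f}, quantity {q_var:+.0f}, total {total:+.0f}")
    ```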

  1. Computer-Aided Design Software for Torsional Analysis

    OpenAIRE

    Griffin, Timothy R.

    1998-01-01

    The goal of this research has been the development of an effective design tool for torsional analysis. In the hopes of achieving this goal the computer program, Torsion 1, has been created. This torsional transfer matrix program provides the user with the ability to easily model multi-rotor systems using a simple user-interface. The program is capable of modeling such components or system characteristics as continuously distributed mass, viscous and structural damping, vibratio...
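
    Torsion 1 itself is not listed in this record. The sketch below illustrates the underlying torsional transfer-matrix idea: propagate the state vector (twist angle, torque) through disk and shaft matrices and sweep frequency for free-free natural frequencies. The three-disk geometry and all properties are assumptions.

    ```python
    # Hedged sketch of the torsional transfer-matrix method: march the state
    # [twist angle, torque] across disks (point matrices) and shafts (field
    # matrices), then scan frequency for a zero boundary residual.
    # All inertias J [kg m^2] and stiffnesses k [N m/rad] are assumptions.
    import numpy as np

    J = [2.0, 1.0, 1.5]              # disk polar inertias
    k = [1.0e5, 2.0e5]               # shaft torsional stiffnesses between disks

    def end_torque(omega):
        state = np.array([1.0, 0.0])             # free end: unit twist, zero torque
        for i, Ji in enumerate(J):
            state = np.array([[1.0, 0.0], [-Ji * omega**2, 1.0]]) @ state  # disk
            if i < len(k):
                state = np.array([[1.0, 1.0 / k[i]], [0.0, 1.0]]) @ state  # shaft
        return state[1]                          # torque at far free end (want 0)

    freqs = np.linspace(1.0, 1000.0, 20000)      # rad/s sweep
    res = np.array([end_torque(w) for w in freqs])
    hits = freqs[:-1][np.sign(res[:-1]) != np.sign(res[1:])]
    print("approximate natural frequencies (rad/s):", np.round(hits, 1))
    ```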

  2. Triclosan Computational Conformational Chemistry Analysis for Antimicrobial Properties in Polymers

    OpenAIRE

    Petersen, Richard C.

    2015-01-01

    Triclosan is a diphenyl ether antimicrobial that has been analyzed by computational conformational chemistry for an understanding of Mechanomolecular Theory. Subsequent energy profile analysis combined with easily seen three-dimensional chemistry structure models for the nonpolar molecule Triclosan show how single bond rotations can alternate rapidly at a polar and nonpolar interface. Bond rotations for the center ether oxygen atom of the two aromatic rings then expose or hi...

  3. wolfPAC: building a high-performance distributed computing network for phylogenetic analysis using 'obsolete' computational resources.

    Science.gov (United States)

    Reeves, Patrick A; Friedman, Philip H; Richards, Christopher M

    2005-01-01

    wolfPAC is an AppleScript-based software package that facilitates the use of numerous, remotely located Macintosh computers to perform computationally-intensive phylogenetic analyses using the popular application PAUP* (Phylogenetic Analysis Using Parsimony). It has been designed to utilise readily available, inexpensive processors and to encourage sharing of computational resources within the worldwide phylogenetics community. PMID:16000014

  4. Spacelab data analysis using the space plasma computer analysis network (SCAN) system

    Science.gov (United States)

    Green, J. L.

    1984-01-01

    The Space-plasma Computer Analysis Network (SCAN) currently connects a large number of U.S. Spacelab investigators into a common computer network. Used primarily by plasma physics researchers at present, SCAN provides Spacelab investigators in other areas of space science with access to Spacelab and non-Spacelab correlative data bases and to large Class VI computational facilities for modeling. SCAN links computers together at remote institutions used by space researchers, utilizing commercially available software for computer-to-computer communications. Started by NASA's Office of Space Science in mid-1980, SCAN presently contains ten system nodes located at major universities and space research laboratories, with fourteen new nodes projected for the near future. The Stanford University computer gateways allow SCAN users to connect onto the ARPANET and TELENET overseas networks.

  5. Perfecting of a computer program for PIXE quantitative analysis

    International Nuclear Information System (INIS)

    The PIXE technique is used to measure element abundances in geological and archaeological samples, in medicine... In this thesis, the author recalls the theoretical bases of this analysis method, presents the calculation method he used for element ratios in thick samples, describes the computer programs (XMONO and PIXCO), and then verifies experimentally the results obtained with the PIXCO program. The comparative studies show that the PIXCO program gives good results for elements with 20 < Z < 40 but ignores secondary X-ray emission. The PIXCO program is attractive for the routine analysis of sample series.

  6. Analysis and computation of microstructure in finite plasticity

    CERN Document Server

    Hackl, Klaus

    2015-01-01

    This book addresses the need for a fundamental understanding of the physical origin, the mathematical behavior, and the numerical treatment of models which include microstructure. Leading scientists present their efforts involving mathematical analysis, numerical analysis, computational mechanics, material modelling and experiment. The mathematical analyses are based on methods from the calculus of variations, while in the numerical implementation global optimization algorithms play a central role. The modeling covers all length scales, from the atomic structure up to macroscopic samples. The development of the models was guided by experiments on single crystals and polycrystals, and the results are checked against experimental data.

  7. Verification and uncertainty analysis of fuel codes using distributed computing

    International Nuclear Information System (INIS)

    Of late, nuclear safety analysis computer codes have been held to increasingly high standards of quality assurance. As well, best estimate with uncertainty analysis is taking a more prominent role, displacing to some extent the idea of a limit consequence analysis. In turn, these activities have placed ever-increasing burdens on available computing resources. A recent project at Ontario Hydro has been the development of the capability of using the workstations on our Windows NT LAN as a distributed batch queue. The application developed is called SheepDog. This paper reports on the challenges and opportunities met in this project, as well as the experience gained in applying this method to verification and uncertainty analysis of fuel codes. SheepDog has been applied to performing uncertainty analysis, in a basically CSAU-like method, of fuel behaviour during postulated accident scenarios at a nuclear power station. For each scenario, several hundred cases were selected according to a Latin Hypercube scheme and used to construct a response surface surrogate for the codes. Residual disparities between code predictions and response surfaces led to the suspicion that there were discontinuities in the predictions of the analysis codes. This led to the development of 'stress testing' procedures, which refers to two procedures: coarsely scanning through several input parameters in combination, and finely scanning individual input parameters. For either procedure, the number of code runs required is several hundred. In order to be able to perform stress testing in a reasonable time, SheepDog was applied. The results are examined for such considerations as continuity, smoothness, and physical reasonableness of trends and interactions. In several cases, this analysis uncovered previously unknown errors in analysis codes and allowed pinpointing the part of the codes that needed to be modified. The challenges involved include the following: the usual choices of development
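
    SheepDog and the fuel codes are not shown here. As a hedged sketch of the Latin Hypercube case selection the uncertainty analysis relies on, the code below draws stratified samples in which each of the n_cases strata of every input parameter is used exactly once; the dimension and case count are illustrative assumptions.

    ```python
    # Hedged sketch of Latin Hypercube case selection: every input parameter's
    # range is split into n_cases equal-probability strata, and each stratum is
    # sampled exactly once. Dimensions and counts are illustrative assumptions.
    import numpy as np

    def latin_hypercube(n_cases, n_params, rng=np.random.default_rng(0)):
        # one random point inside each stratum, stratum order permuted per column
        strata = np.tile(np.arange(n_cases), (n_params, 1))
        u = (rng.permuted(strata, axis=1).T
             + rng.random((n_cases, n_params))) / n_cases
        return u        # samples in [0,1)^n_params; map through inverse CDFs

    cases = latin_hypercube(n_cases=400, n_params=5)
    print(cases.shape, cases.min(), cases.max())   # (400, 5), values in [0, 1)
    ```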

  8. Trend Analysis of the Brazilian Scientific Production in Computer Science

    Directory of Open Access Journals (Sweden)

    TRUCOLO, C. C.

    2014-12-01

    Full Text Available The growth in the volume and diversity of scientific information brings new challenges in understanding the reasons, the process and the real essence that propel this growth. This information can be used as the basis for developing strategies and public policies to improve education and innovation services. Trend analysis is one of the steps in this direction. In this work, a trend analysis of the Brazilian scientific production of graduate programs in the computer science area is carried out to identify the main subjects studied by these programs, both overall and individually.

  9. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  10. High performance remote sensing data analysis using parallel computation

    Science.gov (United States)

    Patterson, Jean E.; Ferraro, Robert D.; Sparks, Lawrence

    1990-01-01

    This paper examines the JPL/Caltech parallel processing system designed for rapid processing and transfer of large quantities of data from remote sensing instruments flown on NASA missions. Two remote sensing analysis applications that use this processing system are described: (1) an analysis system for retrieval of atmospheric parameters (such as species abundance, atmospheric temperature, and water vapor profiles) from data obtained by a Fourier transform IR spectrometer and (2) a prototype airborne SAR processing system. It is shown that a parallel processing system such as the JPL/Caltech system can offer supercomputer computational capability and high-volume data throughput and still be cost-effective.

  11. Shielding analysis methods available in the scale computational system

    International Nuclear Information System (INIS)

    Computational tools have been included in the SCALE system to allow shielding analysis to be performed using both discrete-ordinates and Monte Carlo techniques. One-dimensional discrete ordinates analyses are performed with the XSDRNPM-S module, and point dose rates outside the shield are calculated with the XSDOSE module. Multidimensional analyses are performed with the MORSE-SGC/S Monte Carlo module. This paper will review the above modules and the four Shielding Analysis Sequences (SAS) developed for the SCALE system. 7 refs., 8 figs

  12. Developing multimedia CAI courseware with Visual Basic: image handling

    Institute of Scientific and Technical Information of China (English)

    曲双为

    2001-01-01

    Techniques for handling images in the development of multimedia computer aided instruction (CAI) courseware are introduced in this paper from a programming perspective. The techniques are based on Microsoft Visual Basic 6.0, a programming tool popular among developers at present.

  13. Computer vision analysis of image motion by variational methods

    CERN Document Server

    Mitiche, Amar

    2014-01-01

    This book presents a unified view of image motion analysis under the variational framework. Variational methods, rooted in physics and mechanics, but appearing in many other domains, such as statistics, control, and computer vision, address a problem from an optimization standpoint, i.e., they formulate it as the optimization of an objective function or functional. The methods of image motion analysis described in this book use the calculus of variations to minimize (or maximize) an objective functional which transcribes all of the constraints that characterize the desired motion variables. The book addresses the four core subjects of motion analysis: motion estimation, detection, tracking, and three-dimensional interpretation. Each topic is covered in a dedicated chapter. The presentation is prefaced by an introductory chapter which discusses the purpose of motion analysis. Further, a chapter is included which gives the basic tools and formulae related to curvature, Euler-Lagrange equations, unconstrained de...

  14. Artificial Intelligence and Computer Assisted Instruction. CITE Report No. 4.

    Science.gov (United States)

    Elsom-Cook, Mark

    The purpose of the paper is to outline some of the major ways in which artificial intelligence research and techniques can affect usage of computers in an educational environment. The role of artificial intelligence is defined, and the difference between Computer Aided Instruction (CAI) and Intelligent Computer Aided Instruction (ICAI) is…

  15. Implementation of Computer Based Education by a Small College.

    Science.gov (United States)

    Stemmer, Paul M., Jr.; And Others

    Both challenges and successes resulted from the implementation of computer-based or computer-assisted instruction (CAI) at Mercy College of Detroit through a program funded with a Comprehensive Assistance to Undergraduate Science Education (CAUSE) grant. Four influences were found most significant in faculty-related challenges: computer phobia,…

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  19. Computer analysis of radiocardiograms of patients with intracardiac shunts

    International Nuclear Information System (INIS)

    In 112 patients catheterized for congenital heart disease, radiocardiographic data have been collected by right atrial injection of 113mIn. Four scintillation detectors were used to monitor the activity changes in the heart, lungs and head. The heart measurements were done with both frontal and left lateral detectors, the latter being constructed for better collimation and efficiency. The CAMAC system and a PDP-9 computer served for data collection, storage, analysis, and display. Data were collected at 10 or 20 points per second and the pulsatile output of heart curves recorded. Since radiocardiograms in patients with shunts are complex, a computer program (PULSE) incorporating a pulsatile heart model was written for radiocardiogram simulation and analysis. The model consists of a series of compartments pulsating alternately. From each compartment a certain fraction of indicator is ejected into the next compartment and added to the indicator retained there. Each cardiac chamber is represented by one compartment, while the pulmonary and systemic circulations are represented by a series of paired compartments. For shunt simulation a certain fraction can be ejected backward or forward along the series. In order to shorten the computation, the model was fitted only to the end-diastolic and end-systolic states of the compartments. In left-to-right haemodynamically significant shunts the heart and lung curves show recirculation peaks. In right-to-left shunts the head curve detects the early arrival of indicator and shows a biphasic character. The addition of a computer for data acquisition and for pulsatile model simulation and analysis makes it possible to estimate shunt flows. Estimates correlate well with the results of standard shunt measurements. (author)
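
    PULSE itself is not reproduced in this record. The sketch below implements the compartment idea the abstract describes: at each beat every compartment ejects a fixed fraction of indicator into the next one, and a shunt routes part of one compartment's output back along the series. The chain length and all fractions are illustrative assumptions.

    ```python
    # Hedged beat-by-beat compartment sketch in the spirit of PULSE: each beat,
    # every compartment ejects a fraction of its indicator into the next one; a
    # left-to-right shunt recirculates part of a "left heart" compartment's
    # output back to the right side. All fractions are illustrative assumptions.
    import numpy as np

    N = 10                       # chain standing in for right heart .. systemic
    eject = 0.6                  # fraction ejected per beat
    shunt = 0.3                  # left-to-right shunt fraction (comp. 6 -> 1)

    x = np.zeros(N)
    x[0] = 1.0                   # bolus injected into the right atrium
    curve = []
    for beat in range(60):
        out = eject * x          # amount each compartment ejects this beat
        x = x - out
        x[1:] += out[:-1]        # forward flow along the chain
        x[1] += shunt * out[6]   # shunted flow back to the right side ...
        x[7] -= shunt * out[6]   # ... removed from the downstream path
        curve.append(x[2])       # "lung" activity; shows recirculation peaks
    print(np.round(curve[:20], 3))
    ```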

  20. A second generation finite element computer program for stress analysis

    International Nuclear Information System (INIS)

    A second generation finite element computer program for stress analysis is under development. Incorporated in the computer program are finite elements which satisfy the completeness and continuity requirements for arbitrary order polynomial approximating functions. The distinguishing feature of the new algorithm is that it permits the user to exercise control over both the number of finite elements and the order of approximation over each element. Consequently, it is not necessary to define more finite elements than needed to specify the geometry of a structure. The order of polynomial approximating functions may be chosen either directly or indirectly, by specifying the required level of precision in terms of the quantities of interest. An automated iterative process then seeks the degree of approximation which corresponds to the specified level of precision. An important advantage of the new algorithm is that it substantially increases the computational power of the finite element method. Comparisons with state-of-the-art computer programs indicated significant reductions in the number of finite elements needed and the number of variables employed. The reduction in the number of finite elements was by at least an order of magnitude in all cases. The new finite element stress analysis capability is, of course, applicable to all problems which can be solved by current finite element methods. The potential benefits are the greatest in applications where, due to the presence of steep stress gradients, mechanical fatigue controls design, and in dynamic and non-linear analyses where the number of variables must be kept to a minimum in order to make numerical analysis feasible

  1. On native Danish learners' challenges in distinguishing /tai/, /cai/ and /zai/

    DEFF Research Database (Denmark)

    Sloos, Marjoleine; Zhang, Chun

    2015-01-01

    With a growing interest in learning Chinese globally, there is a growing need for phonologists and language instructors to understand how nonnative Chinese learners perceive the Chinese sound inventory. We experimentally investigated Danish (L1) speakers' perception of three Mandarin... Results show that beginner learners perform at chance level on the distinctions between t and z and between c and z. The reason is that in Danish, which (like Chinese) has an aspiration contrast between plosives, /th/ is variably pronounced as affricated /ts/, and many speakers are unaware of this... optional variation. This inhibits the distinction between Chinese t and z (pronounced as /th ts/). Further, Danish has no affricates, which makes distinctions between affricates based on the aspiration contrast (like cai-zai) particularly difficult....

  2. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  3. From Corporate Social Responsibility, through Entrepreneurial Orientation, to Knowledge Sharing: A Study in Cai Luong (Renovated Theatre) Theatre Companies

    Science.gov (United States)

    Tuan, Luu Trong

    2015-01-01

    Purpose: This paper aims to examine the role of antecedents such as corporate social responsibility (CSR) and entrepreneurial orientation in the chain effect to knowledge sharing among members of Cai Luong theatre companies in the Vietnamese context. Knowledge sharing contributes to the depth of the knowledge pool of both the individuals and the…

  4. Hunting and use of terrestrial fauna used by Caiçaras from the Atlantic Forest coast (Brazil)

    Directory of Open Access Journals (Sweden)

    Alves Rômulo RN

    2009-11-01

    Full Text Available Abstract Background The Brazilian Atlantic Forest is considered one of the hotspots for conservation, comprising remnants of rain forest along the eastern Brazilian coast. Its native inhabitants on the Southeastern coast include the Caiçaras (descendants of Amerindians and European colonizers), who hold a deep knowledge of the natural resources used for their livelihood. Methods We studied the use of terrestrial fauna in three Caiçara communities, through open-ended interviews with 116 native residents. Data were checked through systematic observations and collection of zoological material. Results The Caiçaras depend on terrestrial fauna especially for food and medicine. The main species used are Didelphis spp., Dasyprocta azarae, Dasypus novemcinctus, and small birds (several species of Turdidae). In contrast to the high dependency of native Amazonians on terrestrial fauna resources, the Caiçaras do not show a constant dependency on these resources. Nevertheless, the occasional hunting of native animals represents a complementary source of animal protein. Conclusion Indigenous or local knowledge of native resources is important in order to promote local development in a sustainable way, and can help to conserve biodiversity, particularly if the resource is sporadically used and not commercially exploited.

  5. Dietary Changes over Time in a Caiçara Community from the Brazilian Atlantic Forest

    Directory of Open Access Journals (Sweden)

    Priscila L. MacCord

    2006-12-01

    Full Text Available Because they are occurring at an accelerated pace, changes in the livelihoods of local coastal communities, including nutritional aspects, have been a subject of interest in human ecology. The aim of this study is to explore the dietary changes, particularly in the consumption of animal protein, that have taken place in Puruba Beach, a rural community of caiçaras on the São Paulo Coast, Brazil, over the 10-yr period from 1992–1993 to 2002–2003. Data were collected during six months in 1992–1993 and during the same months in 2002–2003 using the 24-hr recall method. We found an increasing dependence on external products in the most recent period, along with a reduction in fish consumption and in the number of fish species eaten. These changes, possibly associated with other nonmeasured factors such as overfishing and unplanned tourism, may cause food delocalization and a reduction in the use of natural resources. Although the consequences for conservation efforts in the Atlantic Forest and the survival of the caiçaras must still be evaluated, these local inhabitants may be finding a way to reconcile both the old and the new dietary patterns by keeping their houses in the community while looking for sources of income other than natural resources. The prospect shown here may reveal facets that can influence the maintenance of this and other communities undergoing similar processes by, for example, shedding some light on the ecological and economic processes that may occur within their environment and in turn affect the conservation of the resources upon which the local inhabitants depend.

  6. ENIGMA, CAI-CMI for Introductory Logic: Some of Its Abilities with English Sentences.

    Science.gov (United States)

    Laymon, Ronald

    The philosophy department of the Ohio State University began development of a computer-tutorial program, called ENIGMA, in 1972. The aim of the course was to help students to use various logical tools in the analysis of everyday arguments by giving drill-and-practice sessions, testing, and grading examinations. Part of ENIGMA is the propositional…

  7. PWRDYN: a computer code for PWR plant dynamic analysis

    International Nuclear Information System (INIS)

    This report describes the analytical models and calculated results of the PWR plant dynamic analysis code PWRDYN. The code has been developed to analyze and evaluate transient responses to small disturbances, such as operating mode changes, and to analyze control system characteristics. The features included in PWRDYN are: 1) the primary loops are approximated by a single loop; 2) the primary coolant is always subcooled; 3) a one-dimensional model is used for the secondary side of the steam generator, and natural circulation is calculated assuming a constant positive driving head; 4) the main control systems are incorporated. For transient responses caused by small perturbations, the results calculated by PWRDYN are in good agreement with RETRAN calculations. Furthermore, the computing time is very short, about one-seventh of real time, so the code is convenient and useful for the dynamic analysis of PWR plants. (author)

  8. COMPUTER SIMULATION: COMPARATIVE ANALYSIS OF SOFTWARES ARENA® AND PROMODEL®

    Directory of Open Access Journals (Sweden)

    Luiz Enéias Zanetti Cardoso

    2016-04-01

    Full Text Available Computer simulation is not exclusive to the areas of Logistics and Production; its implementation takes place within the limits of the technical expertise of the professionals involved. Although not widespread at present, its use is projected to grow, given the numerous application possibilities when reality is properly modeled. This article presents a comparative, qualitative analysis of two computer simulation packages, the Arena® 14,000 Student version and the ProModel® RunTimeSilver Demo version, according to the following criteria: desktop, access to commands, and ease of developing the model and accessories in the software. The main features of each simulation package are shown, as well as the differences between their interfaces; both were confirmed as useful tools to support management processes.

  9. Computer code for general analysis of radon risks (GARR)

    International Nuclear Information System (INIS)

    This document presents a computer model for general analysis of radon risks that allows the user to specify a large number of possible models with a small number of simple commands. The model is written in a version of BASIC which conforms closely to the American National Standards Institute (ANSI) definition for minimal BASIC and thus is readily modified for use on a wide variety of computers and, in particular, microcomputers. Model capabilities include generation of single-year life tables from 5-year abridged data; calculation of multiple-decrement life tables for lung cancer for the general population, smokers, and nonsmokers; and a cohort lung cancer risk calculation that allows specification of the level and duration of radon exposure, the form of the risk model, and the specific population assumed at risk. 36 references, 8 figures, 7 tables

  10. The Radiological Safety Analysis Computer Program (RSAC-5) user's manual

    International Nuclear Information System (INIS)

    The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods

  11. SCRIMP. A thermal-hydraulic subchannel analysis computer code

    International Nuclear Information System (INIS)

    SCRIMP is a FORTRAN IV computer code for calculating pressure drop, flow rates, heat transfer rates and temperatures in heat exchangers, such as the fuel elements of typical gas cooled nuclear reactors, under steady-state conditions. The subchannel analysis computer code SCRIMP is an improved version of the SCEPTIC program. The most important modification is the introduction of a new subroutine, FASTCAL, for the friction factor and heat transfer coefficient calculations. The different boundary conditions of the subchannels, such as geometry changes, surface quality, heat flux variation and unheated walls, are considered in each particular case by using this subroutine. Due to its great flexibility, particularly with respect to geometrical arrangement, and its relatively short calculation time, SCRIMP is a very useful tool for analyzing a variety of thermohydraulic problems. (Auth.)

  12. A program to validate computer codes for container impact analysis

    International Nuclear Information System (INIS)

    The detailed analysis of containers during impacts to assess either margins to failure or the consequences of different design strategies, requires the use of sophisticated computer codes to model the interactions of the various structural components. The combination of plastic deformation, impact and sliding at interfaces and dynamic loading effects provides a severe test of both the skill of the analyst and the robustness of the computer codes. A program of experiments has been under way at Winfrith since 1987 using extensively instrumented models to provide data for the validation of such codes. Three finite element codes, DYNA3D, HONDO-II and ABAQUS, were selected as suitable tools to cover the range of conditions expected in typical impacts. The impact orientation, velocity and instrumentation locations for the experiments are specified by pre-test calculations using these codes. Post-test analyses using the actual impact orientation and velocities are carried out as necessary if significant discrepancies are found

  13. Computer code for thermal hydraulic analysis of light water reactors

    International Nuclear Information System (INIS)

    A computer programme (THAL) has been developed to perform thermal hydraulic analysis of a single channel in a light water moderated core. In this code the hydrodynamic and thermodynamic equations describing one-dimensional axial flow have been discretized and solved explicitly stepwise up the coolant channel for an arbitrary power profile. THAL has been developed for use on small computers and it is capable of predicting the coolant, clad and fuel temperature profiles, steam quality, void fraction, pressure drop, critical heat flux and DNB ratio throughout the core. A boiling water reactor and a pressurized water reactor have been analyzed as test cases. The results obtained through the use of THAL compare favourably with those given in the design reports of these reactor systems. (author)
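
    THAL's equations are not given in this record. As a minimal sketch of the explicit stepwise axial solution described, the code below marches the coolant temperature up a single heated channel for an assumed cosine power profile; all geometry and property values are assumptions.

    ```python
    # Hedged sketch of an explicit axial march up a single coolant channel, in
    # the spirit of THAL: energy balance dT/dz = q'(z) / (m_dot * cp) with an
    # assumed cosine axial power shape. All values are illustrative assumptions.
    import numpy as np

    H = 3.6                  # heated length, m
    m_dot = 0.3              # channel mass flow, kg/s
    cp = 5.5e3               # coolant specific heat, J/(kg K)
    q0 = 4.0e4               # peak linear power, W/m
    T_in = 560.0             # inlet temperature, K

    nz = 200
    z = np.linspace(0.0, H, nz + 1)
    T = np.empty(nz + 1)
    T[0] = T_in
    for i in range(nz):
        zm = 0.5 * (z[i] + z[i + 1])                    # midpoint of the step
        q_lin = q0 * np.cos(np.pi * (zm - H / 2) / H)   # cosine power shape
        T[i + 1] = T[i] + q_lin * (z[i + 1] - z[i]) / (m_dot * cp)

    print(f"outlet temperature: {T[-1]:.1f} K (rise {T[-1] - T_in:.1f} K)")
    ```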

  14. Computational analysis of microRNA function in heart development

    Institute of Scientific and Technical Information of China (English)

    Ganqiang Liu; Min Ding; Jiajia Chen; Jinyan Huang; Haiyun Wang; Qing Jing; Bairong Shen

    2010-01-01

    Emerging evidence suggests that specific spatio-temporal microRNA (miRNA) expression is required for heart development. In recent years, hundreds of miRNAs have been discovered. In contrast, functional annotations are available only for a very small fraction of these regulatory molecules. In order to provide a global perspective for the biologists who study the relationship between differentially expressed miRNAs and heart development, we employed computational analysis to uncover the specific cellular processes and biological pathways targeted by miRNAs in mouse heart development. Here, we utilized Gene Ontology (GO) categories, KEGG Pathway, and GeneGo Pathway Maps as a gene functional annotation system for miRNA target enrichment analysis. The target genes of miRNAs were found to be enriched in functional categories and pathway maps in which miRNAs could play important roles during heart development. Meanwhile, we developed miRHrt (http://sysbio.suda.edu.cn/mirhrt/), a database aiming to provide a comprehensive resource of miRNA function in regulating heart development. These computational analysis results effectively illustrated the correlation of differentially expressed miRNAs with cellular functions and heart development. We hope that the identified novel heart development-associated pathways and the database presented here will facilitate further understanding of the roles and mechanisms of miRNAs in heart development.

  15. Analysis and computational dissection of molecular signature multiplicity.

    Directory of Open Access Journals (Sweden)

    Alexander Statnikov

    2010-05-01

    Full Text Available Molecular signatures are computational or mathematical models created to diagnose disease and other phenotypes and to predict clinical outcomes and response to treatment. It is widely recognized that molecular signatures constitute one of the most important translational and basic science developments enabled by recent high-throughput molecular assays. A perplexing phenomenon that characterizes high-throughput data analysis is the ubiquitous multiplicity of molecular signatures. Multiplicity is a special form of data analysis instability in which different analysis methods used on the same data, or different samples from the same population lead to different but apparently maximally predictive signatures. This phenomenon has far-reaching implications for biological discovery and development of next generation patient diagnostics and personalized treatments. Currently the causes and interpretation of signature multiplicity are unknown, and several, often contradictory, conjectures have been made to explain it. We present a formal characterization of signature multiplicity and a new efficient algorithm that offers theoretical guarantees for extracting the set of maximally predictive and non-redundant signatures independent of distribution. The new algorithm identifies exactly the set of optimal signatures in controlled experiments and yields signatures with significantly better predictivity and reproducibility than previous algorithms in human microarray gene expression datasets. Our results shed light on the causes of signature multiplicity, provide computational tools for studying it empirically and introduce a framework for in silico bioequivalence of this important new class of diagnostic and personalized medicine modalities.

  16. Computer-assisted morphometric analysis of renal radiation response

    International Nuclear Information System (INIS)

    A single x-ray dose of 1,200 to 1,600 rads to the mouse kidney is associated with definite morphologic alteration but minimal functional impairment at six months; this progresses to profound structural and functional impairment by one year after irradiation. Subjective morphologic assessment of renal damage at six months correlates well with total radiation dose, fractionation schedule and energy characteristics of the radiation beam but does not provide adequate quantitative numerical data for sophisticated statistical tests of significance or for comparisons of effect variability at given dose levels. This investigation assessed the applicability of computer-assisted morphometric analysis (CAMA) for quantitation of effects and in making statistical comparisons of significance between kidneys subjectively classified as to degree of histologic alterations. Images of renal cortex tubular nuclei from the various histologic grades were digitized, recorded and analyzed with the CAMA system. Results indicate that the reliability of specific grade assignment by CAMA for individual nuclei was inadequate but that separation of irradiated and unirradiated renal tissue (bivariate group means) was quite distinct and of high reliability. Differences were present among the four irradiated histologic grades, but they were not marked, especially among the three highest grades. More accurate quantitation of nuclear size variations was achieved, and chromatin textural differences were detected that were not apparent to the eye. Computer-assisted morphometric analysis appears to have a valuable application in the quantification and analysis of chronic radiation effects

  17. The computer aided education and training system for accident management

    International Nuclear Information System (INIS)

    An education and training system for accident management was developed by the Japanese BWR group and Hitachi Ltd. The system is composed of two parts: a computer aided instruction (CAI) education system and an education and training system with computer simulations. Both are designed to run on personal computers. The outlines of the CAI education system and of the simulator-based education and training system are reported below. These systems provide plant operators and technical support center staff with effective education and training for accident management. (author)

  18. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating on their merits or weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  19. Equivalent Bar Conceptions for Computer Analysis of Pantographic Foldable Structures

    Institute of Scientific and Technical Information of China (English)

    陈务军; 付功义; 何艳丽; 董石麟

    2003-01-01

    An equivalent bar conception is first developed for the computer analysis of pantographic foldable structures. The uniplet of two three-node beam elements is treated as a six-bar assembly, using the least-norm least-squares solution for the elastic strain energy equality. The equilibrium equation is developed for the equivalent models, and the internal forces are subsequently formulated for backup calculation. The procedure is proved practical for some engineering applications, and some interesting concepts are proposed. Finally, three numerical tests are presented.

  20. Meshing analysis of toroidal drive by computer algebra system

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Presents a meshing analysis based on a computer algebra system, which makes it easier to deduce complex formulas while visualizing the expressions of the more complicated surface equations. By this means the contact line, meshing borderlines and undercut borderlines of the toroidal drive are deduced, and the results obtained are consistent with those discussed in literature [1]. It is concluded that the absolute value of the induced normal curvature is usually small (less than 0.12, for example); it increases as parameters ψ2, V and R increase, decreases as parameter r increases, and hardly varies with W2, while the variation with a and i21 is not definite.

  1. Introduction to Numerical Computation - analysis and Matlab illustrations

    DEFF Research Database (Denmark)

    Elden, Lars; Wittmeyer-Koch, Linde; Nielsen, Hans Bruun

    In a modern programming environment like e.g. MATLAB it is possible by simple commands to perform advanced calculations on a personal computer. In order to use such a powerful tool efficiently it is necessary to have an overview of available numerical methods and algorithms and to know about their properties. The book describes and analyses numerical methods for error analysis, differentiation, integration, interpolation and approximation, and the solution of nonlinear equations, linear systems of algebraic equations and systems of ordinary differential equations. Principles and algorithms are...

  2. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structural analysis of biomacromolecules. The problem of reconstructing a picture from identical samples corrupted by colored noise is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.

  3. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  4. Programming Probabilistic Structural Analysis for Parallel Processing Computer

    Science.gov (United States)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.

    1991-01-01

    The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.
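
    The stochastic preconditioned conjugate gradient method itself is not spelled out in this record. For orientation, the sketch below implements the standard deterministic preconditioned conjugate gradient kernel with a Jacobi preconditioner; the test system and tolerances are assumptions, and the stochastic extension is not shown.

    ```python
    # Hedged sketch: standard preconditioned conjugate gradients with a Jacobi
    # (diagonal) preconditioner -- the deterministic kernel underlying the
    # stochastic PCG mentioned above. The SPD test system is an assumption.
    import numpy as np

    def pcg(A, b, tol=1e-10, max_iter=500):
        M_inv = 1.0 / np.diag(A)                 # Jacobi preconditioner
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv * r
        p = z.copy()
        rz = r @ z
        for it in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol * np.linalg.norm(b):
                return x, it + 1
            z = M_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x, max_iter

    n = 50                                       # assumed SPD test matrix
    A = np.diag(np.arange(1.0, n + 1)) + 0.1 * np.ones((n, n))
    b = np.ones(n)
    x, iters = pcg(A, b)
    print(f"converged in {iters} iterations, "
          f"residual {np.linalg.norm(b - A @ x):.2e}")
    ```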

  5. Computer analysis of general linear networks using digraphs.

    Science.gov (United States)

    Mcclenahan, J. O.; Chan, S.-P.

    1972-01-01

    Investigation of the application of digraphs in analyzing general electronic networks, and development of a computer program based on a particular digraph method developed by Chen. The Chen digraph method is a topological method for solution of networks and serves as a shortcut when hand calculations are required. The advantage offered by this method of analysis is that the results are in symbolic form. It is limited, however, by the size of network that may be handled. Usually hand calculations become too tedious for networks larger than about five nodes, depending on how many elements the network contains. Direct determinant expansion for a five-node network is a very tedious process also.

  6. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office [Figure 2: Number of events per month for 2012.] Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  8. A computational analysis of prehistoric lines: geometric engravings and language

    Directory of Open Access Journals (Sweden)

    Víctor Manuel LONGA

    2013-06-01

    Full Text Available Paleoanthropology and Archaeology have usually analyzed prehistoric remains from the perspective of the behavior those remains could be associated with –symbolic, technological, social, etc. As regards language, symbolic objects of the archaeological record have been considered to automatically indicate the existence of complex language in Prehistory. This paper brings a very different approach to the fore: to consider prehistoric remains from the perspective of the mental computational processes and capabilities required for their production. This approach is not concerned with the ‘semantics’ of the pieces –i.e. their alleged symbolic or representational nature–, but it is interested in the analysis of purely formal features revealing a language-like computational complexity. Starting from such a view, the paper analyzes (1 geometric designs from the Eurasian Middle and Lower Palaeolithic made by species like Homo neanderthalensis and perhaps Homo heidelbergensis, and (2 geometric designs from the African Middle Stone Age, made by Anatomically Modern Humans. The computational comparison between both types of designs makes it possible to infer the kind of language those species were endowed with.

  9. G-computation demonstration in causal mediation analysis.

    Science.gov (United States)

    Wang, Aolin; Arah, Onyebuchi A

    2015-10-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings. PMID:26537707
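
    As a hedged, minimal sketch of the parametric g-computation workflow described (fit a model, predict each subject's potential outcomes under interventions on the exposure, contrast the averages), consider the code below; the data-generating process and model form are illustrative assumptions, not the study's, and only the simple total-effect case is shown.

    ```python
    # Hedged sketch of parametric g-computation for a total effect: fit an
    # outcome model, then predict each subject's potential outcomes under
    # "everyone exposed" vs "everyone unexposed" and average the contrast.
    # The data-generating process and model form are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 20_000
    c = rng.normal(size=n)                        # confounder
    a = rng.binomial(1, 1 / (1 + np.exp(-c)))     # exposure depends on c
    y = 2.0 * a + 1.5 * c + rng.normal(size=n)    # outcome; true effect = 2.0

    # Outcome model E[Y | A, C] by ordinary least squares.
    X = np.column_stack([np.ones(n), a, c])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]

    # G-computation step: set A to 1 and to 0 for everyone, average predictions.
    y1 = np.column_stack([np.ones(n), np.ones(n), c]) @ beta
    y0 = np.column_stack([np.ones(n), np.zeros(n), c]) @ beta
    print(f"g-computation total effect estimate: {np.mean(y1 - y0):.3f}")  # ~2.0
    ```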

  10. G-computation demonstration in causal mediation analysis

    International Nuclear Information System (INIS)

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings

  11. Applying DNA computation to intractable problems in social network analysis.

    Science.gov (United States)

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields such as Web 2.0 for Web applications and product developments in industries, etc. However, some definitions of SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities discussed in the paper will demonstrate that DNA computing can be used to facilitate the development of SNA. PMID:20566337
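
    To make the combinatorial obstacle concrete, here is a tiny brute-force clique search (our illustrative sketch in Python, not the authors' DNA-computing procedure; the graph is made up). The exhaustive enumeration over vertex subsets is exactly what grows exponentially on conventional architectures and what the massive parallelism of DNA computing is meant to absorb.

        from itertools import combinations

        edges = {(1, 2), (1, 3), (2, 3), (3, 4), (2, 4), (4, 5)}
        nodes = {1, 2, 3, 4, 5}
        adjacent = lambda u, v: (u, v) in edges or (v, u) in edges

        def largest_clique(nodes):
            for k in range(len(nodes), 0, -1):       # try sizes from large to small
                for subset in combinations(sorted(nodes), k):
                    if all(adjacent(u, v) for u, v in combinations(subset, 2)):
                        return subset                # first clique of size k found
            return ()

        print(largest_clique(nodes))   # (1, 2, 3) -- a maximum clique of this toy graph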

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. GlideInWMS and its components are now installed at CERN, adding to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  13. Analysis of CERN computing infrastructure and monitoring data

    Science.gov (United States)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a large multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal to bring data sources from different services and on different abstraction levels together and to implement a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single service boundaries and the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats, selecting an efficient storage format for map reduce and external access, and will describe the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between CPU/wall fraction, latency/throughput constraints of network and disk and the effective job throughput. In this contribution we will first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.

  14. Development Of The Computer Code For Comparative Neutron Activation Analysis

    International Nuclear Information System (INIS)

    The qualitative and quantitative chemical analysis with Neutron Activation Analysis (NAA) is an important utilization of a nuclear research reactor, and its application and development should be accelerated and promoted to raise the utilization of the reactor. The application of the comparative NAA technique in the GA Siwabessy Multi Purpose Reactor (RSG-GAS) needs special (not yet commercially available) software for analyzing the spectra of multiple elements in one analysis at once. Previously, the analysis was carried out using a single-spectrum software analyzer, with each result compared manually; this method degrades the quality of the analysis significantly. To solve the problem, a computer code was designed and developed for comparative NAA. Spectrum analysis in the code is carried out using a non-linear fitting method. Before the spectrum is analyzed, it is passed through a numerical filter which improves the signal-to-noise ratio for the deconvolution operation. The software was developed using the G language and named PASAN-K. The testing results of the developed software were benchmarked against the IAEA spectrum and operated well, with less than 10% deviation
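
    The core numerical step in such a code is the non-linear least-squares fit of each photopeak. The sketch below is our illustration, not PASAN-K itself (which is written in the G language): a Gaussian peak on a linear background is fitted to synthetic counts with SciPy. In comparative NAA the fitted peak areas of sample and standard are then ratioed against a standard of known concentration.

        import numpy as np
        from scipy.optimize import curve_fit

        def peak(ch, area, centroid, sigma, b0, b1):
            g = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2)
            return g + b0 + b1 * ch                     # Gaussian + linear background

        ch = np.arange(1300, 1360, dtype=float)         # channels around one peak
        true = dict(area=5000, centroid=1332, sigma=2.5, b0=40, b1=-0.01)
        counts = np.random.default_rng(1).poisson(peak(ch, **true)).astype(float)

        p0 = [counts.sum(), ch[np.argmax(counts)], 2.0, counts.min(), 0.0]
        popt, pcov = curve_fit(peak, ch, counts, p0=p0)
        print("fitted net peak area:", popt[0])         # ratio against the standard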

  15. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  16. Computer-aided strength analysis of the modernized freight wagon

    Science.gov (United States)

    Płaczek, M.; Wróbel, A.; Baier, A.

    2015-11-01

    In the paper, results of computer-aided strength analysis of the modernized freight wagon based on the Finite Element Method are presented. A CAD model of the considered freight wagon was created and its strength was analysed in agreement with the norms describing how such freight wagons are to be tested. Then, the model of the analysed freight wagon was modernized by adding composite panels covering the inner surface of the vehicle body. Strength analysis was carried out once again and the obtained results were juxtaposed. This work was carried out in order to verify the influence of composite panels on the strength of the freight car body and to estimate the possibility of reducing the steel shell thickness of the box in order to reduce the weight of the freight wagon.

  17. Chest x-ray analysis by computer: final technical report

    International Nuclear Information System (INIS)

    The purpose of this study was to evaluate and demonstrate the feasibility of the automated analysis of chest x-rays for the classification of pneumoconiosis films according to the U.I.C.C./Cincinnati standard films. Toward this end, computer programs simulating the proposed systems were prepared. Using these programs, the authors then examined three sets of chest radiographs to determine the extent of pneumoconiosis present. The results of the examinations of these x-rays clearly indicated the feasibility of the proposed system. Based on the outcome of these examinations, a complete set of hardware and software specifications were established for a system which can be used for the large scale automatic analysis of chest x-rays

  18. Modern wing flutter analysis by computational fluid dynamics methods

    Science.gov (United States)

    Cunningham, Herbert J.; Batina, John T.; Bennett, Robert M.

    1988-01-01

    The application and assessment of the recently developed CAP-TSD transonic small-disturbance code for flutter prediction is described. The CAP-TSD code has been developed for aeroelastic analysis of complete aircraft configurations and was previously applied to the calculation of steady and unsteady pressures with favorable results. Generalized aerodynamic forces and flutter characteristics are calculated and compared with linear theory results and with experimental data for a 45 deg sweptback wing. These results are in good agreement with the experimental flutter data which is the first step toward validating CAP-TSD for general transonic aeroelastic applications. The paper presents these results and comparisons along with general remarks regarding modern wing flutter analysis by computational fluid dynamics methods.

  19. Analysis of Craniofacial Images using Computational Atlases and Deformation Fields

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur

    2008-01-01

    The topic of this thesis is automatic analysis of craniofacial images. The methods proposed and applied contribute to the scientific knowledge about different craniofacial anomalies, in addition to providing tools for detailed and robust analysis of craniofacial images for clinical and research...... purposes. The basis for most of the applications is non-rigid image registration. This approach brings one image into the coordinate system of another resulting in a deformation field describing the anatomical correspondence between the two images. A computational atlas representing the average anatomy of...... findings about the craniofacial morphology and asymmetry of Crouzon mice. Moreover, a method to plan and evaluate treatment of children with deformational plagiocephaly, based on asymmetry assessment, is established. Finally, asymmetry in children with unicoronal synostosis is automatically assessed...

  20. A computer program (MACPUMP) for interactive aquifer-test analysis

    Science.gov (United States)

    Day-Lewis, F. D.; Person, M.A.; Konikow, L.F.

    1995-01-01

    This report introduces MACPUMP (Version 1.0), an aquifer-test-analysis package for use with Macintosh computers. The report outlines the input-data format, describes the solutions encoded in the program, explains the menu items, and offers a tutorial illustrating the use of the program. The package reads list-directed aquifer-test data from a file, plots the data to the screen, generates and plots type curves for several different test conditions, and allows mouse-controlled curve matching. MACPUMP features pull-down menus, a simple text viewer for displaying data-files, and optional on-line help windows. This version includes the analytical solutions for nonleaky and leaky confined aquifers, using both type curves and straight-line methods, and for the analysis of single-well slug tests using type curves. An executable version of the code and sample input data sets are included on an accompanying floppy disk.
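
    For reference, the nonleaky confined-aquifer type curve that such packages match against drawdown data is the Theis solution, s = Q W(u) / (4 pi T) with u = r^2 S / (4 T t). A short sketch using SciPy's exponential integral (ours; the parameter values are arbitrary, and this is not the MACPUMP code itself):

        import numpy as np
        from scipy.special import exp1          # the Theis well function W(u)

        def theis_drawdown(t, r, Q, T, S):
            """Drawdown (m) at radius r (m) and time t (s), for pumping rate Q (m^3/s),
            transmissivity T (m^2/s), and storativity S (dimensionless)."""
            u = r**2 * S / (4.0 * T * t)
            return Q / (4.0 * np.pi * T) * exp1(u)

        t = np.logspace(1, 5, 9)                # 10 s to about one day
        print(theis_drawdown(t, r=30.0, Q=0.01, T=1e-3, S=1e-4))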

  1. Process model of new ideas generation for product conceptual design driven by CAI

    Institute of Scientific and Technical Information of China (English)

    张建辉; 檀润华; 张鹏; 曹国忠

    2013-01-01

    Generation of creative ideas is critical in the product conceptual design process. The obstacle to idea generation in this process is that designers do not make full use of knowledge from different fields. Theory of Inventive Problem Solving (TRIZ) based Computer-aided Innovation Systems (CAIs) provide a platform for applying knowledge from different fields. Principles of creative idea generation driven by Computer-aided Innovation (CAI) were put forward: the inventive problem was solved based on the design scenario of CAIs, and the extended solution space was set up by the Unexpected Discoveries (UXD) implied in the source design in order to drive the generation of creative ideas. An integrated process model of creative idea generation for product conceptual design driven by CAI was then developed. Idea generation for a safety isolation butterfly valve was carried out using the process model and demonstrated its feasibility.

  2. Analysis on Phase Transformation (ATP) Using Computational Thermal Principles (CTP)

    Institute of Scientific and Technical Information of China (English)

    N.Alagurmurthi; K.Palaniradja; V. Soundararajan

    2004-01-01

    Computer analysis based on computational thermal principles to predict the transformation kinetics in steels at varying temperatures is of great practical importance in different areas of heat treatment. Using the theory of transient-state heat conduction with convective boundary conditions, an efficient program named "ATP" (Analysis on Phase Transformation) has been developed to determine the temperature distribution under different quenching conditions for different geometries such as plates, cylinders and spheres. In addition, the microstructures and the corresponding hardness developed during quenching are predicted using the Time Temperature Transformation (TTT) diagram incorporated in the analysis. To validate the work, dilation curves, Heisler charts and time-temperature history curves have been generated. This paper deals with the basic objective of the program (ATP), the determination of temperature, microstructure and hardness distributions, and also includes an online prediction of austenite-pearlite and austenite-martensite transformations in steels along with the corresponding retained fractions. The quenching of a cylinder in gases, liquids and liquid metals is analyzed to show the non-linear effect of cylinder diameter on the temperature and microstructures. Further, typical 1080 steel cylinders quenched in water are considered to compare the program results with experimental values; the approach can be extended to other grades of steel. The numerical results of the program are found to be in good agreement with the experimental data obtained. Finally, the quenching process analysis described in the study appears to be a promising tool for the design of heat-treatment process parameters for steels.
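
    The heart of such a program is the transient-conduction solution with convective boundary conditions. Below is a minimal explicit finite-difference sketch for the plate geometry (our illustration, not the ATP code; the steel-like material properties and quench coefficient are rough assumptions). The computed time-temperature history at each point is what would then be read against the TTT diagram.

        import numpy as np

        k, rho, cp = 30.0, 7800.0, 500.0       # W/(m K), kg/m^3, J/(kg K)
        alpha = k / (rho * cp)                 # thermal diffusivity
        h, T_inf, T0 = 2000.0, 40.0, 850.0     # quench coefficient, bath and start temps (C)

        L, n = 0.02, 41                        # plate half-thickness (m), grid points
        dx = L / (n - 1)
        dt = 0.4 * dx**2 / alpha               # satisfies the explicit stability limit (0.5)
        T = np.full(n, T0)

        for step in range(int(10.0 / dt)):     # simulate 10 s of quenching
            Tn = T.copy()
            Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
            Tn[0] = Tn[1]                      # midplane symmetry: zero flux
            # Surface node: conduction balances convection, k*(T[-2]-T[-1])/dx = h*(T[-1]-T_inf)
            Tn[-1] = (k / dx * Tn[-2] + h * T_inf) / (k / dx + h)
            T = Tn

        print(f"after 10 s: centre {T[0]:.0f} C, surface {T[-1]:.0f} C")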

  3. The CDF Computing and Analysis System:First Experience

    Institute of Scientific and Technical Information of China (English)

    Rick Colombo; Fedor Ratnikov; et al.

    2001-01-01

    The Collider Detector at Fermilab (CDF) collaboration records and analyses proton anti-proton interactions with a center-of-mass energy of 2 TeV at the Tevatron. A new collider run, Run II, of the Tevatron started in April. During its more than two-year duration the CDF experiment expects to record about 1 PetaByte of data. With its multi-purpose detector and center-of-mass energy at the frontier, the experimental program is large and versatile. The over 500 scientists of CDF will engage in searches for new particles, like the Higgs boson or supersymmetric particles, precision measurements of electroweak parameters, like the mass of the W boson, measurements of top quark parameters, and a large spectrum of B physics. The experiment has taken data and analysed them in previous runs. For Run II, however, the computing model was changed to incorporate new methodologies, the file format switched, and both the data handling and analysis systems were redesigned to cope with the increased demands. This paper (4-036 at CHEP 2001) gives an overview of the CDF Run II computing system with emphasis on areas where the current system does not match initial estimates and projections. For the data handling and analysis system a more detailed description is given.

  4. Analysis of diabetic retinopathy biomarker VEGF gene by computational approaches

    Directory of Open Access Journals (Sweden)

    Jayashree Sadasivam

    2012-08-01

    Full Text Available Diabetic retinopathy, the most common diabetic eye disease, is caused by changes in the blood vessels of the retina and remains a major cause of vision loss. It is characterized by vascular permeability and increased tissue ischemia and angiogenesis. One of the biomarkers for diabetic retinopathy has been identified as the Vascular Endothelial Growth Factor (VEGF) gene by computational analysis. VEGF is a sub-family of growth factors within the platelet-derived growth factor family of cystine-knot growth factors. These are important signalling proteins involved in both vasculogenesis and angiogenesis; overexpression of VEGF can cause vascular disease in the retina of the eye and other parts of the body, and drugs that inhibit VEGF can control or slow down these diseases. Computational analysis of VEGF together with other genes responsible for diabetic retinopathy was done by aligning those genes with pairwise and multiple sequence alignments. The MSA shows VEGF's role in diabetic retinopathy and its relationship with other genes and proteins responsible for the pathogenesis of diabetic retinopathy. The determination of the promoter and conserved domain of the VEGF gene also helps to identify its expression levels. Finally, molecular docking studies were carried out to analyse the biomarker VEGF, which helps in the treatment of diabetic retinopathy, a condition that is proliferative in nature due to uncontrolled angiogenesis.
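
    Pairwise alignment of this kind is classically computed with Needleman-Wunsch dynamic programming. A self-contained scoring sketch follows (ours, with toy sequences and scores; real analyses would use established tools and biologically calibrated scoring matrices):

        def nw_score(a, b, match=1, mismatch=-1, gap=-1):
            """Global alignment score of sequences a and b (Needleman-Wunsch)."""
            rows, cols = len(a) + 1, len(b) + 1
            F = [[0] * cols for _ in range(rows)]
            for i in range(rows):
                F[i][0] = i * gap                  # leading gaps in b
            for j in range(cols):
                F[0][j] = j * gap                  # leading gaps in a
            for i in range(1, rows):
                for j in range(1, cols):
                    diag = F[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                    F[i][j] = max(diag, F[i-1][j] + gap, F[i][j-1] + gap)
            return F[rows-1][cols-1]

        print(nw_score("GATTACA", "GCATGCU"))      # 0 for this scoring scheme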

  5. Smoothing spline analysis of variance approach for global sensitivity analysis of computer codes

    International Nuclear Information System (INIS)

    The paper investigates a nonparametric regression method based on the smoothing spline analysis of variance (ANOVA) approach to address the problem of global sensitivity analysis (GSA) of complex and computationally demanding computer codes. The two-step algorithm of this method involves an estimation procedure and a variable selection step. The latter can become computationally demanding when dealing with high-dimensional problems. Thus, we proposed a new algorithm based on Landweber iterations. Using the fact that the considered regression method is based on the ANOVA decomposition, we introduced a new direct method for computing sensitivity indices. Numerical tests performed on several analytical examples and on an application from petroleum reservoir engineering showed that the method gives competitive results compared to a more standard Gaussian process approach
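
    For orientation, the targets of such a method are the variance-based (Sobol') sensitivity indices. The sketch below estimates the first-order indices of the standard Ishigami test function with a plain Monte Carlo pick-freeze estimator; this is a deliberately simpler stand-in for the paper's spline-based direct computation, not its method.

        import numpy as np

        def ishigami(X, a=7.0, b=0.1):
            return np.sin(X[:, 0]) + a * np.sin(X[:, 1])**2 + b * X[:, 2]**4 * np.sin(X[:, 0])

        rng = np.random.default_rng(0)
        n, d = 200_000, 3
        A = rng.uniform(-np.pi, np.pi, (n, d))
        B = rng.uniform(-np.pi, np.pi, (n, d))
        yA, yB = ishigami(A), ishigami(B)
        var = yA.var()

        for i in range(d):
            BAi = B.copy()
            BAi[:, i] = A[:, i]                              # freeze variable i
            Si = np.mean(yA * (ishigami(BAi) - yB)) / var    # first-order index
            print(f"S{i+1} ~ {Si:.3f}")                      # analytic: 0.314, 0.442, 0.000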

  6. VARIABLE AND EXTREME IRRADIATION CONDITIONS IN THE EARLY SOLAR SYSTEM INFERRED FROM THE INITIAL ABUNDANCE OF ¹⁰Be IN ISHEYEVO CAIs

    Energy Technology Data Exchange (ETDEWEB)

    Gounelle, Matthieu [Laboratoire de Minéralogie et de Cosmochimie du Muséum, CNRS and Muséum National d'Histoire Naturelle, UMR 7202, CP52, 57 rue Cuvier, F-75005 Paris (France); Chaussidon, Marc; Rollion-Bard, Claire, E-mail: gounelle@mnhn.fr [Centre de Recherches Pétrographiques et Géochimiques, CRPG-CNRS, BP 20, F-54501 Vandœuvre-lès-Nancy Cedex (France)

    2013-02-01

    A search for short-lived ¹⁰Be in 21 calcium-aluminum-rich inclusions (CAIs) from Isheyevo, a rare CB/CH chondrite, showed that only 5 CAIs had ¹⁰B/¹¹B ratios higher than chondritic correlating with the elemental ratio ⁹Be/¹¹B, suggestive of in situ decay of this key short-lived radionuclide. The initial (¹⁰Be/⁹Be)₀ ratios vary between ~10⁻³ and ~10⁻² for CAI 411. The initial ratio of CAI 411 is one order of magnitude higher than the highest ratio found in CV3 CAIs, suggesting that the more likely origin of CAI 411 ¹⁰Be is early solar system irradiation. The low (²⁶Al/²⁷Al)₀ [≤ 8.9 × 10⁻⁷] with which CAI 411 formed indicates that it was exposed to gradual flares with a proton fluence of a few 10¹⁹ protons cm⁻², during the earliest phases of the solar system, possibly the infrared class 0. The irradiation conditions for other CAIs are less well constrained, with calculated fluences ranging between a few 10¹⁹ and 10²⁰ protons cm⁻². The variable and extreme value of the initial ¹⁰Be/⁹Be ratios in carbonaceous chondrite CAIs is the reflection of the variable and extreme magnetic activity in young stars observed in the X-ray domain.

  7. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow, Rutgers University/Georgia Institute of Technology, Michael Schatz, Georgia Institute of Technology, William Kalies, Florida Atlantic University, Thomas Wanner,George Mason University

    2010-05-19

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure
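
    For readers unfamiliar with the machinery: computational homology ultimately reduces to linear algebra on boundary matrices, with Betti numbers counting components, loops, voids, and so on. A tiny self-contained sketch over GF(2) (ours, purely illustrative of the idea, not the project's codes):

        import numpy as np

        def rank_gf2(M):
            """Rank of a 0/1 matrix over GF(2), by Gaussian elimination."""
            M = M.copy() % 2
            r = 0
            for c in range(M.shape[1]):
                pivots = np.nonzero(M[r:, c])[0]
                if pivots.size == 0:
                    continue
                M[[r, r + pivots[0]]] = M[[r + pivots[0], r]]   # swap pivot row up
                for i in range(M.shape[0]):
                    if i != r and M[i, c]:
                        M[i] = (M[i] + M[r]) % 2                # clear column c
                r += 1
                if r == M.shape[0]:
                    break
            return r

        # Hollow triangle (topologically a circle): 3 vertices, 3 edges, no 2-cells.
        d1 = np.array([[1, 1, 0],      # rows: vertices; columns: edges 01, 02, 12
                       [1, 0, 1],
                       [0, 1, 1]])
        rank_d1 = rank_gf2(d1)
        beta0 = 3 - rank_d1            # number of vertices minus rank(d1): 1 component
        beta1 = (3 - rank_d1) - 0      # dim ker(d1) minus rank(d2)=0: 1 loop
        print(beta0, beta1)            # 1 1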

  8. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow; Michael Schatz; William Kalies; Thomas Wanner

    2010-05-24

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure

  9. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    Science.gov (United States)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel efficient open rotor designs that can meet the community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity where more and more complex flow problems can be tackled with this approach. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by a contra-rotating open rotor. This is a first-of-a-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the methodologies of how to apply the immersed boundary method to this moving boundary problem, we provide a detailed validation of the aeroacoustic analysis approach employing the Launch Ascent and Vehicle Aerodynamics (LAVA) solver. Two free-stream Mach numbers, M=0.2 and M=0.78, are considered in this analysis, based nominally on the take-off and cruise flow conditions. The simulation data is compared to available experimental data and other computational results employing more conventional CFD methods. Spectral analysis is used to determine the dominant wave propagation pattern in the acoustic near-field.
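
    The spectral-analysis step mentioned above is, at its core, a Fourier transform of probe time histories. A minimal sketch (ours; the sampling rate and tone are made-up stand-ins for the simulated pressure signal):

        import numpy as np

        fs = 20_000.0                          # sampling rate, Hz
        t = np.arange(0, 1.0, 1.0 / fs)
        bpf = 850.0                            # assumed blade-passing frequency, Hz
        p = (np.sin(2 * np.pi * bpf * t) + 0.3 * np.sin(2 * np.pi * 2 * bpf * t)
             + 0.1 * np.random.default_rng(0).normal(size=t.size))

        spec = np.abs(np.fft.rfft(p * np.hanning(t.size)))   # windowed amplitude spectrum
        freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
        print(f"dominant tone: {freqs[np.argmax(spec)]:.0f} Hz")   # ~850 Hz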

  10. The future of computer-aided sperm analysis

    Science.gov (United States)

    Mortimer, Sharon T; van der Horst, Gerhard; Mortimer, David

    2015-01-01

    Computer-aided sperm analysis (CASA) technology was developed in the late 1980s for analyzing sperm movement characteristics or kinematics and has been highly successful in enabling this field of research. CASA has also been used with great success for measuring semen characteristics such as sperm concentration and proportions of progressive motility in many animal species, including wide application in domesticated animal production laboratories and reproductive toxicology. However, attempts to use CASA for human clinical semen analysis have largely met with poor success due to the inherent difficulties presented by many human semen samples caused by sperm clumping and heavy background debris that, until now, have precluded accurate digital image analysis. The authors review the improved capabilities of two modern CASA platforms (Hamilton Thorne CASA-II and Microptic SCA6) and consider their current and future applications with particular reference to directing our focus towards using this technology to assess functional rather than simple descriptive characteristics of spermatozoa. Specific requirements for validating CASA technology as a semi-automated system for human semen analysis are also provided, with particular reference to the accuracy and uncertainty of measurement expected of a robust medical laboratory test for implementation in clinical laboratories operating according to modern accreditation standards. PMID:25926614

  11. The future of computer-aided sperm analysis

    Directory of Open Access Journals (Sweden)

    Sharon T Mortimer

    2015-01-01

    Full Text Available Computer-aided sperm analysis (CASA) technology was developed in the late 1980s for analyzing sperm movement characteristics or kinematics and has been highly successful in enabling this field of research. CASA has also been used with great success for measuring semen characteristics such as sperm concentration and proportions of progressive motility in many animal species, including wide application in domesticated animal production laboratories and reproductive toxicology. However, attempts to use CASA for human clinical semen analysis have largely met with poor success due to the inherent difficulties presented by many human semen samples caused by sperm clumping and heavy background debris that, until now, have precluded accurate digital image analysis. The authors review the improved capabilities of two modern CASA platforms (Hamilton Thorne CASA-II and Microptic SCA6) and consider their current and future applications with particular reference to directing our focus towards using this technology to assess functional rather than simple descriptive characteristics of spermatozoa. Specific requirements for validating CASA technology as a semi-automated system for human semen analysis are also provided, with particular reference to the accuracy and uncertainty of measurement expected of a robust medical laboratory test for implementation in clinical laboratories operating according to modern accreditation standards.

  12. Analysis of Job Scheduling Algorithms in Cloud Computing

    OpenAIRE

    Rajveer Kaur; Supriya Kinger

    2014-01-01

    Cloud computing is flourishing day by day and it will continue developing as long as computers and the internet era are in existence. While dealing with cloud computing, a number of issues are confronted, like heavy load or traffic during computation. Job scheduling is one of the answers to these issues. It is the process of mapping tasks to available resources. In section (1) we discuss cloud computing and scheduling. In section (2) we explain job scheduling in cloud computing. In section (...

  13. Markov analysis of different standby computer based systems

    International Nuclear Information System (INIS)

    As against the conventional triplicated systems of hardware and the generation of control signals for the actuator elements by means of redundant hardwired median circuits, employed in the early Indian PHWRs, a new approach of generating control signals based on software by a redundant system of computers is introduced in the advanced/current generation of Indian PHWRs. Reliability is increased by fault diagnostics and automatic switch-over of all the loads to one computer in case of total failure of the other computer. Independent processing by a redundant CPU in each system enables inter-comparison to quickly identify system failure, in addition to the other self-diagnostic features provided. Combinatorial models such as reliability block diagrams and fault trees are frequently used to predict the reliability, maintainability and safety of complex systems. Unfortunately, these methods cannot accurately model dynamic system behavior; because of its unique ability to handle dynamic cases, Markov analysis can be a powerful tool in the reliability, maintainability and safety (RMS) analyses of dynamic systems. A Markov model breaks the system configuration into a number of states. Each of these states is connected to all other states by transition rates. It then utilizes transition matrices to evaluate the reliability and safety of the systems, either through matrix manipulation or other analytical solution methods, such as Laplace transforms. Thus, Markov analysis is a powerful reliability, maintainability and safety analysis tool. It allows the analyst to model complex, dynamic, highly distributed, fault tolerant systems that would otherwise be very difficult to model using classical techniques like the fault tree method. The Dual Processor Hot Standby Process Control System (DPHS-PCS) and the Computerized Channel Temperature Monitoring System (CCTM) are typical examples of hot standby systems in the Indian PHWRs. While such systems currently in use in Indian PHWR
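
    A minimal numerical sketch of such a Markov model for a dual hot-standby system follows (ours, with illustrative failure and repair rates; a single repair crew and perfect failure detection are assumed). The generator matrix is assembled from the transition rates and the steady-state distribution yields the availability.

        import numpy as np

        lam, mu = 1e-3, 1e-1       # failure and repair rates per machine, per hour

        # Generator matrix Q (rows sum to zero); state order: both up, one up, both down.
        Q = np.array([
            [-2 * lam,      2 * lam,  0.0],
            [      mu, -(lam + mu),   lam],
            [     0.0,          mu,   -mu],
        ])

        # Steady state: solve pi Q = 0 subject to sum(pi) = 1.
        A = np.vstack([Q.T, np.ones(3)])
        b = np.array([0.0, 0.0, 0.0, 1.0])
        pi = np.linalg.lstsq(A, b, rcond=None)[0]
        print(f"steady-state availability: {pi[0] + pi[1]:.6f}")   # P(at least one up)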

  14. Computer Pilot Program, 1986-87. OEA Evaluation Report.

    Science.gov (United States)

    Guerrero, Frank; Swan, Karen

    The Computer Pilot Program that was implemented in 19 New York City schools in 1986-87 was designed to investigate the efficacy of computer-assisted instruction (CAI) with the at-risk student population in New York City. The goals of the program were to identify systems that were effective in increasing student attendance and achievement, and in…

  15. Computer methods for geological analysis of radiometric data

    International Nuclear Information System (INIS)

    Whether an explorationist equates anomalies with potential uranium ore deposits or analyses radiometric data in terms of their relationships with other geochemical, geophysical, and geological data, the anomaly or anomalous zone is the most common starting point for subsequent study or field work. In its preliminary stages, the definition of meaningful anomalies from raw data is a statistical problem requiring the use of a computer. Because radiometric data, when properly collected and reduced, are truly geochemical, they can be expected to relate in part to changes in surface or near-surface geology. Data variations caused strictly by differences in gross chemistry of the lithologies sampled constitute a noise factor which must be removed for proper analysis. Texas Instruments Incorporated has developed an automated method of measuring the statistical significance of data by incorporating geological information in the process. This method of computerized geological analysis of radiometric data (CGARD) is similar to a basic method of the exploration geochemist and has been proved successful in its application to airborne radiometric data collected on four continents by Texas Instruments Incorporated. This beginning and its natural follow-on methods of automated or interpretive analysis are based simply on the perception of radiometric data as sets of statistically distributed data in both the frequency and spatial domains. (author)

  16. Computer codes for the analysis of flask impact problems

    International Nuclear Information System (INIS)

    This review identifies typical features of the design of transportation flasks and considers some of the analytical tools required for the analysis of impact events. Because of the complexity of the physical problem, it is unlikely that a single code will adequately deal with all the aspects of the impact incident. Candidate codes are identified on the basis of current understanding of their strengths and limitations. It is concluded that the HONDO-II, DYNA3D and ABAQUS codes which are already mounted on UKAEA computers will be suitable tools for use in the analysis of experiments conducted in the proposed AEEW programme and of general flask impact problems. Initial attention should be directed at the DYNA3D and ABAQUS codes, with HONDO-II being reserved for situations where the three-dimensional elements of DYNA3D may provide uneconomic simulations in planar or axisymmetric geometries. Attention is drawn to the importance of access to suitable mesh generators to create the nodal coordinate and element topology data required by these structural analysis codes. (author)

  17. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    Science.gov (United States)

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence; hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect of inhibitors, including OPs and carbamates, on acetylcholinesterase (AChE), a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that the yellow intensity weakened gradually as the concentration of dichlorvos increased. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had a good predictive ability between training sets and predictive sets. Real cabbage samples containing dichlorvos were detected by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). The experiments of accuracy, precision and repeatability revealed good performance for detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications in real samples for OPs and carbamates because of its high selectivity and sensitivity. PMID:27396650
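
    The non-linear calibration step can be sketched as follows (our illustration; the concentration-density pairs are fabricated placeholders, and the original work used its own ANN configuration rather than this scikit-learn one):

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0])          # dichlorvos, mg/L
        yellow = np.array([0.92, 0.85, 0.77, 0.64, 0.48, 0.31, 0.18])  # CMYK yellow density

        model = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                             max_iter=20_000, random_state=0)
        model.fit(yellow.reshape(-1, 1), conc)        # learn density -> concentration

        print(model.predict([[0.55]]))                # estimate an unknown sample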

  18. Computer-aided photometric analysis of dynamic digital bioluminescent images

    Science.gov (United States)

    Gorski, Zbigniew; Bembnista, T.; Floryszak-Wieczorek, J.; Domanski, Marek; Slawinski, Janusz

    2003-04-01

    The paper deals with photometric and morphologic analysis of bioluminescent images obtained by registering light radiated directly from some plant objects. The registration of images obtained from ultra-weak light sources by the single photon counting (SPC) technique is the subject of this work. The radiation is registered using a 16-bit charge coupled device (CCD) camera "Night Owl" together with WinLight EG&G Berthold software. Additional application-specific software has been developed in order to deal with objects that change during the exposition time. The advantages of the elaborated set of easily configurable tools, named FCT, for computer-aided photometric and morphologic analysis of numerous series of quantitatively imperfect chemiluminescent images are described. Instructions on how to use these tools are given, exemplified with several algorithms for the transformation of an image library. Using the proposed FCT set, automatic photometric and morphologic analysis reveals the information hidden within series of chemiluminescent images reflecting defensive processes in poinsettia (Euphorbia pulcherrima Willd) leaves affected by the pathogenic fungus Botrytis cinerea.

  19. Shell stability analysis in a computer aided engineering (CAE) environment

    Science.gov (United States)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde is described. The purpose of this project is to make the accumulated theoretical, numerical and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling sensitive structures. With this open ended, hierarchical, interactive computer code the user can access from his workstation successively programs of increasing complexity. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from behind the workstation with one of the current generation 2-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.

  20. Molecular organization in liquid crystals: A comparative computational analysis

    International Nuclear Information System (INIS)

    A comparative computational analysis of molecular organization in four nematogenic acids (nOCAC), having two, four, six, and eight carbon atoms in the alkyl chain, is carried out with respect to translatory and orientational motions. The evaluation of the atomic charge and dipole moment at each atomic center is performed through the complete neglect of differential overlap (CNDO/2) method. The Rayleigh-Schroedinger perturbation theory, along with the multicentered-multipole expansion method, is employed to evaluate the long-range interactions, while a '6-exp' potential function is assumed for short-range interactions. The total interaction-energy values obtained through these computations are used to calculate the probability of each configuration at the phase transition temperature via the Maxwell-Boltzmann formula. Further, the flexibility of various configurations is studied in terms of the variation of probability due to small departures from the most probable configuration. A comparative picture of molecular parameters, such as the total energy, binding energy, and total dipole moment, is given. An attempt is made to explain the nematogenic behavior of these liquid crystals in terms of their relative order and, thereby, to develop a molecular model for the liquid crystallinity.
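
    The final statistical step is simple enough to show directly: configuration probabilities follow from the computed interaction energies via the Maxwell-Boltzmann formula. A short sketch (ours; the energies and temperature are illustrative, not values from the paper):

        import numpy as np

        kB = 1.987e-3                       # Boltzmann constant, kcal/(mol K)
        T = 350.0                           # phase transition temperature, K
        E = np.array([-10.2, -9.8, -9.1])   # configuration interaction energies, kcal/mol

        w = np.exp(-(E - E.min()) / (kB * T))   # shift by the minimum for numerical safety
        P = w / w.sum()
        print(P)   # the most strongly bound configuration dominates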

  1. Underground tank vitrification: Field-scale experiments and computational analysis

    International Nuclear Information System (INIS)

    In situ vitrification (ISV) is a thermal waste remediation process developed by researchers at Pacific Northwest Laboratory for stabilization and treatment of soils contaminated with hazardous, radioactive, or mixed wastes. Many underground tanks containing radioactive and hazardous chemical wastes at U.S. Department of Energy sites will soon require remediation. Recent development activities have been pursued to determine if the ISV process is applicable to underground storage tanks. As envisioned, ISV will convert the tank, tank contents, and associated contaminated soil to a glass and crystalline block. Development activities include testing and demonstration on three scales, and computational modeling and evaluation. In this paper, the authors describe engineering solutions implemented on the field scale to mitigate unique problems posed by ISV of a confined underground structure, along with the associated computational analysis. The ISV process, as applied to underground storage tanks, is similar to ISV of contaminated soils, except that the tank also melts and forms a metal ingot at the bottom of the melt

  2. Application of Computer Integration Technology for Fire Safety Analysis

    Institute of Scientific and Technical Information of China (English)

    SHI Jianyong; LI Yinqing; CHEN Huchuan

    2008-01-01

    With the development of information technology, the fire safety assessment of a whole structure or region based on computer simulation has become a hot topic. Traditionally, however, the concerned studies are performed separately for different objectives, making an overall evaluation difficult. A new multi-dimensional integration model and methodology for fire safety assessment are presented, and two newly developed integrated systems are introduced to demonstrate the function of integration simulation technology in this paper. The first one is the analysis of the fire-resistant behaviors of a whole structure under real fire loads. The second one is the study of fire evaluation and emergency rescue of a campus based on geographic information system (GIS) technology. Some practical examples are presented to illuminate the advantages of computer integration technology for fire safety assessment and to emphasize some problems in the simulation. The results show that the multi-dimensional integration model offers a new way and platform for integrated fire safety assessment of a whole structure or region, and that the integrated software developed provides useful engineering tools for cost-saving and safe design.

  3. Role of dielectric medium on benzylidene aniline: A computational analysis

    International Nuclear Information System (INIS)

    A computational analysis of ordering in N-(p-n-ethoxy benzylidene)-p-n-butyl aniline (2O.4) was performed based on quantum mechanics and intermolecular forces. The atomic charge and dipole moment at each atomic centre were evaluated using the all-valence-electron CNDO/2 method. The modified Rayleigh-Schrodinger perturbation theory and multicentre-multipole expansion method were employed to evaluate long-range intermolecular interactions, while a 6-exp potential function was assumed for short-range interactions. The total interaction energy values obtained in these computations were used as input for calculating the probability of each configuration in a noninteracting and nonmesogenic solvent (i.e., benzene) at room temperature (300 K) using the Maxwell-Boltzmann formula. The molecular parameters of 2O.4, including the total energy, binding energy, and total dipole moment, were compared with those of N-(p-n-butoxy benzylidene)-p-n-ethyl aniline (4O.2). The present article offers theoretical support to the experimental observations, as well as a new and interesting way of looking at a liquid crystalline molecule in a dielectric medium.

  4. Feasibility Analysis of Critical Factors Affecting Cloud Computing in Nigeria

    Directory of Open Access Journals (Sweden)

    Eustace Manayi Dogo

    2013-10-01

    Full Text Available Cloud computing is an evolving and new way of delivering computing services and resources over the internet, managed by third parties at remote sites. Cloud computing is based on existing technologies like web services, Service Oriented Architecture (SOA), Web 3.0, grid computing and virtualization, etc. Computing services include data storage, processing and software. Cloud computing is enjoying a lot of buzz in Nigeria due to its perceived economic and operational benefits, and stakeholders believe that it will transform the IT industry in Nigeria. Despite all its promise, there still exist many challenges before cloud computing sees the light of day in Nigeria. This paper delivers an overview of cloud computing together with its advantages and disadvantages. Thereafter, the challenges and drivers affecting the adoption of cloud computing in Nigeria are outlined. Finally, recommendations for the adoption of cloud computing are discussed, with Nigeria as a case study.

  5. Data analysis using the Gnu R system for statistical computation

    Energy Technology Data Exchange (ETDEWEB)

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  6. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations, like simplified query languages, out-of-date information, or arbitrary results sorting. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web page content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of the Web page content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.

  7. Meta-Analysis and Computer-Mediated Communication.

    Science.gov (United States)

    Taylor, Alan M

    2016-04-01

    Because of the use of human participants and differing contextual variables, research in second language acquisition often produces conflicting results, leaving practitioners confused and unsure of the effectiveness of specific treatments. This article provides insight into a recent seminal meta-analysis on the effectiveness of computer-mediated communication, offering further statistical evidence of the importance of its results. The significance of the study is examined by looking at the p values included in the references, to demonstrate how results can easily be misconstrued by practitioners and researchers. Lin's conclusion regarding the research setting of the study reports is also evaluated. In doing so, other possible explanations of what may be influencing the results can be proposed. PMID:27154373

  8. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on the Adaptive Neuro Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 centre GoeGrid. Ten-fold cross-validation results demonstrate the high efficiency of both approaches in comparison to known methods.

  9. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume aims to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  10. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    International Nuclear Information System (INIS)

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community

  11. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets, as well as modification and analysis. Graphical work stations provide the opportunity to work — not only with textual representations of Petri nets — but also directly with the graphical representations. This paper describes some of the different kinds of tools which are needed in the Petri net area. It describes some of the requirements which these tools must fulfil, in order to support the user in a natural and effective way. Finally some references are given to papers which describe examples of existing Petri net tools.

  12. A Comparative Analysis of some High Performance Computing Technologies

    OpenAIRE

    Minakshi Tripathy; C. R. Tripathy

    2014-01-01

    Computing is an evolutionary process. As part of this evolution, the computing requirements driven by applications have always outpaced the available technology. The system designers have been always seeking for faster and more efficient systems of computing. During the past decade, many different computer systems supporting high performance computing have emerged. Their taxonomy is based on how their processors, memory and interconnect are laid out. Today’s applications requi...

  14. Engineering Graphics CAI Design Based on Delphi

    Institute of Scientific and Technical Information of China (English)

    蒋先刚; 钟化兰; 涂晓斌

    2001-01-01

    Introduces the programming technologies and implementation methods of Engineering Graphics CAI courseware developed in the Delphi environment, focusing on the system architecture of the descriptive geometry and engineering graphics CAI courseware and its software implementation. It also presents methods and techniques for constructing and managing the CAI system's database using Delphi's TTreeView component.

  15. Organic pollution and salt intrusion in Cai Nuoc District, Ca Mau Province, Vietnam.

    Science.gov (United States)

    Tho, Nguyen; Vromant, Nico; Hung, Nguyen Thanh; Hens, Luc

    2006-07-01

    In Ca Mau, Vietnam, farmers converted from rice to shrimp farming, while ignoring the degradation of the aquatic environment. We assessed the seasonal variations in organic pollution of the surface water and salt intrusion in one district and assessed the difference in chemical characteristics of the surface water of shrimp ponds and canals. Several variables reflecting salinity and organic pollution were measured in the wet and dry season. The results show that in the dry season salinity increased to 37.36-42.73 g l(-1) and COD and suspended solids increased to a maximum of 268.7 mg l(-1) and 1312.0 mg l(-1), respectively. In the wet season salinity values of 8.16 to 10.60 g l(-1) were recorded, indicating that salinity could no longer be washed out completely in this season. It is concluded that salinity and suspended solids in the aquatic environment in the Cai Nuoc district are increased by shrimp monoculture, whereas organic pollution is contributed by human population pressure. PMID:16929642

  16. Green's Function Analysis of Periodic Structures in Computational Electromagnetics

    Science.gov (United States)

    Van Orden, Derek

    2011-12-01

    Periodic structures are used widely in electromagnetic devices, including filters, waveguiding structures, and antennas. Their electromagnetic properties may be analyzed computationally by solving an integral equation, in which an unknown equivalent current distribution in a single unit cell is convolved with a periodic Green's function that accounts for the system's boundary conditions. Fast computation of the periodic Green's function is therefore essential to achieve high accuracy solutions of complicated periodic structures, including analysis of modal wave propagation and scattering from external sources. This dissertation first presents alternative spectral representations of the periodic Green's function of the Helmholtz equation for cases of linear periodic systems in 2D and 3D free space and near planarly layered media. Although there exist multiple representations of the periodic Green's function, most are not efficient in the important case where the fields are observed near the array axis. We present spectral-spatial representations for rapid calculation of the periodic Green's functions for linear periodic arrays of current sources residing in free space as well as near a planarly layered medium. They are based on the integral expansion of the periodic Green's functions in terms of the spectral parameters transverse to the array axis. These schemes are important for the rapid computation of the interaction among unit cells of a periodic array, and, by extension, the complex dispersion relations of guided waves. Extensions of this approach to planar periodic structures are discussed. With these computation tools established, we study the traveling wave properties of linear resonant arrays placed near surfaces, and examine the coupling mechanisms that lead to radiation into guided waves supported by the surface. This behavior is especially important to understand the properties of periodic structures printed on dielectric substrates, such as periodic
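
    For reference, a standard spectral (Floquet) representation of the free-space periodic Green's function for a 1-D array of line sources with period d and phasing k_{x0} is shown below. This is a textbook form included for orientation; it is not necessarily the exact representation derived in the dissertation:

```latex
G_p(x,z) \;=\; \frac{1}{2jd}\sum_{n=-\infty}^{\infty}
\frac{e^{-j k_{xn} x}\, e^{-j k_{zn}\lvert z\rvert}}{k_{zn}},
\qquad
k_{xn} = k_{x0} + \frac{2\pi n}{d},\quad
k_{zn} = \sqrt{k^2 - k_{xn}^2},\ \operatorname{Im} k_{zn} \le 0
```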

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  18. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subjected to harsh conditions, in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in harsh conditions, in which, besides corrosive and thermal aggression, long-term irradiation interferes, with implicit consequences for the evolution of material properties. That leads inevitably to scattering of time-dependent material properties, whose dynamic evolution is subject to a great degree of uncertainty. These are the reasons for developing, in association with deterministic evaluations with computer codes, probabilistic and statistical methods in order to predict the structural component response. This work initiates the possibility of extending the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting with deterministic analysis performed with the CANTUP computer code, a code developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose the structure of the deterministic CANTUP computer code has been reviewed. The code has been adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran PowerStation platform. In order to perform probabilistic evaluations, a part was added to the deterministic code which, using a subroutine from the IMSL library of Microsoft Developer Studio - Fortran PowerStation, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value, with a 5% standard deviation, was simulated for the Young's modulus material property in order to verify the statistical calculus of the creep behavior. The tube deflection and effective stresses were the properties subject to probabilistic evaluation. All the values of these properties obtained for all the values for
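
    The sampling scheme described above is easy to picture in miniature. The sketch below substitutes a trivial stand-in for the deterministic code and draws Young's modulus from a normal distribution with a 5% standard deviation; all function names and numbers are invented for illustration and have nothing to do with the actual CANTUP/IMSL implementation:

```python
import random

def creep_deflection(young_modulus):
    """Hypothetical stand-in for one deterministic CANTUP-style run."""
    return 1.0e9 / young_modulus  # illustrative response only

def probabilistic_run(e_nominal=2.0e11, rel_std=0.05, n_samples=1000, seed=42):
    """Sample Young's modulus from a normal distribution (5% standard
    deviation around the deterministic value) and collect the responses."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_samples):
        e_sample = rng.gauss(e_nominal, rel_std * e_nominal)
        results.append(creep_deflection(e_sample))
    mean = sum(results) / n_samples
    std = (sum((r - mean) ** 2 for r in results) / (n_samples - 1)) ** 0.5
    return mean, std

mean, std = probabilistic_run()
print(f"deflection: mean={mean:.3e}, std={std:.3e}")
```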

  19. The role of computed tomography in terminal ballistic analysis.

    Science.gov (United States)

    Rutty, G N; Boyce, P; Robinson, C E; Jeffery, A J; Morgan, B

    2008-01-01

    Terminal ballistics concerns the science of projectile behaviour within a target and includes wound ballistics, which considers what happens when a projectile strikes a living being. A number of soft tissue ballistic simulants have been used to assess the damage to tissue caused by projectiles. Standard assessment of these materials, such as ballistic soap or ordnance gelatine, requires the block to be opened or a mould to be made to visualize the wound track. This is time consuming and may affect the accuracy of the findings, especially if the block dries and alters shape during the process. Therefore, accurate numerical analysis of the permanent or temporary cavity is limited. Computed tomography (CT) potentially offers a quicker, non-invasive analysis tool for this task. Four commercially purchased ballistic glycerine soap blocks were used. Each had a single firearm discharged into it from a distance of approximately 15 cm, using both gunshot and shotgun projectiles. After discharge, each block was imaged by a modern 16-slice multi-detector CT scanner and analysed using 3-D reconstruction software. Using the anterior-posterior and lateral scout views and the multi-plane reconstructed images, it was possible to visualize the temporary cavity, as well as the fragmentation and dispersal pattern of the projectiles, the distance travelled and the angle of dispersal within the block of each projectile or fragment. A virtual cast of the temporary cavity can also be made. Multi-detector CT with 3-D analysis software is shown to create a reliable permanent record of the projectile path, allowing rapid analysis of different firearms and projectiles. PMID:17205351

  20. A Comparative Analysis of some High Performance Computing Technologies

    Directory of Open Access Journals (Sweden)

    Minakshi Tripathy

    2014-10-01

    Full Text Available Computing is an evolutionary process. As part of this evolution, the computing requirements driven by applications have always outpaced the available technology. System designers have always been seeking faster and more efficient systems of computing. During the past decade, many different computer systems supporting high performance computing have emerged. Their taxonomy is based on how their processors, memory and interconnect are laid out. Today's applications require high computational power as well as high communication performance. High performance computing provides an approach to parallel processing that yields supercomputer-level performance, solving incredibly large and complex problems. This trend makes it very promising to build high performance computing environments with a cost effective approach.

  1. Summary of research in applied mathematics, numerical analysis, and computer sciences

    Science.gov (United States)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  2. The effectiveness of a computer-assisted instruction programme on communication skills of medical specialists in oncology.

    OpenAIRE

    Hulsman, R.L.; Ros, W.J.G.; Winnubst, J.A.M.; Bensing, J.

    2002-01-01

    Although doctor-patient communication is important in health care, medical specialists are generally not well trained in communication skills. Conventional training programmes are generally time consuming and hard to fit into the busy working schedules of medical specialists. A computer-assisted instruction (CAI) programme was developed - 'Interact-Cancer' - which is a time-efficient learning method and easily accessible at the workplace. To investigate the effect of the CAI training, 'Interact-Ca...

  3. CAVASS: a computer-assisted visualization and analysis software system.

    Science.gov (United States)

    Grevera, George; Udupa, Jayaram; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Iwanaga, Tad; Mishra, Shipra

    2007-11-01

    The Medical Image Processing Group at the University of Pennsylvania has been developing (and distributing with source code) medical image analysis and visualization software systems for a long period of time. Our most recent system, 3DVIEWNIX, was first released in 1993. Since that time, a number of significant advancements have taken place with regard to computer platforms and operating systems, networking capability, the rise of parallel processing standards, and the development of open-source toolkits. The development of CAVASS by our group is the next generation of 3DVIEWNIX. CAVASS will be freely available and open source, and it is integrated with toolkits such as the Insight Toolkit and the Visualization Toolkit. CAVASS runs on Windows, Unix, Linux, and Mac but shares a single code base. Rather than requiring expensive multiprocessor systems, it seamlessly provides for parallel processing via inexpensive clusters of workstations for the more time-consuming algorithms. Most importantly, CAVASS is directed at the visualization, processing, and analysis of 3-dimensional and higher-dimensional medical imagery, so support for Digital Imaging and Communications in Medicine (DICOM) data and the efficient implementation of algorithms is given paramount importance. PMID:17786517

  4. ANALYSIS OF INTEGRATED COMPUTER AIDED DESIGN SYSTEMS FOR CONSTRUCTION OBJECTS

    OpenAIRE

    Pavlov Aleksandr Sergeevich; Lavdansky Pavel Aleksandrovich; Ignatiev Oleg Vladimirovich

    2012-01-01

    The paper classifies construction computer-aided design systems, sets out the requirements for a single file format that provides for their interaction within integrated computer-aided design systems, and presents the main principles and data transformation schemes.

  5. High-resolution computed tomography of the lung in smokers: Visual and computer-based analysis

    International Nuclear Information System (INIS)

    Purpose: The aim of the study was to assess parenchymal changes in the lung with high-resolution CT in healthy heavy smokers, moderate smokers, and non-smokers. Material and methods: We prospectively evaluated CT changes in 42 healthy heavy smokers (group 2, ≥30 pack-years), 40 moderate smokers (group 1) and 38 non-smokers; computer-based analysis was performed with image analysis software (Kontron GmbH, Munich, Germany). Results: Productive cough, dyspnoea and chronic bronchitis were more common in smokers than in non-smokers (p<0.05). Pathological CT findings were found in 6/38 non-smokers and in 71/82 smokers (p<0.01). In particular, in smokers (group 1 [%], group 2 [%]) the following pathological findings were found: dystelectases in dependent lung areas in 50% (62, 38), centrilobular emphysema in 44% (43, 20), pleural thickening in 38% (38, 38), panlobular emphysema in 36% (52, 20), ground-glass pattern in 33% (36, 30), paraseptal emphysema in 21% (31, 10), prominent or thickened interlobular septa in 18% (29, 8) and centrilobular micronodules in 13% (10, 18). Computer-based analysis demonstrated thicker bronchial walls in smokers as compared to non-smokers. Conclusion: Although feeling healthy, smokers demonstrate various parenchymal abnormalities in the lung. In smokers, subpleural dystelectases, centrilobular and panlobular emphysema are dependent on cigarette consumption; ground-glass pattern, centrilobular micronodules, pleural thickening and bronchial wall thickening are independent of cigarette consumption. (orig.)

  6. Computer based imaging and analysis of root gravitropism

    Science.gov (United States)

    Evans, M. L.; Ishikawa, H.

    1997-01-01

    Two key issues in studies of the nature of the gravitropic response in roots have been the determination of the precise pattern of differential elongation responsible for downward bending and the identification of the cells that show the initial motor response. The main approach for examining patterns of differential growth during root gravitropic curvature has been to apply markers to the root surface and photograph the root at regular intervals during gravitropic curvature. Although these studies have provided valuable information on the characteristics of the gravitropic motor response in roots, their labor intensive nature limits sample size and discourages both high frequency of sampling and depth of analysis of surface expansion data. In this brief review we describe the development of computer-based video analysis systems for automated measurement of root growth and shape change and discuss some key features of the root gravitropic response that have been revealed using this methodology. We summarize the capabilities of several new pieces of software designed to measure growth and shape changes in graviresponding roots and describe recent progress in developing analysis systems for studying the small, but experimentally popular, primary roots of Arabidopsis. A key finding revealed by such studies is that the initial gravitropic response of roots of maize and Arabidopsis occurs in the distal elongation zone (DEZ) near the root apical meristem, not in the main elongation zone. Another finding is that the initiation of rapid elongation in the DEZ following gravistimulation appears to be related to rapid membrane potential changes in this region of the root. These observations have provided the incentive for ongoing studies examining possible links between potential growth modifying factors (auxin, calcium, protons) and gravistimulated changes in membrane potential and growth patterns in the DEZ.

  7. Genome Assembly and Computational Analysis Pipelines for Bacterial Pathogens

    KAUST Repository

    Rangkuti, Farania Gama Ardhina

    2011-06-01

    Pathogens lie behind the deadliest pandemics in history. To date, the AIDS pandemic has resulted in more than 25 million fatal cases, while tuberculosis and malaria annually claim more than 2 million lives. Comparative genomic analyses are needed to gain insights into the molecular mechanisms of pathogens, but the abundance of biological data dictates that such studies cannot be performed without the assistance of computational approaches. This explains the significant need for computational pipelines for genome assembly and analyses. The aim of this research is to develop such pipelines. This work utilizes various bioinformatics approaches to analyze the high-throughput genomic sequence data that has been obtained from several strains of bacterial pathogens. A pipeline has been compiled for quality control for sequencing and assembly, and several protocols have been developed to detect contaminations. Visualization has been generated of genomic data in various formats, in addition to alignment, homology detection and sequence variant detection. We have also implemented a metaheuristic algorithm that significantly improves bacterial genome assemblies compared to other known methods. Experiments on Mycobacterium tuberculosis H37Rv data showed that our method resulted in improvement of the N50 value of up to 9697% while consistently maintaining high accuracy, covering around 98% of the published reference genome. Other improvement efforts were also implemented, consisting of iterative local assemblies and iterative correction of contiguated bases. Our result expedites the genomic analysis of virulent genes up to single base pair resolution. It is also applicable to virtually every pathogenic microorganism, propelling further research in the control of and protection from pathogen-associated diseases.
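
    For readers unfamiliar with the N50 statistic quoted above, a minimal definition in code follows; this is only the metric, not any part of the assembly pipeline itself:

```python
def n50(contig_lengths):
    """N50: the contig length L such that contigs of length >= L
    together cover at least half of the total assembly length."""
    lengths = sorted(contig_lengths, reverse=True)
    half_total = sum(lengths) / 2
    running = 0
    for length in lengths:
        running += length
        if running >= half_total:
            return length
    return 0

print(n50([100, 80, 60, 40, 20]))  # -> 80, since 100 + 80 >= 150
```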

  8. A conversational system for the computer analysis of nucleic acid sequences.

    OpenAIRE

    Sege, R; Söll, D.; Ruddle, F H; Queen, C

    1981-01-01

    We present a conversational system for the computer analysis of nucleic acid and protein sequences based on the well-known Queen and Korn program (1). The system can be used by persons with only minimal knowledge of computers.

  9. Analysis on Cloud Computing Information Security Problems and the Countermeasures

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Cloud computing is one of the most popular terms in the present IT industry, as well as one of its most prosperous technologies. This paper introduces the concept, principles and characteristics of cloud computing, analyzes the information security problems resulting from cloud computing, and puts forward corresponding solutions.

  10. Computer analysis and comparison of chess players' game-playing styles

    OpenAIRE

    Krevs, Urša

    2015-01-01

    Today's computer chess programs are very good at evaluating chess positions. Research has shown that we can rank chess players by the quality of their game play, using a computer chess program. In the master's thesis Computer analysis and comparison of chess players' game-playing styles, we focus on the content analysis of chess games using a computer chess program's evaluation and attributes we determined for each individual position. We defined meaningful attributes that can be used for com...

  11. Parallelizing Genetic Linkage Analysis: A Case Study for Applying Parallel Computation in Molecular Biology

    OpenAIRE

    Nadkarni, Prakash; Gelernter, Joel E.; Carriero, Nicholas; Pakstis, Andrew J.; Kidd, Kenneth K.; Miller, Perry L.

    1990-01-01

    Parallel computers offer a solution to improve the lengthy computation time of many conventional, sequential programs used in molecular biology. On a parallel computer, different pieces of the computation are performed simultaneously on different processors. LINKMAP is a sequential program widely used by scientists to perform genetic linkage analysis. We have converted LINKMAP to run on a parallel computer, using the machine-independent parallel programming language, Linda. Using the parallel...

  12. Basic design of parallel computational program for probabilistic structural analysis

    International Nuclear Information System (INIS)

    In our laboratory, for the 'development of damage evaluation methods for structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we examine computational methods related to a super parallel computation system coupled with material strength theory based on microscopic fracture mechanics for latent cracks and a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, the basic terms of its formulas, and the parallel computation programming methods that relate to the principal elements in the basic design of the computational mechanics program. (author)

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  14. Computer-aided pulmonary image analysis in small animal models

    International Nuclear Information System (INIS)

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases
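
    A much-simplified sketch of the framework's first stage (an expected lung volume from a rib-cage regression, with a large deviation triggering the pattern-detection stage) is given below. The linear form, the 25% tolerance, and all names are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def fit_expected_volume(rib_cage_volumes, total_lung_capacities):
    """Fit TLC ~ a * rib_cage_volume + b from training scans."""
    a, b = np.polyfit(rib_cage_volumes, total_lung_capacities, deg=1)
    return a, b

def severe_pathology_suspected(rib_cage_volume, segmented_volume, a, b, tol=0.25):
    """Flag a scan when the initial segmentation deviates from the
    expected volume by more than `tol` (threshold is an assumption)."""
    expected = a * rib_cage_volume + b
    return abs(segmented_volume - expected) / expected > tol

a, b = fit_expected_volume(np.array([1.0, 1.2, 1.5]), np.array([0.8, 0.95, 1.2]))
print(severe_pathology_suspected(1.3, 0.55, a, b))  # True: far below expected
```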

  15. MMA, A Computer Code for Multi-Model Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.
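
    To make the default criteria concrete, the sketch below computes AIC, AICc and BIC for least-squares fits and converts them into posterior model probabilities with exponential weights. This is the textbook Gaussian-error form, shown as an assumption; MMA's own formulas include further terms (e.g., prior model probabilities and, for KIC, a Fisher-information term):

```python
import math

def aic(n, k, rss):
    """Akaike Information Criterion for a least-squares fit with n
    observations, k parameters, and residual sum of squares rss."""
    return n * math.log(rss / n) + 2 * k

def aicc(n, k, rss):
    """Second-order bias-corrected AIC."""
    return aic(n, k, rss) + 2 * k * (k + 1) / (n - k - 1)

def bic(n, k, rss):
    """Bayesian Information Criterion."""
    return n * math.log(rss / n) + k * math.log(n)

def model_probabilities(criteria):
    """Turn information-criterion values (one per calibrated model) into
    posterior model probabilities via exponential (Akaike-style) weights."""
    best = min(criteria)
    weights = [math.exp(-0.5 * (c - best)) for c in criteria]
    total = sum(weights)
    return [w / total for w in weights]

# Three hypothetical calibrated models of the same system:
crits = [aicc(30, k, rss) for k, rss in [(3, 12.1), (5, 9.8), (8, 9.5)]]
print(model_probabilities(crits))
```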

  16. Computer-aided pulmonary image analysis in small animal models

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J. [Center for Infectious Disease Imaging (CIDI), Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Bagci, Ulas, E-mail: ulasbagci@gmail.com [Center for Research in Computer Vision (CRCV), University of Central Florida (UCF), Orlando, Florida 32816 (United States); Kramer-Marek, Gabriela [The Institute of Cancer Research, London SW7 3RP (United Kingdom); Luna, Brian [Microfluidic Laboratory Automation, University of California-Irvine, Irvine, California 92697-2715 (United States); Kubler, Andre [Department of Medicine, Imperial College London, London SW7 2AZ (United Kingdom); Dey, Bappaditya; Jain, Sanjay [Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Foster, Brent [Department of Biomedical Engineering, University of California-Davis, Davis, California 95817 (United States); Papadakis, Georgios Z. [Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Camp, Jeremy V. [Department of Microbiology and Immunology, University of Louisville, Louisville, Kentucky 40202 (United States); Jonsson, Colleen B. [National Institute for Mathematical and Biological Synthesis, University of Tennessee, Knoxville, Tennessee 37996 (United States); Bishai, William R. [Howard Hughes Medical Institute, Chevy Chase, Maryland 20815 and Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Udupa, Jayaram K. [Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

    2015-07-15

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.

  17. Reliability analysis framework for computer-assisted medical decision systems

    International Nuclear Information System (INIS)

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
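
    The neighborhood idea is compact enough to sketch. Below, reliability is the CAD system's accuracy over the k known cases closest to the query in feature space; the fixed k is an assumption, whereas the paper selects an input-dependent set of relevant cases:

```python
import numpy as np

def local_reliability(query, known_features, known_labels, cad_outputs, k=15):
    """Reliability of the CAD decision for `query`: the system's accuracy
    on the k known cases nearest to the query in feature space
    (fixed k is an assumption; the paper selects cases dynamically)."""
    dists = np.linalg.norm(known_features - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return float(np.mean(cad_outputs[nearest] == known_labels[nearest]))
```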

  18. Computational analysis on plug-in hybrid electric motorcycle chassis

    Science.gov (United States)

    Teoh, S. J.; Bakar, R. A.; Gan, L. M.

    2013-12-01

    Plug-in hybrid electric motorcycle (PHEM) is an alternative to promote sustainability and lower emissions. However, the PHEM overall system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis applying the concept of a Chopper is analysed for application in a PHEM. The chassis 3-dimensional (3D) model is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. Besides that, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement and factor of safety (FOS). The data are used to identify the critical points, thus suggesting whether the chassis design is applicable or needs to be redesigned/modified to meet the required strength. Critical points mean the highest stress, which might cause the chassis to fail. For a motorcycle chassis, these points occur at the joints at the triple tree and the rear absorber bracket. As a conclusion, the computational analysis predicts the stress distribution and provides a guideline to develop a safe prototype chassis.

  19. Design of airborne wind turbine and computational fluid dynamics analysis

    Science.gov (United States)

    Anbreen, Faiqa

    Wind energy is a promising alternative to depleting non-renewable sources. The height of wind turbines becomes a constraint on their efficiency. An airborne wind turbine can reach much higher altitudes and produce higher power due to the high wind velocity and energy density. The focus of this thesis is to design a shrouded airborne wind turbine capable of generating 70 kW to propel a leisure boat with a capacity of 8-10 passengers. The idea of designing an airborne turbine is to take advantage of the higher velocities in the atmosphere. The SolidWorks model has been analyzed numerically using the Computational Fluid Dynamics (CFD) software StarCCM+. The Unsteady Reynolds-Averaged Navier-Stokes (URANS) simulation with the k-epsilon turbulence model has been selected to study the physical properties of the flow, with emphasis on the performance of the turbine and the increase in air velocity at the throat. The analysis has been done using two ambient velocities of 12 m/s and 6 m/s. At 12 m/s inlet velocity, the velocity of air at the turbine was recorded as 16 m/s. The power generated by the turbine is 61 kW. At an inlet velocity of 6 m/s, the velocity of air at the turbine increased to 10 m/s. The power generated by the turbine is 25 kW.
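
    As a sanity check on the quoted figures, the ideal-rotor estimate below applies P = 0.5 * rho * A * v^3 * Cp. The rotor diameter and power coefficient are assumptions invented for the sketch; the thesis obtains its numbers from CFD, not from this formula:

```python
import math

def wind_power_kw(velocity_ms, rotor_diameter_m, cp=0.4, rho=1.225):
    """Power extracted by a rotor disc: P = 0.5 * rho * A * v^3 * Cp."""
    area = math.pi * (rotor_diameter_m / 2.0) ** 2
    return 0.5 * rho * area * velocity_ms ** 3 * cp / 1000.0

# With an assumed 8 m diameter and Cp = 0.4, 16 m/s at the turbine gives
# roughly 50 kW, the same order as the 61 kW reported above.
print(f"{wind_power_kw(16.0, 8.0):.0f} kW")
```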

  20. Computational analysis of molt-inhibiting hormone from selected crustaceans.

    Science.gov (United States)

    C, Kumaraswamy Naidu; Y, Suneetha; P, Sreenivasula Reddy

    2013-12-01

    Molt-inhibiting hormone (MIH) is a principal endocrine hormone regulating the growth in crustaceans. In total, nine MIH peptide sequences representing members of the family Penaeidae (Penaeus monodon, Litopenaeus vannamei, Marsupenaeus japonicus), Portunidae (Portunus trituberculatus, Charybdis japonica, Charybdis feriata), Cambaridae (Procambarus bouvieri), Parastacidae (Cherax quadricarinatus) and Varunidae (Eriocheir sinensis) were selected for our study. In order to develop a structure based phylogeny, predict functionally important regions and to define stability changes upon single site mutations, the 3D structure of MIH for the crustaceans were built by using homology modeling based on the known structure of MIH from M. japonicus (1J0T). Structure based phylogeny showed a close relationship between P. bouvieri and C. japonica. ConSurf server analysis showed that the residues Cys(8), Arg(15), Cys(25), Asp(27), Cys(28), Asn(30), Arg(33), Cys(41), Cys(45), Phe(51), and Cys(54) may be functionally significant among the MIH of crustaceans. Single amino acid substitutions 'Y' and 'G' at the positions 71 and 72 of the MIH C-terminal region showed an alteration in the stability indicating that a change in this region may alter the function of MIH. In conclusion, we proposed a computational approach to analyze the structure, phylogeny and stability of MIH from crustaceans. PMID:24041714

  1. Recent Developments in Complex Analysis and Computer Algebra

    CERN Document Server

    Kajiwara, Joji; Xu, Yongzhi

    1999-01-01

    This volume consists of papers presented in the special sessions on "Complex and Numerical Analysis", "Value Distribution Theory and Complex Domains", and "Use of Symbolic Computation in Mathematics Education" of the ISAAC'97 Congress held at the University of Delaware, during June 2-7, 1997. The ISAAC Congress coincided with a U.S.-Japan Seminar also held at the University of Delaware. The latter was supported by the National Science Foundation through Grant INT-9603029 and the Japan Society for the Promotion of Science through Grant MTCS-134. It was natural that the participants of both meetings should interact and consequently several persons attending the Congress also presented papers in the Seminar. The success of the ISAAC Congress and the U.S.-Japan Seminar has led to the ISAAC'99 Congress being held in Fukuoka, Japan during August 1999. Many of the same participants will return to this Seminar. Indeed, it appears that the spirit of the U.S.-Japan Seminar will be continued every second year as part of...

  2. Reliability and safety analysis of redundant vehicle management computer system

    Institute of Scientific and Technical Information of China (English)

    Shi Jian; Meng Yixuan; Wang Shaoping; Bian Mengmeng; Yan Dungong

    2013-01-01

    Redundant techniques are widely adopted in vehicle management computers (VMC) to ensure that the VMC has high reliability and safety. At the same time, this gives the VMC special characteristics, e.g., failure correlation, event simultaneity, and failure self-recovery. Accordingly, the reliability and safety analysis of a redundant VMC system (RVMCS) becomes more difficult. Aimed at the difficulties in RVMCS reliability modeling, this paper adopts generalized stochastic Petri nets to establish the reliability and safety models of RVMCS. Then this paper analyzes RVMCS operating states and potential threats to the flight control system. It is verified by simulation that the reliability of a VMC is not the product of hardware reliability and software reliability, and that the interactions between hardware and software faults can reduce the real reliability of the VMC obviously. Furthermore, failure-undetected states and false-alarming states inevitably exist in an RVMCS due to the influences of limited fault monitoring coverage and the false alarming probability of fault monitoring devices (FMD). An RVMCS operating in some failure-undetected states will produce fatal threats to the safety of the flight control system. An RVMCS operating in some false-alarming states will reduce the utility of the RVMCS obviously. The results abstracted in this paper can guide reliable VMC and efficient FMD designs. The methods adopted in this paper can also be used to analyze the reliability of other intelligent systems.

  3. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonate vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
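
    To contrast the safety-factor view with a quantitative reliability measure, the sketch below estimates the probability of failure for a single limit state by crude Monte Carlo; the distributions are invented for illustration, and practical methods (e.g., FORM/SORM) are far more efficient:

```python
import random

def failure_probability(n_samples=100_000, seed=1):
    """Crude Monte Carlo estimate of P(load > capacity) for one limit
    state with normally distributed capacity R and load S (made-up numbers)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        capacity = rng.gauss(100.0, 10.0)  # resistance R
        load = rng.gauss(70.0, 15.0)       # load effect S
        if load > capacity:
            failures += 1
    return failures / n_samples

# A 'safety factor' of 100/70 ~ 1.43 still leaves a failure probability
# of roughly 5% for these (illustrative) scatter levels.
print(failure_probability())
```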

  4. A computer language for reducing activation analysis data

    International Nuclear Information System (INIS)

    A program, written in FORTRAN, which defines a language for reducing activation analysis data is described. An attempt was made to optimize the choice of commands and their definitions so as to concisely express what should be done, make the language natural to use and easy to learn, arrange a system of checks to guard against communication errors, and have the language be inclusive. Communications are effected through commands, and these can be given in almost any order. Consistency checks are done and diagnostic messages are printed automatically to guard against the incorrect use of commands. Default options on the commands allow instructions to be expressed concisely while providing a capability to specify details for the data reduction process. The program has been implemented on a UNIVAC 1108 computer. A complete description of the commands, the algorithms used, and the internal consistency checks used is given elsewhere. The applications of the program and the methods for obtaining data automatically have already been described. (T.G.)

  5. Applied and computational harmonic analysis on graphs and networks

    Science.gov (United States)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
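
    A minimal sketch of the graph-Fourier viewpoint follows, using the combinatorial Laplacian L = D - W of a 4-node path, the one setting in which (as the authors note) the frequency interpretation is exact:

```python
import numpy as np

def graph_fourier_basis(adjacency):
    """Eigenpairs of the combinatorial Laplacian L = D - W; the eigenvalues
    play the role of frequencies, the eigenvectors of Fourier basis vectors."""
    degrees = np.diag(adjacency.sum(axis=1))
    laplacian = degrees - adjacency
    return np.linalg.eigh(laplacian)  # ascending eigenvalues

# 4-node unweighted path: the eigenvectors are exactly the DCT-II basis.
W = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
freqs, basis = graph_fourier_basis(W)
signal = np.array([1.0, 2.0, 2.5, 2.0])
coeffs = basis.T @ signal  # graph Fourier transform of the signal
```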

  6. Automatic analysis of gamma spectra using a desk computer

    International Nuclear Information System (INIS)

    A code for the analysis of gamma spectra obtained with a Ge(Li) detector was developed for use with a desk computer (Hewlett-Packard Model 9810 A). The process is performed in a totally automatic way: data are conveniently smoothed and the background is generated by a convolutive equation. A calibration of the equipment with well-known standard sources gives the necessary data for fitting, by least squares, a third-degree equation relating energy to peak position. Criteria are given for determining whether certain groups of values constitute a peak or not, or whether it is a double line. All peaks are fitted to a Gaussian curve and, if necessary, decomposed into their components. Data entry is by punched tape, ASCII code. An alphanumeric printer provides (a) the position of the peak and its energy, (b) its resolution if it is larger than expected, (c) the area of the peak with its statistical error determined by the method of Wasson. As an option, the complete spectrum with the determined background can be plotted. (author)
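
    The calibration step amounts to an ordinary least-squares fit of a third-degree channel-to-energy polynomial. In the sketch below the 152Eu line energies are real, while the peak positions (channels) are invented for illustration:

```python
import numpy as np

# Hypothetical peak positions (channels) for known 152Eu lines (keV).
channels = np.array([121.3, 344.9, 778.6, 964.1, 1112.0, 1408.2])
energies = np.array([121.78, 344.28, 778.90, 964.08, 1112.07, 1408.01])

coeffs = np.polyfit(channels, energies, deg=3)  # third-degree least squares

def channel_to_energy(channel):
    return np.polyval(coeffs, channel)

print(f"channel 500 -> {channel_to_energy(500.0):.1f} keV")
```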

  7. Computer-assisted qualitative analysis

    Directory of Open Access Journals (Sweden)

    César A. Cisneros Puebla

    2003-01-01

    Full Text Available The aims of this article are: on the one hand, to present an approximation to the Hispano-American experience on Computer-Assisted Qualitative Data Analysis (CAQDAS), grouping as a systematization exercise the works carried out by several colleagues from related disciplines. Although attempting to be exhaustive and thorough - as in any attempt at systematizing experiences - this exercise presents clear lacks and omissions. On the other hand, to introduce some theoretical reflections about the role played by CAQDAS in the development of qualitative investigation after that systematization, with a specific focus on data generation.

  8. Computed tomography-based finite element analysis to assess fracture risk and osteoporosis treatment

    OpenAIRE

    Imai, Kazuhiro

    2015-01-01

    Finite element analysis (FEA) is a computer technique for structural stress analysis developed in engineering mechanics. FEA has been developed to investigate the structural behavior of human bones over the past 40 years. As faster computers were acquired, better FEA using 3-dimensional computed tomography (CT) was developed. This CT-based finite element analysis (CT/FEA) has provided clinicians with useful data. In this review, the mechanism of CT/FEA, validation studies of CT/FEA to e...

  9. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache - CEA, France)

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  10. A citation analysis of top research papers of computer science

    OpenAIRE

    Hussain, Akhtar; Swain, Dillip-K.

    2011-01-01

    The study intends to evaluate the top papers of Computer Science as reflected in Science Direct. Moreover, it aims to find out the authorship pattern, ranking of authors, ranking of country productivity, ranking of journals, and highly cited papers of Computer Science. The citation data have been collected from the quarterly list of the 25 hottest research articles in the subject field of Computer Science from the Science Direct database. In the present study, 20 issues of the alert service beginning fr...

  11. Analysis of activities for learning computer science unplugged

    OpenAIRE

    Zaviršek, Manca

    2013-01-01

    In the following thesis I delve into activities for learning computer science unplugged available at the Vidra website. Some activities are analyzed on the basis of the learning objectives of the Slovenian primary school curriculum for computer science and the ACM K-12 Computer Science Curriculum. The main objective of this thesis is to estimate how well the activities match both curricula. Within the thesis I analyze the goals of those activities in correlation to the revised Bloom's taxonomy. By means...

  12. Computational analysis of irradiation facilities at the JSI TRIGA reactor.

    Science.gov (United States)

    Snoj, Luka; Zerovnik, Gašper; Trkov, Andrej

    2012-03-01

    Characterization and optimization of irradiation facilities in a research reactor is important for optimal performance. Nowadays this is commonly done with advanced Monte Carlo neutron transport computer codes such as MCNP. However, the computational model in such calculations should be verified and validated with experiments. In the paper we describe the irradiation facilities at the JSI TRIGA reactor and demonstrate their computational characterization to support experimental campaigns by providing information on the characteristics of the irradiation facilities. PMID:22154389

  13. Computational modeling and impact analysis of textile composite structures

    Science.gov (United States)

    Hur, Hae-Kyu

    This study is devoted to the development of an integrated numerical modeling approach enabling one to investigate the static and dynamic behaviors and failures of 2-D textile composite as well as 3-D orthogonal woven composite structures weakened by cracks and subjected to static-, impact- and ballistic-type loads. As more complicated modeling of textile composite structures is introduced, homogenization schemes, geometrical modeling and crack propagation become more difficult problems to solve. To overcome these problems, this study presents effective mesh-generation schemes, homogenization modeling based on a repeating unit cell and sinusoidal functions, and also a cohesive element to study micro-crack shapes. This proposed research has two parts: (1) studying the behavior of textile composites under static loads, and (2) studying the dynamic responses of these textile composite structures subjected to transient/ballistic loading. In the first part, efficient homogenization schemes are suggested to show the influence of textile architectures on mechanical characteristics, considering the micro modeling of a repeating unit cell. Furthermore, the structures of multi-layered or multi-phase composites combined with different laminae, such as a sub-laminate, are considered to find the mechanical characteristics. A simple progressive failure mechanism for the textile composites is also presented. In the second part, this study focuses on three main phenomena to solve the dynamic problems: micro-crack shapes, textile architectures and textile effective moduli. To obtain good solutions to the dynamic problems, this research attempts to use four approaches: (I) determination of governing equations via a three-level hierarchy: micro-mechanical unit cell analysis, layer-wise analysis accounting for transverse strains and stresses, and structural analysis based on anisotropic plate layers, (II) development of an efficient computational approach enabling one to perform transient
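
    For orientation, the simplest possible homogenization (Voigt and Reuss rule-of-mixtures bounds) is sketched below. The thesis's repeating-unit-cell schemes are far more detailed; this only illustrates the idea of replacing a heterogeneous microstructure with effective properties:

```python
def modulus_bounds(e_fiber, e_matrix, v_fiber):
    """Voigt (parallel, upper) and Reuss (series, lower) bounds on the
    effective Young's modulus of a two-phase composite."""
    e_voigt = v_fiber * e_fiber + (1.0 - v_fiber) * e_matrix
    e_reuss = 1.0 / (v_fiber / e_fiber + (1.0 - v_fiber) / e_matrix)
    return e_voigt, e_reuss

# Carbon/epoxy-like values (assumed): 230 GPa fibers, 3.5 GPa matrix, 60% fibers.
print(modulus_bounds(230e9, 3.5e9, 0.6))
```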

  14. Computer Models for IRIS Control System Transient Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gary D. Storrick; Bojan Petrovic; Luca Oriani

    2007-01-31

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled “Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor” focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design – such as the lack of a detailed secondary system or I&C system designs – makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I&C development process

  16. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well established classical...... the projection pursuit is presented. Examples from remote sensing are given. The ACE algorithm for computing non-linear transformations for maximizing correlation is extended and applied to obtain a non-linear transformation that maximizes autocorrelation or 'signal' in a multivariate image. This is a...... generalization of the minimum/maximum autocorrelation factors (MAFs), which is a linear method. The non-linear method is compared to the linear method when analyzing a multivariate TM image from Greenland. The ACE method is shown to give a more detailed decomposition of the image than the MAF-transformation and...

  17. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I: Introduction to Digital Image Processing and Analysis. Digital Image Processing and Analysis: Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading. Computer Imaging Systems: Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading. Section II: Digital Image Analysis and Computer Vision. Introduction to Digital Image Analysis: Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Reading...

  18. An Analysis Of Underlying Competencies And Computer And Information Technology Learning Objectives For Business Analysis

    OpenAIRE

    Quigley, Ryan Thomas

    2013-01-01

    This research examines whether the Computer and Information Technology (CIT) department at Purdue University should develop a business analyst concentration. The differences between system and business analysts, evolution of the business analyst profession, job demand and trends, and applicable model curricula were explored to support this research. Review of relevant literature regarding the topics suggested that a business analyst concentration should be developed. A gap analysis was perfor...

  19. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of human error occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human errors on system safety. HRA requires task analysis as a preliminary step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analyzers. This problem makes the results of task analysis inconsistent and unreliable. To address it, KAERI developed the structural information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed for the purpose of supporting the performance of HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analyzers perform the analysis more easily and consistently. If more analyses are performed and more data are accumulated in the CASIA's database, HRA analyzers can freely share and smoothly spread their analysis experiences, and thereby the quality of HRA analysis will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  20. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer: the ubiquitous portal of work and personal lives. At this point, the computer is almost so common we don't notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  1. Analysis of high-tech methods of illegal remote computer data access

    OpenAIRE

    Polyakov, V. V.; Slobodyan, S. М.

    2007-01-01

    An analysis of high-tech methods of committing crimes in the sphere of computer information has been performed. In practice, the crimes were committed from remote computers. The virtual traces left by the use of such methods are identified. Specific proposals for the investigation and prevention of this type of computer intrusion are developed.

  2. The methods and computer structures for adaptive Fourier descriptive image analysis

    OpenAIRE

    V.Perzhu; A. Gurau

    1997-01-01

    New architectures of image processing computer systems, based on algorithms of Fourier-descriptive (FD) analysis, have been developed. A new method of organising computing processes on the basis of FD image features has been proposed. The structures of two problem-oriented optical-electronic computer systems have been developed, and an estimation of time expenditures in these systems has been carried out.

  3. Interface design of VSOP'94 computer code for safety analysis

    International Nuclear Information System (INIS)

    Today, most software applications, also in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system for simulating the life history of a nuclear reactor, devoted to education and research. One advantage of the VSOP program is its ability to calculate the neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integral, estimation of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and simulation of reactor safety. However, the existing VSOP is a conventional program, which was developed using Fortran 65 and has several problems in use; for example, it operates only on DEC Alpha mainframe platforms, provides text-based output, and is difficult to use, especially in data preparation and interpretation of results. We developed GUI-VSOP, an interface program that facilitates the preparation of data, runs the VSOP code and reads the results in a more user-friendly way, and is usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based interface for preprocessing aims to provide a convenient way of preparing data. The processing interface is intended to provide convenience in configuring input files and libraries and in compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in table and graphic forms. GUI-VSOP is expected to be useful in simplifying and speeding up the process and analysis of safety aspects

  4. Computer assisted sound analysis of arteriovenous fistula in hemodialysis patients.

    Science.gov (United States)

    Malindretos, Pavlos; Liaskos, Christos; Bamidis, Panagiotis; Chryssogonidis, Ioannis; Lasaridis, Anastasios; Nikolaidis, Pavlos

    2014-02-01

    The purpose of this study was to reveal the unique sound characteristics of the bruit produced by arteriovenous fistulae (AVF), using a computerized method. An electronic stethoscope (20 Hz to 20 000 Hz sensitivity) was used, connected to a portable laptop computer. Forty prevalent hemodialysis patients participated in the study. All measurements were made with patients resting in the supine position, prior to the initiation of a mid-week dialysis session. The standard color Doppler technique was used to estimate blood flow. Clinical examination revealed the surface where the perceived bruit was most intense, and the recording took place at a sample rate of 22 000 Hz in lossless WAV format. The Fast Fourier Transform (FFT) mathematical algorithm was used for the sound analysis. This algorithm is particularly useful in revealing the periodicity of sound data as well as in mapping its frequency behavior and its strength. Produced frequencies were divided into 40 frequency intervals, 250 Hz apart, so that the results would be easier to plot and comprehend. The mean age of the patients was 63.5 ± 14 years; the median time on dialysis was 39.6 months (min. 1 month, max. 200 months). The mean blood flow was 857.7 ± 448.3 ml/min. The mean sound frequency was approximately 5 500 Hz ± 4 000 Hz and the median, which also expresses the major peak of the sound data, was 750 Hz, varying from 250 Hz to 10 000 Hz. A possible limitation of the study is the relatively small number of participants. PMID:24619890
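
    The FFT workflow described here is easy to reproduce in outline. The sketch below is illustrative only: it synthesizes a bruit-like signal instead of reading a recorded WAV file, and the 750 Hz tone and noise level are invented; only the 22 000 Hz sample rate and the 40 x 250 Hz binning come from the study.

```python
import numpy as np

fs = 22_000                      # sample rate used in the study (Hz)
t = np.arange(0, 2.0, 1 / fs)    # 2 s of signal

# Hypothetical bruit: a 750 Hz dominant tone plus broadband noise.
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 750 * t) + 0.3 * rng.standard_normal(t.size)

# Real FFT and power spectrum.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
print(f"dominant frequency: {freqs[np.argmax(spectrum)]:.0f} Hz")

# Bin spectral power into 40 intervals of 250 Hz, as in the study's plots.
edges = np.arange(0, 10_001, 250)
binned = [spectrum[(freqs >= lo) & (freqs < hi)].sum()
          for lo, hi in zip(edges[:-1], edges[1:])]
print("strongest 250 Hz band starts at", edges[int(np.argmax(binned))], "Hz")
```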

  5. Computational identification and analysis of novel sugarcane microRNAs

    Directory of Open Access Journals (Sweden)

    Thiebaut Flávia

    2012-07-01

    Full Text Available Abstract Background MicroRNA-regulation of gene expression plays a key role in the development and response to biotic and abiotic stresses. Deep sequencing analyses accelerate the process of small RNA discovery in many plants and expand our understanding of miRNA-regulated processes. We therefore undertook small RNA sequencing of sugarcane miRNAs in order to understand their complexity and to explore their role in sugarcane biology. Results A bioinformatics search was carried out to discover novel miRNAs that can be regulated in sugarcane plants submitted to drought and salt stresses, and under pathogen infection. By means of the presence of miRNA precursors in the related sorghum genome, we identified 623 candidates of new mature miRNAs in sugarcane. Of these, 44 were classified as high confidence miRNAs. The biological function of the new miRNAs candidates was assessed by analyzing their putative targets. The set of bona fide sugarcane miRNA includes those likely targeting serine/threonine kinases, Myb and zinc finger proteins. Additionally, a MADS-box transcription factor and an RPP2B protein, which act in development and disease resistant processes, could be regulated by cleavage (21-nt-species and DNA methylation (24-nt-species, respectively. Conclusions A large scale investigation of sRNA in sugarcane using a computational approach has identified a substantial number of new miRNAs and provides detailed genotype-tissue-culture miRNA expression profiles. Comparative analysis between monocots was valuable to clarify aspects about conservation of miRNA and their targets in a plant whose genome has not yet been sequenced. Our findings contribute to knowledge of miRNA roles in regulatory pathways in the complex, polyploidy sugarcane genome.

  6. Analysis of Drafting Effects in Swimming Using Computational Fluid Dynamics

    Science.gov (United States)

    Silva, António José; Rouboa, Abel; Moreira, António; Reis, Victor Machado; Alves, Francisco; Vilas-Boas, João Paulo; Marinho, Daniel Almeida

    2008-01-01

    The purpose of this study was to determine the effect of drafting distance on the drag coefficient in swimming. A k-epsilon turbulent model was implemented in the commercial code Fluent® and applied to the fluid flow around two swimmers in a drafting situation. Numerical simulations were conducted for various distances between swimmers (0.5-8.0 m) and swimming velocities (1.6-2.0 m.s-1). The drag coefficient (Cd) was computed for each of the distances and velocities. We found that the drag coefficient of the leading swimmer decreased as the flow velocity increased. The relative drag coefficient of the back swimmer was lower (about 56% of the leading swimmer) for the smallest inter-swimmer distance (0.5 m). This value increased progressively until the distance between swimmers reached 6.0 m, where the relative drag coefficient of the back swimmer was about 84% of the leading swimmer. The results indicated that the Cd of the back swimmer was equal to that of the leading swimmer at distances ranging from 6.45 to 8.90 m. We conclude that these distances allow the swimmers to be in the same hydrodynamic conditions during training and competitions. Key points: (i) the drag coefficient of the leading swimmer decreased as the flow velocity increased; (ii) the relative drag coefficient of the back swimmer was least (about 56% of the leading swimmer) for the smallest inter-swimmer distance (0.5 m); (iii) the drag coefficient values of both swimmers in drafting were equal at distances ranging between 6.45 m and 8.90 m, considering the different flow velocities; (iv) numerical simulation techniques could be a good approach to enable the analysis of fluid forces around objects in water, as happens in swimming. PMID:24150135
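
    The drag coefficient itself follows from the standard drag equation, F = 0.5 * rho * v^2 * A * Cd. A minimal sketch of how Cd and the back swimmer's relative Cd would be computed; the force and area numbers below are invented for illustration, not the paper's data.

```python
def drag_coefficient(force_n, velocity_ms, area_m2, rho=1000.0):
    """Standard drag equation solved for Cd: F = 0.5 * rho * v^2 * A * Cd.

    rho defaults to water density (kg/m^3)."""
    return 2.0 * force_n / (rho * velocity_ms ** 2 * area_m2)

# Illustrative values only: drag forces from a hypothetical CFD run at
# 2.0 m/s for leading and drafting swimmers of equal frontal area.
area = 0.1               # hypothetical frontal area (m^2)
cd_lead = drag_coefficient(force_n=110.0, velocity_ms=2.0, area_m2=area)
cd_back = drag_coefficient(force_n=62.0, velocity_ms=2.0, area_m2=area)

# Relative drag coefficient of the back swimmer, the quantity the study
# reports (about 56% at 0.5 m separation, rising to ~84% at 6.0 m).
print(f"relative Cd = {cd_back / cd_lead:.0%}")
```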

  7. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  8. Analysis of Scheduling Algorithms in Grid Computing Environment

    Directory of Open Access Journals (Sweden)

    Farhad Soleimanian Gharehchopogh

    2013-11-01

    Grid computing is a technology for sharing heterogeneous resources across computer networks, based on distributed computing. Grid computing is not limited by geographical domain or by the type of underlying resources. Generally, a grid network can be considered as a collection of several big branches: different kinds of microprocessors and thousands of PCs and workstations all over the world. The goal of grid computing is to make available computing resources easy to apply to complicated calculations via geographically distributed sites; in other words, to support parallelism, minimize task completion times and reduce cost for many users in scientific, trade and industrial contexts. To reach this goal, it is necessary to use an efficient scheduling system as a vital part of the grid environment. Scheduling plays a very important role in grid networks, and selecting the type of scheduling algorithm strongly affects the optimization of response and waiting times, the two most important factors. Providing scheduling algorithms that can minimize task runtimes and increase throughput is therefore of remarkable importance. In this paper, we discuss scheduling algorithms for independent tasks, such as Minimum Execution Time, Minimum Completion Time, Min-min, Max-min and XSuffrage.
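
    Of the independent-task heuristics listed, Min-min is the easiest to state: repeatedly schedule the task whose earliest achievable completion time is smallest. A minimal sketch, with an invented expected-time-to-compute (ETC) matrix:

```python
import numpy as np

def min_min(etc):
    """Min-min scheduling for independent tasks.

    etc[i, j] = expected time to compute task i on machine j.
    Returns (task -> machine assignment, machine ready times).
    """
    n_tasks, n_machines = etc.shape
    ready = np.zeros(n_machines)          # machine available times
    unscheduled = set(range(n_tasks))
    schedule = {}
    while unscheduled:
        # For each unscheduled task, find its minimum-completion-time machine.
        best = {t: min(range(n_machines), key=lambda m: ready[m] + etc[t, m])
                for t in unscheduled}
        # Pick the task whose minimum completion time is smallest overall.
        task = min(unscheduled,
                   key=lambda t: ready[best[t]] + etc[t, best[t]])
        machine = best[task]
        ready[machine] += etc[task, machine]
        schedule[task] = machine
        unscheduled.remove(task)
    return schedule, ready

# Invented 4-task x 2-machine ETC matrix for demonstration.
etc = np.array([[4.0, 6.0], [3.0, 8.0], [9.0, 2.0], [5.0, 5.0]])
schedule, ready = min_min(etc)
print(schedule, "makespan:", ready.max())
```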

  9. Computer programs: Information retrieval and data analysis, a compilation

    Science.gov (United States)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  10. The analysis of control trajectories using symbolic and database computing

    Science.gov (United States)

    Grossman, Robert

    1995-01-01

    This final report comprises the formal semi-annual status reports for this grant for the periods June 30-December 31, 1993, January 1-June 30, 1994, and June 1-December 31, 1994. The research supported by this grant is broadly concerned with the symbolic computation, mixed numeric-symbolic computation, and database computation of trajectories of dynamical systems, especially control systems. A review of work during the report period covers: trajectories and approximating series, the Cayley algebra of trees, actions of differential operators, geometrically stable integration algorithms, hybrid systems, trajectory stores, PTool, and other activities. A list of publications written during the report period is attached.

  11. Computing realization of group analysis of FSI problems

    International Nuclear Information System (INIS)

    Following the rationale that the computational realization of mechanical-mathematical modelling of thin-walled structural components should harmonically supplement powerful theoretical (analytical) methods, the present paper reports an investigation comprising three components: (i) a powerful analytical method (the Lie group method); (ii) a computational method and numerical realization of the analytical investigation, including the code 'MAYA-MAXI' (to construct the corresponding group and compatibility conditions of the FSI problem); and (iii) results of an example of the computational design of a circular thin-walled nonlinear shell (of a PWR core). (orig.)

  12. A Cost-Benefit Analysis of a Campus Computing Grid

    OpenAIRE

    Smith, Preston M.

    2011-01-01

    Any major research institution has a substantial number of computer systems on its campus, often in the scale of tens of thousands. Given that a large amount of scientific computing is appropriate for execution in an opportunistic environment, a campus grid is an inexpensive way to build a powerful computational resource. What is missing, though, is a model for making an informed decision on the cost-effectiveness of a campus grid. In this thesis, the author describes a model for measuring the c...

  13. A computer program for planimetric analysis of digitized images

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O; Homøe, P

    1992-01-01

    Planimetrical measurements are made to calculate the area of an entity. By digitizing the entity, the planimetrical measurements may be done by computer. This computer program was developed in conjunction with a research project involving measurement of the pneumatized cell system of the temporal bones as seen on X-rays. By placing the X-rays on a digitizer tablet and tracing the outline of the cell system, the area was calculated by the program. The calculated data and traced images could be stored and printed. The program is written in BASIC; necessary hardware is an IBM-compatible personal computer, a digitizer tablet and a printer.

  14. Automatic behaviour analysis system for honeybees using computer vision

    DEFF Research Database (Denmark)

    Tu, Gang Jun; Hansen, Mikkel Kragh; Kryger, Per;

    2016-01-01

    ...a low-cost embedded computer with very limited computational resources compared to an ordinary PC. The system succeeds in counting honeybees, identifying their positions and measuring their in-and-out activity. Our algorithm uses a background subtraction method to segment the images. After the segmentation stage, the...... demonstrate that this system can be used as a tool to detect the behaviour of honeybees and assess their state at the beehive entrance. Besides, the computation-time results show that the Raspberry Pi is a viable solution for such a real-time video processing system....
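
    The record names background subtraction as the segmentation step. A minimal sketch of that general idea follows (a running-average background model with blob counting via scipy); it illustrates the technique, not the authors' Raspberry Pi implementation.

```python
import numpy as np
from scipy import ndimage

def count_moving_objects(frames, alpha=0.05, thresh=30):
    """Segment moving objects (e.g. bees) by background subtraction.

    frames: iterable of greyscale images as 2-D uint8 arrays.
    alpha:  learning rate of the running-average background model.
    thresh: foreground threshold on the absolute difference.
    """
    background = None
    counts = []
    for frame in frames:
        f = frame.astype(np.float32)
        if background is None:
            background = f.copy()      # first frame seeds the model
        # Foreground mask: pixels far from the slowly updated background.
        mask = np.abs(f - background) > thresh
        background = (1 - alpha) * background + alpha * f
        # Count connected foreground blobs as candidate objects.
        _, n_blobs = ndimage.label(mask)
        counts.append(n_blobs)
    return counts

# Synthetic demo: an empty 'entrance' frame, then one bright moving blob.
rng = np.random.default_rng(1)
frames = [rng.integers(0, 20, size=(64, 64), dtype=np.uint8)]
for x in (10, 20, 30, 40):
    img = rng.integers(0, 20, size=(64, 64), dtype=np.uint8)
    img[30:36, x:x + 6] = 255          # the moving 'bee'
    frames.append(img)
print(count_moving_objects(frames))    # e.g. [0, 1, 1, 1, 1]
```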

  15. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. The new cloud technologies also come with new challenges, one of which is the contextualization of computing resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), upload of users' virtual machine images is not possible. This precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  16. Computational Surprisal Analysis Speeds-Up Genomic Characterization of Cancer Processes

    OpenAIRE

    Kravchenko-Balasha, N.; Simon, Simcha; Levine, R. D.; Remacle, Françoise; Exman, Iaakov

    2014-01-01

    Surprisal analysis is increasingly being applied to the examination of transcription levels in cellular processes, towards revealing inner network structures and predicting response. But to achieve its full potential, surprisal analysis should be integrated into a wider range of computational tools. The purposes of this paper are to combine surprisal analysis with other important computational procedures, such as easy manipulation of the analysis results, e.g. to choose desirable result sub-sets ...

  17. Sensitivity Analysis and Error Control for Computational Aeroelasticity Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposal is the development of a next-generation computational aeroelasticity code, suitable for real-world complex geometries, and...

  18. Finite Element Analysis in Concurrent Processing: Computational Issues

    Science.gov (United States)

    Sobieszczanski-Sobieski, Jaroslaw; Watson, Brian; Vanderplaats, Garrett

    2004-01-01

    The purpose of this research is to investigate the potential application of new methods for solving large-scale static structural problems on concurrent computers. It is well known that traditional single-processor computational speed will be limited by inherent physical limits. The only path to achieve higher computational speeds lies through concurrent processing. Traditional factorization solution methods for sparse matrices are ill suited for concurrent processing because the null entries get filled, leading to high communication and memory requirements. The research reported herein investigates alternatives to factorization that promise a greater potential to achieve high concurrent computing efficiency. Two methods, and their variants, based on direct energy minimization are studied: a) minimization of the strain energy using the displacement method formulation; b) constrained minimization of the complementary strain energy using the force method formulation. Initial results indicated that in the context of the direct energy minimization the displacement formulation experienced convergence and accuracy difficulties while the force formulation showed promising potential.
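
    Approach (a), direct minimization of the strain energy in the displacement formulation, amounts to minimizing Pi(u) = 0.5*u'Ku - f'u, whose minimizer solves Ku = f; the conjugate gradient method does exactly this using only matrix-vector products, so no factorization fill-in occurs. A small sketch on an invented 1-D bar model (unit-stiffness elements, end load), not the paper's test problems:

```python
import numpy as np

def conjugate_gradient(K, f, tol=1e-10):
    """Minimize 0.5*u'Ku - f'u (i.e. solve Ku = f) without factorization.

    Only matrix-vector products are needed, so sparsity is preserved;
    this is the property that motivates energy minimization on
    concurrent machines."""
    u = np.zeros_like(f)
    r = f - K @ u                  # residual = negative energy gradient
    p = r.copy()
    rs = r @ r
    for _ in range(len(f)):
        Kp = K @ p
        alpha = rs / (p @ Kp)      # exact line search along p
        u += alpha * p
        r -= alpha * Kp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # K-conjugate search direction
        rs = rs_new
    return u

# Illustrative 1-D bar: n unit-stiffness elements, fixed at one end,
# unit axial load at the free end.
n = 10
K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
K[-1, -1] = 1.0
f = np.zeros(n)
f[-1] = 1.0
print(conjugate_gradient(K, f))    # displacement grows linearly along the bar
```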

  19. Spreadsheet Analysis Of Queuing In A Computer Network

    Science.gov (United States)

    Galant, David C.

    1992-01-01

    Method of analyzing responses of a computer network based on simple queuing-theory mathematical models via a spreadsheet program. Effects of variations in traffic, capacities of channels, and message protocols assessed.
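
    The queuing-theory models behind such a spreadsheet are compact; a sketch using the steady-state M/M/1 formulas (single channel, Poisson arrivals, exponential service), with invented traffic numbers rather than the report's data:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 queue formulas, the kind of model a
    spreadsheet analysis of a network channel would tabulate."""
    rho = arrival_rate / service_rate          # channel utilization
    assert rho < 1, "queue is unstable"
    return {
        "rho": rho,
        "L": rho / (1 - rho),                        # mean number in system
        "W": 1 / (service_rate - arrival_rate),      # mean time in system
        "Lq": rho ** 2 / (1 - rho),                  # mean queue length
        "Wq": rho / (service_rate - arrival_rate),   # mean wait before service
    }

# Spreadsheet-style what-if: vary traffic on a channel that serves
# 100 messages/s (illustrative numbers only).
for lam in (50, 80, 95):
    m = mm1_metrics(lam, 100)
    print(f"lambda={lam:3d}: util={m['rho']:.2f}  "
          f"response={m['W'] * 1e3:.1f} ms  queue={m['Lq']:.1f}")
```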

  20. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science gives considerable weight to conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss papers with peers. Previous work on knowledge mapping focused on mapping all sciences or a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and ...

  1. An Analysis of Chinese Laws Against Computer Crimes

    OpenAIRE

    Hakman A. Wan; Ming-Te Lu

    1997-01-01

    An overview of computer crime and related legislation in the People's Republic of China is given. Relevant laws and their interpretation by Chinese legal scholars with respect to negligence, trade secrets, copyright and piracy, and data protection and privacy are presented. The unique aspects of the Chinese legal system are accentuated. Due to differences in the cultural, political, and legal environments, the PRC courts may treat some computer crimes more severely and may hand out pena...

  2. Multi-scale analysis of lung computed tomography images

    OpenAIRE

    Gori, I.; Bagagli, F.; Fantacci, M. E.; Martinez, A. Preite; Retico, A.; De Mitri, I.; Donadio, S.; Fulcheri, C.; Gargano, G; Magro, R.; Santoro, M; Stumbo, S

    2009-01-01

    A computer-aided detection (CAD) system for the identification of lung internal nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 project. The three modules of our lung CAD system, a segmentation algorithm for lung internal region identification, a multi-scale dot-enhancement filter for nodule candidate selection and a multi-scale neural technique for false positive finding reduction, are described. The results obtained on...

  3. Review and analysis of Cloud Computing Quality of Experience

    OpenAIRE

    Safdari, Fash; Chang, Victor

    2014-01-01

    Cloud computing is gaining growing interest from industry, organizations and the research community. The technology promises many advantages in terms of cost saving by allowing organizations and users to gain remote access to a huge pool of data storage and processing power. However, migrating services to cloud computing introduces new challenges in performance and service quality. Quality of service (QoS) has been used as a means of monitoring cloud service performance. However, QoS...

  4. Cohomology of Lie Superalgebras of Hamiltonian Vector Fields: Computer Analysis

    OpenAIRE

    Kornyak, Vladimir V.

    1999-01-01

    We present the results of computation of cohomology for some Lie (super)algebras of Hamiltonian vector fields and related algebras. At present, the full cohomology rings for these algebras are not known even for the low-dimensional vector fields. The partial "experimental" results may give some hints for the solution of the whole problem. The computations have been carried out with the help of a recently written program in the C language. Some of the presented results are new.

  5. Computational analysis of promoters and DNA-protein interactions

    OpenAIRE

    Tomovic, Andrija

    2009-01-01

    The investigation of promoter activity and DNA-protein interactions is very important for understanding many crucial cellular processes, including transcription, recombination and replication. Promoter activity and DNA-protein interactions can be studied in the lab (in vitro or in vivo) or using computational methods (in silico). Computational approaches for analysing promoters and DNA-protein interactions have become more powerful as more and more complete genome sequences, 3D...

  6. Analysis of user interfaces and interactions with computers

    OpenAIRE

    PERČIČ, JAN

    2016-01-01

    This diploma thesis is a study of the evolution of user interfaces and human interaction with computers. The thesis offers an overview of examples from history and mentions people who were important for user interface development. At the same time, it examines current interaction principles and their potential evolution in the future. The goal was to define a potential ideal user interface, but because we use different types of computers in different situations, the conclusion was re...

  7. Computational analysis of difenoconazole interaction with soil chitinases

    International Nuclear Information System (INIS)

    This study focuses on the investigation of the potential binding of the fungicide difenoconazole to soil chitinases using a computational approach. Computational characterization of the substrate binding sites of Serratia marcescens and Bacillus cereus chitinases using the Fpocket tool reflects the role of hydrophobic residues in substrate binding and the high local hydrophobic density of both sites. A molecular docking study reveals that difenoconazole is able to bind to the Serratia marcescens and Bacillus cereus chitinase active sites, the binding energies being comparable

  8. Review and Analysis of Networking Challenges in Cloud Computing

    OpenAIRE

    Moura, Jose Andre; Hutchison, David

    2016-01-01

    Cloud Computing offers virtualized computing, storage, and networking resources, over the Internet, to organizations and individual users in a completely dynamic way. These cloud resources are cheaper, easier to manage, and more elastic than sets of local, physical, ones. This encourages customers to outsource their applications and services to the cloud. The migration of both data and applications outside the administrative domain of customers into a shared environment imposes transversal, f...

  9. mlegp: statistical analysis for computer models of biological systems using R

    OpenAIRE

    Dancik, Garrett M.; Dorman, Karin S

    2008-01-01

    Summary: Gaussian processes (GPs) are flexible statistical models commonly used for predicting output from complex computer codes. As such, GPs are well suited for the analysis of computer models of biological systems, which have been traditionally difficult to analyze due to their high-dimensional, non-linear and resource-intensive nature. We describe an R package, mlegp, that fits GPs to computer model outputs and performs sensitivity analysis to identify and characterize the effects of imp...
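
    mlegp itself is an R package; as a rough Python analogue of the same idea (fit a GP to computer-model output, then screen input sensitivities), one can sketch the following with scikit-learn. The toy model, design size, and one-at-a-time sensitivity screen below are all illustrative assumptions, not mlegp's interface.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Toy 'computer model': a deterministic code evaluated at a few design
# points (an illustrative stand-in for an expensive simulator).
def computer_model(x):
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(30, 2))       # design points
y = computer_model(X)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([1.0, 1.0]),
                              normalize_y=True).fit(X, y)

# Crude sensitivity screen: vary one input across its range while the
# other is held at its midpoint, and compare predicted output swings.
grid = np.linspace(0, 1, 50)
for dim in (0, 1):
    pts = np.full((50, 2), 0.5)
    pts[:, dim] = grid
    pred = gp.predict(pts)
    print(f"input {dim}: predicted output range {np.ptp(pred):.3f}")
```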

  10. Turing machines on represented sets, a model of computation for Analysis

    OpenAIRE

    Tavana, Nazanin; Weihrauch, Klaus

    2011-01-01

    We introduce a new type of generalized Turing machines (GTMs), which are intended as a tool for the mathematician who studies computability in Analysis. In a single tape cell a GTM can store a symbol, a real number, a continuous real function or a probability measure, for example. The model is based on TTE, the representation approach for computable analysis. As a main result we prove that the functions that are computable via given representations are closed under GTM programming. This gener...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  12. Computer-Assisted Law Instruction: Clinical Education's Bionic Sibling

    Science.gov (United States)

    Henn, Harry G.; Platt, Robert C.

    1977-01-01

    Computer-assisted instruction (CAI), like clinical education, has considerable potential for legal training. As an initial Cornell Law School experiment, a lesson in applying different corporate statutory dividend formulations, with a cross-section of balance sheets and other financial data, was used to supplement regular class assignments.…

  13. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

    I employed bibliometric- and historical-methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  14. CAPRI: A Geometric Foundation for Computational Analysis and Design

    Science.gov (United States)

    Haimes, Robert

    2006-01-01

    CAPRI is a software building tool-kit that refers to two ideas: (1) a simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. A complete definition of the geometry and application programming interface can be found in the document CAPRI: Computational Analysis PRogramming Interface appended to this report. In summary, the interface is subdivided into the following functional components: 1. Utility routines -- initialization of CAPRI, loading CAD parts, querying operational status, and closing the system down. 2. Geometry database queries -- functions that allow top-level applications to obtain detailed information on any geometric component in the Volume definition. 3. Point queries -- calls that allow grid generators, or solvers doing node adaptation, to snap points directly onto geometric entities. 4. Calculated or geometrically derived queries -- entry points that calculate data from the geometry to aid in grid generation. 5. Boundary data routines -- this part of CAPRI allows general data to be attached to Boundaries so that boundary conditions can be specified and stored within CAPRI's database. 6. Tag-based routines -- this part of the API allows the specification of properties associated with either the Volume (material properties) or Boundary (surface properties) entities. 7. Geometry-based interpolation routines -- this part of the API facilitates multi-disciplinary coupling and allows zooming through Boundary Attachments. 8. Geometric creation and manipulation -- these calls facilitate constructing simple solid entities and performing the Boolean solid operations; geometry constructed in this manner has the advantage that, if the data is kept consistent with the CAD package, a new design can be incorporated directly and is manufacturable. 9

  15. Cluster Computing For Real Time Seismic Array Analysis.

    Science.gov (United States)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allow one to measure the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years, arrays have been widely used in different fields of seismological research. In particular, they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying volcanic microtremor and long-period events, which are critical for getting information on the evolution of volcanic systems. For this reason, arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the quite time-consuming processing techniques have limited their potential for this application. In order to favor a direct application of array techniques to continuous volcano monitoring, we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 Intel Pentium-III bi-processor PCs working at 550 MHz, and has 4 Gigabytes of RAM memory. It runs under the Linux operating system. The analysis software package is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data over the internet and graphical applications for continuously displaying the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and southeast flanks of the volcano. A real-time continuous acquisition system has been simulated by
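
    The MUSIC step itself can be sketched compactly. The version below is the narrowband direction-of-arrival form for a uniform linear array, written in Python for brevity; the experiment used 2-D arrays, slowness vectors, and a Fortran/MPI implementation, so this is an illustration of the algorithm rather than that system.

```python
import numpy as np

def music_spectrum(snapshots, n_sources, scan_angles_deg, spacing=0.5):
    """Narrowband MUSIC pseudospectrum for a uniform linear array.

    snapshots: (n_sensors, n_snapshots) complex array data.
    spacing:   sensor spacing in wavelengths.
    """
    n_sensors = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]  # covariance
    eigvals, eigvecs = np.linalg.eigh(R)           # ascending eigenvalues
    En = eigvecs[:, : n_sensors - n_sources]       # noise subspace
    spectrum = []
    for theta in np.deg2rad(scan_angles_deg):
        # Steering vector for a plane wave arriving from direction theta.
        a = np.exp(2j * np.pi * spacing * np.arange(n_sensors)
                   * np.sin(theta))
        spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(spectrum)

# Synthetic test: two plane waves at -20 and 30 degrees on an 8-sensor array.
rng = np.random.default_rng(0)
n_sensors, n_snap = 8, 200
angles = np.deg2rad([-20, 30])
A = np.exp(2j * np.pi * 0.5 * np.outer(np.arange(n_sensors), np.sin(angles)))
S = rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))
X = A @ S + 0.1 * (rng.standard_normal((n_sensors, n_snap))
                   + 1j * rng.standard_normal((n_sensors, n_snap)))

scan = np.arange(-90, 91)
P = music_spectrum(X, n_sources=2, scan_angles_deg=scan)
idx = [i for i in range(1, len(P) - 1) if P[i] > P[i - 1] and P[i] > P[i + 1]]
idx.sort(key=lambda i: -P[i])                      # strongest local maxima
print("estimated DOAs (deg):", sorted(scan[i] for i in idx[:2]))
```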

  16. Nondestructive analysis of urinary calculi using micro computed tomography

    Directory of Open Access Journals (Sweden)

    Lingeman James E

    2004-12-01

    Abstract Background Micro computed tomography (micro CT) has been shown to provide exceptionally high-quality imaging of the fine structural detail within urinary calculi. We tested the idea that micro CT might also be used to identify the mineral composition of urinary stones non-destructively. Methods Micro CT x-ray attenuation values were measured for mineral that was positively identified by infrared microspectroscopy (FT-IR). To do this, human urinary stones were sectioned with a diamond wire saw. The cut surface was explored by FT-IR and regions of pure mineral were evaluated by micro CT to correlate x-ray attenuation values with mineral content. Additionally, intact stones were imaged with micro CT to visualize internal morphology and map the distribution of specific mineral components in 3-D. Results Micro CT images taken just beneath the cut surface of urinary stones showed excellent resolution of structural detail that could be correlated with structure visible in the optical image mode of FT-IR. Regions of pure mineral were not difficult to find by FT-IR for most stones, and such regions could be localized on micro CT images of the cut surface. This was not true, however, for two brushite stones tested; in these, brushite was closely intermixed with calcium oxalate. Micro CT x-ray attenuation values were collected for six minerals that could be found in regions that appeared to be pure, including uric acid (3515-4995 micro CT attenuation units, AU), struvite (7242-7969 AU), cystine (8619-9921 AU), calcium oxalate dihydrate (13815-15797 AU), calcium oxalate monohydrate (16297-18449 AU), and hydroxyapatite (21144-23121 AU). These AU values did not overlap. Analysis of intact stones showed excellent resolution of structural detail and could discriminate multiple mineral types within heterogeneous stones. Conclusions Micro CT gives excellent structural detail of urinary stones, and these results demonstrate the feasibility
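
    Because the reported attenuation ranges do not overlap, classifying a measured pure-mineral region reduces to a range lookup. The sketch below encodes the AU ranges published above; the lookup logic itself is an illustration, not the authors' software.

```python
# Micro CT attenuation ranges (AU) for pure-mineral regions, as reported
# in the study's results; the ranges do not overlap.
AU_RANGES = {
    "uric acid":                   (3515, 4995),
    "struvite":                    (7242, 7969),
    "cystine":                     (8619, 9921),
    "calcium oxalate dihydrate":   (13815, 15797),
    "calcium oxalate monohydrate": (16297, 18449),
    "hydroxyapatite":              (21144, 23121),
}

def classify_mineral(au_value):
    """Map a measured attenuation value to a mineral, if it falls
    inside one of the published ranges (illustrative logic only)."""
    for mineral, (lo, hi) in AU_RANGES.items():
        if lo <= au_value <= hi:
            return mineral
    return "indeterminate (between published ranges)"

print(classify_mineral(8800))     # -> cystine
print(classify_mineral(12000))    # -> indeterminate
```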

  17. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
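
    The four-step procedure can be mimicked end-to-end in a toy simulation; everything below (measurement model, matrix effect, noise level) is invented, and in a simulation step 3 is free because the truth is known by construction.

```python
import numpy as np

rng = np.random.default_rng(42)

# Step 1: draw a random sample of simulated waste items. 'true_mass' is
# the gold-standard quantity; 'matrix' is an extraneous waste attribute
# (e.g. matrix density) that biases the NDA reading. Values are invented.
n = 200
true_mass = rng.uniform(0.1, 10.0, n)      # grams of nuclear material
matrix = rng.uniform(0.5, 2.0, n)          # matrix density factor

# Step 2: simulate the NDA measurement: a matrix-dependent bias plus noise.
measured = true_mass * (1.0 - 0.08 * (matrix - 1.0)) \
           + rng.normal(0.0, 0.15, n)

# Step 3 is implicit: the simulation knows true_mass exactly.
# Step 4: regress measurement on truth to estimate bias; the residual
# scatter estimates precision for a randomly selected item.
slope, intercept = np.polyfit(true_mass, measured, 1)
residuals = measured - (slope * true_mass + intercept)
print(f"bias model: measured = {slope:.3f} * true + {intercept:.3f}")
print(f"precision (1-sigma): {residuals.std():.3f} g")
```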

  18. Secure distributed genome analysis for GWAS and sequence comparison computation

    Science.gov (United States)

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307
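
    Both statistics named in the competition tasks are simple once genotype counts are available in the clear; a plaintext sketch follows (genotypes coded 0/1/2 copies of the alternate allele; the secret-sharing machinery that was the point of the competition is not shown).

```python
import numpy as np

def minor_allele_frequency(genotypes):
    """MAF from genotypes coded as 0/1/2 copies of the alternate allele."""
    g = np.asarray(genotypes)
    p = g.sum() / (2 * g.size)          # alternate-allele frequency
    return min(p, 1 - p)

def chi_squared_allelic(case_genotypes, control_genotypes):
    """2x2 allelic chi-squared statistic for a case/control GWAS site."""
    def allele_counts(g):
        g = np.asarray(g)
        alt = g.sum()
        return np.array([2 * g.size - alt, alt])   # [ref, alt] counts
    obs = np.array([allele_counts(case_genotypes),
                    allele_counts(control_genotypes)], dtype=float)
    row, col, total = obs.sum(1, keepdims=True), obs.sum(0), obs.sum()
    exp = row * col / total             # expected counts under independence
    return ((obs - exp) ** 2 / exp).sum()

cases = [0, 1, 2, 2, 1, 1, 2, 0]        # invented genotype vectors
controls = [0, 0, 1, 0, 1, 0, 1, 0]
print(minor_allele_frequency(cases + controls))
print(chi_squared_allelic(cases, controls))
```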

  19. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
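
    A PNN is compact to write down: each class is scored by the average Gaussian kernel between the query point and that class's training points. The sketch below (invented 2-D data) also makes the report's cost argument visible, since classification cost grows with the number of calibration points; sigma is the smoothing parameter a nonlinear optimizer would tune.

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.3):
    """Probabilistic neural network: score each class by the average
    Gaussian kernel between x and that class's training points, then
    pick the highest-scoring class. Cost grows with len(train_X),
    which is why subsampling the calibration set saves computer time."""
    x, train_X = np.asarray(x), np.asarray(train_X)
    scores = {}
    for c in np.unique(train_y):
        pts = train_X[train_y == c]
        d2 = ((pts - x) ** 2).sum(axis=1)
        scores[c] = np.exp(-d2 / (2 * sigma ** 2)).mean()
    return max(scores, key=scores.get)

# Invented 2-D demo data: two classes around different centers.
rng = np.random.default_rng(3)
X0 = rng.normal([0, 0], 0.5, (50, 2))
X1 = rng.normal([2, 2], 0.5, (50, 2))
train_X = np.vstack([X0, X1])
train_y = np.array([0] * 50 + [1] * 50)
print(pnn_classify([0.2, -0.1], train_X, train_y))   # -> 0
print(pnn_classify([1.8, 2.2], train_X, train_y))    # -> 1
```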

  20. CAE - nuclear engineering analysis on work-station computers

    International Nuclear Information System (INIS)

    Emergence of the inexpensive and widely available 32-bit-work-station computer is revolutionizing the scientific and engineering computing environment. These systems reach or exceed threshold for many midscale nuclear applications and bridge the gap between the era of expensive computing: cheap people and the era of cheap computing: expensive people. Experience at the Idaho National Engineering Laboratory (INEL) has demonstrated the efficacy of this new computer technology. For the past 1 1/2 yr, a Hewlett-Packard 9000/540 32-bit multi-user microcomputer has been used to perform many calculations typical of a nuclear design effort. This system is similar with respect to performance and memory to such work stations as the SUN-3, HP-9000/32, or the Apollo DN-3000 that are available for under $20,000 for a fully configured single-user station. The system is being used for code development, model setup and checkout, and a full range of nuclear applications. Various one- and two-dimensional discrete ordinates transport codes are used on a routine basis. These include the well-known ANISN code as well as locally developed transport models. Typical one-dimensional multigroup calculations can be executed in clock times <10 min

  1. An Information Theoretic Analysis of Decision in Computer Chess

    CERN Document Server

    Godescu, Alexandru

    2011-01-01

    The basis of the method proposed in this article is the idea that information is one of the most important factors in strategic decisions, including decisions in computer chess and other strategy games. The model proposed in this article and the algorithm described are based on the idea of an information-theoretic basis for decision in strategy games. The model generalizes and provides a mathematical justification for one of the most popular search algorithms used in leading computer chess programs, the fractional-ply scheme. However, despite its success in leading computer chess applications, little has been published about this method until now. The article creates a fundamental basis for this method in the axioms of information theory, then derives the principles used in programming the search and describes mathematically the form of the coefficients. One of the most important parameters of the fractional-ply search is derived from fundamental principles. Until now this coefficient has usually been handcrafted...

  2. Low-frequency computational electromagnetics for antenna analysis

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K. (Los Alamos National Lab., NM (USA)); Burke, G.J. (Lawrence Livermore National Lab., CA (USA))

    1991-01-01

    An overview of low-frequency, computational methods for modeling the electromagnetic characteristics of antennas is presented here. The article presents a brief analytical background, and summarizes the essential ingredients of the method of moments, for numerically solving low-frequency antenna problems. Some extensions to the basic models of perfectly conducting objects in free space are also summarized, followed by a consideration of some of the same computational issues that affect model accuracy, efficiency and utility. A variety of representative computations are then presented to illustrate various modeling aspects and capabilities that are currently available. A fairly extensive bibliography is included to suggest further reference material to the reader. 90 refs., 27 figs.
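
    For a concrete first taste of the method of moments, the standard textbook exercise is electrostatic rather than a full antenna solve: a thin wire held at 1 V, pulse basis functions, and point matching. The sketch below follows that classic setup (dimensions invented; not the article's code).

```python
import numpy as np

# Method-of-moments sketch: charge distribution on a thin straight wire
# held at 1 V (the classic pulse-basis / point-matching exercise).
eps0 = 8.854e-12
L, a, N = 1.0, 1e-3, 100          # wire length (m), radius (m), segments
dz = L / N
z = (np.arange(N) + 0.5) * dz     # segment midpoints (match points)

# Impedance matrix: potential at match point m due to unit line charge
# density on segment n.
Z = np.empty((N, N))
for m in range(N):
    for n in range(N):
        if m == n:
            # Self term: exact potential of a uniform finite line charge
            # evaluated on the wire surface (radial distance a).
            Z[m, n] = (1 / (4 * np.pi * eps0)) * 2 * np.log(
                (dz / 2 + np.sqrt((dz / 2) ** 2 + a ** 2)) / a)
        else:
            Z[m, n] = dz / (4 * np.pi * eps0 * abs(z[m] - z[n]))

V = np.ones(N)                    # wire held at 1 volt
sigma = np.linalg.solve(Z, V)     # line charge density per segment
Q = (sigma * dz).sum()
print(f"capacitance ~ {Q / 1.0 * 1e12:.2f} pF")   # C = Q/V
```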

  3. Heat exchanger performance analysis programs for the personal computer

    International Nuclear Information System (INIS)

    Numerous utility industry heat exchange calculations are repetitive and thus lend themselves to being performed on a personal computer. These programs may be regarded as engineering tools which, when put together, can form a toolbox. However, the practicing Results Engineer in the utility industry desires not only programs that are robust as well as easy to use, but programs that can also be used both on desktop and laptop PCs. The latter also offer the opportunity to take the computer into the plant or control room, and use it there to process test or operating data right on the spot. Most programs evolve through the needs which arise in the course of day-to-day work. This paper describes several of the more useful programs of this type and outlines some of the guidelines to be followed when designing personal computer programs for use by the practicing Results Engineer
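
    A typical entry in such a toolbox is the log-mean temperature difference (LMTD) calculation for exchanger performance; a minimal sketch with invented terminal temperatures and an assumed overall heat-transfer coefficient:

```python
import math

def lmtd(dt_in, dt_out):
    """Log-mean temperature difference between hot and cold streams,
    given the terminal temperature differences at each end."""
    if abs(dt_in - dt_out) < 1e-9:
        return dt_in                       # limit as the two approach
    return (dt_in - dt_out) / math.log(dt_in / dt_out)

def duty(U, area, dt_in, dt_out):
    """Heat duty Q = U * A * LMTD (consistent SI units assumed)."""
    return U * area * lmtd(dt_in, dt_out)

# Illustrative counter-flow exchanger (values invented): hot stream
# 90 -> 60 C, cold stream 30 -> 50 C.
dt_in, dt_out = 90 - 50, 60 - 30           # terminal differences (K)
print(f"LMTD = {lmtd(dt_in, dt_out):.1f} K")
print(f"Q = {duty(U=800.0, area=15.0, dt_in=dt_in, dt_out=dt_out) / 1e3:.0f} kW")
```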

  4. Computational Speed-Up of Complex Durability Analysis of Large-Scale Composite Structures

    Energy Technology Data Exchange (ETDEWEB)

    Storaasli, Olaf O [ORNL; Abdi, Frank [Alpha STAR Corporation; Dutton, Renly [Alpha Star Corporation, Long Beach CA; Cochran, Ernest J [ORNL

    2008-01-01

    The analysis of modern structures for aerospace, infrastructure, and automotive engineering applications necessitates the use of larger and larger computational models for accurate prediction of structural response. The ever-increasing size of computational structural mechanics simulations imposes a pressing need for commensurate increases in computational speed to keep costs and computation times in check. Innovative methods are needed to expedite the numerical analysis of complex structures while minimizing computational costs. The need for these methodologies is even more critical when performing durability and damage tolerance evaluation as the computation is repeated a number of times for various loading conditions. This paper describes a breakthrough for efficient and accurate predictive methodologies that are amenable to the analysis of progressive failure, reliability, and optimization of large-scale composite structures or structural components.

  5. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    Science.gov (United States)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  6. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  7. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    CERN Document Server

    Öhman, H; The ATLAS collaboration; Hendrix, V

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. With the new cloud technologies also come new challenges, one of which is the contextualization of cloud resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform Google Compute Engine (GCE), upload of users' virtual machine images is not possible, which precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration, dynamic resource scaling, and a high degree of scalability.

  8. Verification of structural analysis computer codes in nuclear engineering

    International Nuclear Information System (INIS)

    Sources of potential errors that can arise during the use of finite-element-based computer programs are described in the paper. Error magnitudes were defined as acceptance criteria for these programs. Error sources are described as they are treated by the National Agency for Finite Element Methods and Standards (NAFEMS). Specific verification examples are taken from the literature of the Nuclear Regulatory Commission (NRC). Verification is illustrated on the PAFEC-FE computer code for seismic response analysis of piping systems by the response spectrum method. (author)
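
    As a toy instance of the response spectrum method named above, the sketch below combines peak modal responses by the square root of the sum of squares (SRSS); all modal data are invented for illustration, and mode-shape scaling is omitted.

```python
import numpy as np

def srss(modal_responses):
    """Square-root-of-sum-of-squares combination of peak modal responses."""
    r = np.asarray(modal_responses, dtype=float)
    return np.sqrt(np.sum(r**2))

# Peak response per mode = participation factor * spectral acceleration / omega^2
# (per unit mode shape); all numbers below are illustrative.
gamma = np.array([1.30, 0.45, 0.20])              # modal participation factors
sa = np.array([6.0, 9.5, 7.0])                    # spectral accelerations, m/s^2
omega = 2 * np.pi * np.array([4.0, 11.0, 23.0])   # natural frequencies, rad/s

peak_disp = gamma * sa / omega**2                 # peak modal displacements, m
print(f"SRSS displacement = {srss(peak_disp) * 1000:.2f} mm")
```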

  9. Multi-scale analysis of lung computed tomography images

    CERN Document Server

    Gori, I; Fantacci, M E; Preite Martinez, A; Retico, A; De Mitri, I; Donadio, S; Fulcheri, C; Gargano, G; Magro, R; Santoro, M; Stumbo, S; doi:10.1088/1748-0221/2/09/P09007

    2009-01-01

    A computer-aided detection (CAD) system for the identification of lung internal nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 project. The three modules of our lung CAD system, a segmentation algorithm for lung internal region identification, a multi-scale dot-enhancement filter for nodule candidate selection and a multi-scale neural technique for false positive finding reduction, are described. The results obtained on a dataset of low-dose and thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.
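
    A minimal sketch of the dot-enhancement idea, assuming the common Hessian-eigenvalue formulation (response |λ3|²/|λ1| for bright blob-like voxels, maximized over Gaussian scales). This is not the MAGIC-5 implementation, whose details the abstract does not give.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dot_enhance(volume, sigmas=(1.0, 2.0, 4.0)):
    """Multi-scale Hessian-based dot-enhancement filter for bright blobs.

    At each scale, scale-normalised second derivatives are taken with Gaussian
    smoothing; the response |l3|^2 / |l1| (eigenvalues l1 <= l2 <= l3 < 0)
    highlights nodule-like structures, and the per-voxel maximum over scales
    is returned.
    """
    out = np.zeros_like(volume, dtype=float)
    pairs = [(0, 0), (1, 1), (2, 2), (0, 1), (0, 2), (1, 2)]
    for s in sigmas:
        d = {}
        for i, j in pairs:
            order = [0, 0, 0]
            order[i] += 1
            order[j] += 1
            # gamma = 2 scale normalisation of the second derivatives
            d[(i, j)] = s**2 * gaussian_filter(volume, s, order=tuple(order))
        H = np.stack([
            np.stack([d[(0, 0)], d[(0, 1)], d[(0, 2)]], axis=-1),
            np.stack([d[(0, 1)], d[(1, 1)], d[(1, 2)]], axis=-1),
            np.stack([d[(0, 2)], d[(1, 2)], d[(2, 2)]], axis=-1),
        ], axis=-2)
        lam = np.linalg.eigvalsh(H)      # ascending: l1 <= l2 <= l3
        l1, l3 = lam[..., 0], lam[..., 2]
        # l3 < 0 implies all three eigenvalues are negative (bright blob).
        blob = np.where(l3 < 0, l3**2 / np.abs(np.minimum(l1, -1e-9)), 0.0)
        out = np.maximum(out, blob)
    return out
```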

  10. Multi-scale analysis of lung computed tomography images

    CERN Document Server

    Gori, I; Fantacci, M E; Preite Martinez, A; Retico, A; De Mitri, I; Donadio, S; Fulcheri, C

    2007-01-01

    A computer-aided detection (CAD) system for the identification of lung internal nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 project. The three modules of our lung CAD system, a segmentation algorithm for lung internal region identification, a multi-scale dot-enhancement filter for nodule candidate selection and a multi-scale neural technique for false positive finding reduction, are described. The results obtained on a dataset of low-dose and thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.

  11. The magazine Cais between protagonism and assistentialism: A critical discourse analysis

    Directory of Open Access Journals (Sweden)

    Viviane de Melo Resende

    2012-10-01

    As part of the results of an integrated project whose scope is to investigate, through discourse analysis, the practices involved in the production and distribution of five Portuguese-language publications devoted to homelessness, this article focuses on the magazine Cais, published in Lisbon, from the standpoint of Critical Discourse Analysis. A street paper, the magazine is sold on the street by people who are homeless or at risk, to whom 70% of the price of each copy sold reverts. More than a medium for communicating and publicizing social problems, this kind of press is believed to configure different positions and relations, and may thereby change the experience of exclusion. Taking excerpts from an interview with the magazine's editor as data, I explore the extent to which people experiencing homelessness participate in the production of Cais and in the representation of that very situation.

  12. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview: The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier-1 and Tier-2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts, monitoring the services and infrastructure as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction: A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, involving the centres and all experiments. The February test is dedicated to functionality checks, while the May challenge will consist of running at all centres with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives, as well as network links, are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  14. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction: The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy Ions: Heavy Ions has been actively analysing data and preparing for conferences. Operations Office (Figure 6: Transfers from all sites in the last 90 days): For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data taking. This was a large burden, especially on the team members. Nevertheless, the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been lighter as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and flexibility of resource use. Operations Office: Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month (Figure 1). The transfer system worked reliably and efficiently, transferring on average close to 520 TB per week with peaks close to 1.2 PB (Figure 2). Figure 3 shows the volume of data moved between CMS sites in the last six months. Tape utilisation was a focus for the operations teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  16. Radiative lifetimes of the A²Π, B²Σ⁺, C²Π and D²Σ⁺ states of the CaBr and CaI radicals

    International Nuclear Information System (INIS)

    The lifetimes of the A²Π, B²Σ⁺, C²Π and D²Σ⁺ states of the CaBr and CaI radicals have been measured directly from the time rate of fluorescence decay, using an appropriate pulsed dye laser as an excitation source. The zero-pressure radiative lifetimes, determined for the first time for the C²Π₁/₂ state of CaI and the D²Σ⁺ state of CaBr, were (50.5 ± 1.5) ns and (21.0 ± 2.6) ns, respectively. The measured lifetimes agree well with trends predicted by a ligand field (L-F) model. For the CaI C²Π₃/₂ state (τ = 31.2 ns at 1 mbar pressure) and D²Σ⁺ state (τ < 11 ns), anomalously short lifetimes are observed, indicating that these states are predissociated.
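
    Lifetimes of this kind are extracted by fitting a single-exponential decay I(t) = I₀·exp(−t/τ) to the fluorescence signal. The sketch below fits synthetic data, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, i0, tau):
    """Single-exponential fluorescence decay I(t) = I0 * exp(-t / tau)."""
    return i0 * np.exp(-t / tau)

# Illustrative synthetic data: a 50 ns lifetime with shot-to-shot noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 250.0, 200)                       # ns
signal = decay(t, 1000.0, 50.0) + rng.normal(0.0, 10.0, t.size)

popt, pcov = curve_fit(decay, t, signal, p0=(900.0, 40.0))
tau, dtau = popt[1], np.sqrt(pcov[1, 1])
print(f"fitted lifetime = {tau:.1f} +/- {dtau:.1f} ns")
```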

  17. Computer program performs statistical analysis for random processes

    Science.gov (United States)

    Newberry, M. H.

    1966-01-01

    Random Vibration Analysis Program /RAVAN/ performs statistical analysis on a number of phenomena associated with flight and captive tests, but can also be used in analyzing data from many other random processes.
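
    As a small example of the statistics such a program computes for random processes, the sketch below estimates a Welch power spectral density and the broadband RMS of a synthetic vibration record; RAVAN's actual outputs are not documented in this summary.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(1)
fs = 2000.0                                   # sampling rate, Hz
x = rng.normal(0.0, 1.0, 60 * int(fs))        # stand-in for a vibration record

# Welch power spectral density; broadband RMS is the square root of its integral.
f, psd = welch(x, fs=fs, nperseg=4096)
rms = np.sqrt(np.trapz(psd, f))
print(f"broadband RMS = {rms:.3f} (input sigma was 1.0)")
```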

  18. Computer analysis with the CEASEMT finite element system

    International Nuclear Information System (INIS)

    This section presents results for the analyses of all three international Piping Benchmark Problems. An inelastic analysis of each problem was performed using both a full three-dimensional shell analysis (TRICO code) and a simplified piping analysis based on beam theory (TEDEL code).
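
    A one-formula flavour of the simplified beam-theory approach (the TEDEL side of the comparison): midspan deflection of a simply supported pipe span under uniform load. The section dimensions and loading below are illustrative, not benchmark values.

```python
from math import pi

def pipe_moment_of_inertia(d_outer, wall):
    """Second moment of area for a hollow circular pipe section (m^4)."""
    d_inner = d_outer - 2 * wall
    return pi * (d_outer**4 - d_inner**4) / 64

def midspan_deflection(w, length, e_mod, inertia):
    """Euler-Bernoulli: simply supported span under uniform load w (N/m)."""
    return 5 * w * length**4 / (384 * e_mod * inertia)

# Illustrative numbers: roughly an 8-inch steel pipe over a 6 m span.
I = pipe_moment_of_inertia(0.219, 0.008)           # m^4
delta = midspan_deflection(1500.0, 6.0, 200e9, I)  # m
print(f"midspan deflection = {delta * 1000:.2f} mm")
```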

  19. Computational Analysis of Solvent Effects in NMR Spectroscopy.

    Science.gov (United States)

    Dračínský, Martin; Bouř, Petr

    2010-01-12

    Solvent modeling has become a standard part of first-principles computations of molecular properties. However, a universal solvent approach is particularly difficult for nuclear magnetic resonance (NMR) shielding and spin-spin coupling constants, which in part result from collective delocalized properties of the solute and the environment. In this work, bulk and specific solvent effects are discussed on experimental and theoretical model systems comprising solvated alanine zwitterion and chloroform molecules. Density functional theory computations performed on larger clusters indicate that standard dielectric continuum solvent models may not be sufficiently accurate. In some cases, more reasonable NMR parameters were obtained by approximating the solvent with partial atomic charges. Combined cluster/continuum models yielded the most reasonable values of the spectroscopic parameters, provided that they are dynamically averaged. The roles of solvent polarizability, solvent shell structure, and bulk permeability were investigated. NMR shielding values caused by the macroscopic solvent magnetizability exhibited the slowest convergence with respect to cluster size. For practical computations, however, inclusion of the first solvation sphere provided satisfactory corrections to the vacuum values. The simulations of chloroform chemical shifts and CH J-coupling constants were found to be very sensitive to the molecular dynamics model used to generate the cluster geometries. The results show that computationally efficient solvent modeling is possible and can reveal fine details of molecular structure, solvation, and dynamics. PMID:26614339
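
    The "dynamically averaged" parameters mentioned above amount to averaging computed shieldings over solvated-cluster snapshots and referencing them against a standard (δ = σ_ref − σ). A minimal sketch, with hypothetical shielding values rather than the paper's data:

```python
import numpy as np

def chemical_shifts(shieldings, sigma_ref):
    """Convert isotropic shieldings (ppm) to chemical shifts: delta = sigma_ref - sigma."""
    return sigma_ref - np.asarray(shieldings, dtype=float)

# Hypothetical isotropic shieldings of one proton, computed for a handful of
# MD-sampled solute/solvent clusters (ppm); sigma_ref stands in for a TMS reference.
snapshot_shieldings = [24.1, 23.6, 24.4, 23.9, 24.0]
sigma_ref = 31.5

deltas = chemical_shifts(snapshot_shieldings, sigma_ref)
print(f"dynamically averaged shift = {deltas.mean():.2f} +/- {deltas.std(ddof=1):.2f} ppm")
```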

  20. Sensitivity analysis of airport noise using computer simulation

    Directory of Open Access Journals (Sweden)

    Flavio Maldonado Bentes

    2011-09-01

    This paper presents a method for analysing the sensitivity of airport noise using computer simulation with the aid of the Integrated Noise Model 7.0. The technique supports the selection of alternatives for better control of aircraft noise, since it helps identify which areas of the noise contours vary most in response to changes in aircraft movements at a particular airport.
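
    The abstract does not detail the sensitivity computation, but a standard energy-sum model shows the idea: a hypothetical receptor-point Leq built from single-event levels (SEL) and daily movement counts, perturbed to see how the metric responds. All numbers are invented for illustration.

```python
import numpy as np

def leq_day(sel_db, movements, period_s=86400.0):
    """Equivalent continuous level from per-event SEL values and daily counts.

    Leq = 10*log10( sum_i N_i * 10^(SEL_i/10) / T ), the standard energy sum.
    """
    sel = np.asarray(sel_db, dtype=float)
    n = np.asarray(movements, dtype=float)
    return 10 * np.log10(np.sum(n * 10 ** (sel / 10)) / period_s)

# Hypothetical fleet mix: two aircraft types with different single-event levels.
sel = [95.0, 88.0]   # dB SEL at a receptor point
n = [40, 120]        # daily movements per type

base = leq_day(sel, n)
plus10 = leq_day(sel, [44, 132])   # 10% more movements of each type
print(f"Leq = {base:.1f} dB; +10% movements -> {plus10 - base:+.2f} dB")
```

    A uniform 10% increase in movements raises the energy sum by a factor 1.1, i.e. 10·log10(1.1) ≈ 0.41 dB, which the script reproduces.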