WorldWideScience

Sample records for analysis cai computer

  1. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    Energy Technology Data Exchange (ETDEWEB)

    Handler, B.H. (Oak Ridge K-25 Site, TN (USA)); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. (Oak Ridge Associated Universities, Inc., TN (USA)); Hunnum, W.H. (North Carolina Univ., Chapel Hill, NC (USA)); Smith, D.L. (Memphis State Univ., TN (USA))

    1990-07-01

Data Systems Engineering Organization (DSEO) personnel developed a prototype computer-aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model comprises related papers encompassing research on computer-aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevance to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  2. Computer Assisted Instruction (CAI) in Language Teaching

    Institute of Scientific and Technical Information of China (English)

    Xin; Jing

    2015-01-01

There are many ways to use computers for English language teaching. First of all, teachers can use them to prepare for classes. They can use a word processing program to write teaching materials and tests. They can use dictionaries, encyclopedias, etc., available on the computer as resources to help them prepare

  3. Curriculum planning and computer-assisted instruction (CAI) within clinical nursing education.

    OpenAIRE

    Perciful, E. G.

    1992-01-01

    Some experts in nursing and computers have stated that the integration of the computer within nursing education needs to be planned. It has also been declared that there is a need for a body of knowledge that describes the planning and implementing of CAI and the degree of success with the implementation of CAI within nursing education. There is a paucity of literature addressing the planning, implementing, and evaluation of CAI within clinical nursing education. The purpose of this paper is ...

  4. Role of Computer Assisted Instruction (CAI) in an Introductory Computer Concepts Course.

    Science.gov (United States)

    Skudrna, Vincent J.

    1997-01-01

    Discusses the role of computer assisted instruction (CAI) in undergraduate education via a survey of related literature and specific applications. Describes an undergraduate computer concepts course and includes appendices of instructions, flowcharts, programs, sample student work in accounting, COBOL instructional model, decision logic in a…

  5. Effects of Computer-Assisted Instruction (CAI) on 11th Graders' Attitudes to Biology and CAI and Understanding of Reproduction in Plants and Animals.

    Science.gov (United States)

    Soyibo, Kola; Hudson, Ann

    2000-01-01

    Investigates whether the use of the combination of lecture, discussion, and computer-assisted instruction (CAI) significantly improved students' attitudes toward biology and their understanding of reproduction in plants and animals. Studies grade 11 Jamaican female students (n=77) from two traditional high schools in Kingston. (Contains 19…

  6. Effectiveness of Computer Assisted Instructions (CAI) in Teaching of Mathematics at Secondary Level

    Science.gov (United States)

    Dhevakrishnan, R.; Devi, S.; Chinnaiyan, K.

    2012-09-01

The present study examined the effectiveness of computer-assisted instruction (CAI) in the teaching of mathematics at the secondary level, adopting an experimental method to observe the difference between CAI and the traditional method. A sample of sixty (60) students of class IX at VVB Matriculation Higher Secondary School at Elayampalayam, Namakkal district, was selected and divided into two groups, namely an experimental and a control group. The experimental group consisted of 30 students who were taught 'Mensuration' through computer-assisted instruction, and the control group comprised 30 students taught by the conventional method. Data were analyzed using the mean, S.D. and t-test. The findings of the study clearly point out a significant increase in the mean gain scores in the post-test scores of the experimental group. Significant differences were found between the control group and the experimental group on post-test gain scores. The experimental group, which was taught through CAI, showed better learning. The conclusion is evident that CAI is an effective medium of instruction for teaching mathematics to secondary students.
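
    A minimal sketch of the statistical comparison described above, assuming invented post-test scores rather than the study's data: an independent-samples t-test over two groups of 30, reported with means and standard deviations (Python, NumPy/SciPy).

        import numpy as np
        from scipy import stats

        # Hypothetical post-test scores; the group sizes of 30 follow the abstract.
        rng = np.random.default_rng(0)
        cai_scores = rng.normal(72, 8, 30)       # experimental (CAI) group
        control_scores = rng.normal(63, 8, 30)   # control (conventional) group

        print("CAI mean/SD:    ", cai_scores.mean(), cai_scores.std(ddof=1))
        print("control mean/SD:", control_scores.mean(), control_scores.std(ddof=1))
        t, p = stats.ttest_ind(cai_scores, control_scores)
        print(f"t = {t:.2f}, p = {p:.4f}")       # p < 0.05: significant group difference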

  7. The Effect of the Computer Assisted Instruction (CAI on Student Attitude in Mathematics Teaching of Primary School 8th Class and Views of Students towards CAI

    Directory of Open Access Journals (Sweden)

    Tuğba Hangül

    2010-12-01

Full Text Available The aim of this study is to investigate the effect of computer-assisted instruction (CAI) on student attitude for the subject of “Geometric Objects”, which is included in the eighth-grade mathematics curriculum, and to find out grade 8 primary school students’ views about computer-assisted instruction. In this study a pre-post attitude design with experimental and control groups was used. The research was carried out with control and experimental groups consisting of fifty-three eighth-grade students who were randomly identified in the 2009-2010 school year. The attitude scale was administered to both groups before and at the end of teaching. The constructivist method was applied to the control group while CAI was applied to the experimental group. After teaching, fourteen students who were randomly selected from the experimental group were interviewed. Quantitative data were analyzed using an independent-samples t-test and qualitative data were analyzed by descriptive analysis. At the end of the study, the data put forward that teaching through CAI improves students’ attitudes more positively than the constructivist method does, and that students have positive opinions on CAI.

  8. An investigative study into the effectiveness of using computer-aided instruction (CAI) as a laboratory component of college-level biology: A case study

    Science.gov (United States)

    Barrett, Joan Beverly

Community colleges serve the most diverse student populations in higher education. They consist of non-traditional, part-time, older, intermittent, and mobile students of different races, ethnic backgrounds, language preferences, physical and mental abilities, and learning style preferences. Students who are academically challenged may have diverse learning characteristics that are not compatible with the more traditional approaches to the delivery of instruction. With this need come new ways of solving the dilemma, such as computer-aided instruction (CAI). This case study investigated the use of CAI as a laboratory component of college-level biology in a small, rural community college setting. The intent was to begin to fill a void that seems to exist in the literature regarding the role of the faculty in the development and use of CAI. In particular, the investigator was seeking to understand the practice and its effectiveness, especially in helping the underprepared student. The case study approach was chosen to examine a specific phenomenon within a single institution. Ethnographic techniques, such as interviewing, documentary analysis, life experiences, and participant observation, were used to collect data about the phenomena being studied. Results showed that the faculty was primarily self-motivated and self-taught in its use of CAI as a teaching and learning tool. The importance of faculty leadership and collegiality was evident. Findings showed the faculty confident that expectations of helping students who have difficulties with mathematical concepts have been met and that CAI is becoming the most valuable of learning tools. In a traditional college classroom, or practice, time is the constant (semesters) and competence is the variable. In the CAI laboratory, time became the variable and competence the constant. The use of CAI also eliminated hazardous chemicals that were routinely used in the more traditional lab. Outcomes showed that annual savings

  9. The Effect of the Computer Assisted Instruction (CAI) on Student Attitude in Mathematics Teaching of Primary School 8th Class and Views of Students towards CAI

    OpenAIRE

    Tuğba Hangül; Devrim Uzel

    2010-01-01

    The aim of this study is to research the effect of the subject of “Geometric Objects” which is included in mathematics curriculum at the eighth grade on the student attitude using computer assisted instruction (CAI) and find out grade 8 primary school students’ views about the computer-assisted instruction. In this study the pre-post attitude with experimental control group design was performed. The research was done under control and experiment groups consisting of fifty-three eighth grade s...

  10. THERMAL HISTORY OF THE CARNIC ALPS (NE ITALY-S. AUSTRIA) USING CAI ANALYSIS

    Directory of Open Access Journals (Sweden)

    MONICA PONDRELLI

    2002-11-01

Full Text Available Thermal patterns of an area which underwent a polyphase deformation history, such as the Carnic Alps, were analyzed using the Colour Alteration Index (CAI) of conodonts in order to constrain some aspects of the metamorphic history of this part of the Southern Alps. Hercynian and alpine tectonothermal events were distinguished using CAI analysis. The Hercynian event developed temperatures up to low-grade metamorphic conditions. Alpine tectonogenesis did not produce thermal levels in excess of the diagenetic zone. Moreover, CAI patterns allow recognition and evaluation of a hydrothermal metamorphic overprint of Permo-Triassic or Oligocene age that was superimposed on the pre-existing regional metamorphic zonation.

  11. CAI in English Language Teaching

    Institute of Scientific and Technical Information of China (English)

    吴晓雅

    2009-01-01

CAI (Computer-Assisted Instruction), a kind of teaching methodology characterized by modern technology, has become a possibility in the application of modern education. CAI in ELT can provide various teaching environments and diverse study patterns which traditional teaching cannot achieve. This paper mainly introduces the importance of CAI and discusses its effective use in ELT. As a newly developed technique, CAI is both a challenge and an opportunity for our teachers. If we can grasp this chance properly, our English teaching will surely have a bright future.

  12. The Vibrio cholerae quorum-sensing autoinducer CAI-1: analysis of the biosynthetic enzyme CqsA

    Energy Technology Data Exchange (ETDEWEB)

Kelly, R.; Bolitho, M.; Higgins, D.; Lu, W.; Ng, W.; Jeffrey, P.; Rabinowitz, J.; Semmelhack, M.; Hughson, F.; Bassler, B.

    2009-01-01

Vibrio cholerae, the bacterium that causes the disease cholera, controls virulence factor production and biofilm development in response to two extracellular quorum-sensing molecules, called autoinducers. The strongest autoinducer, called CAI-1 (for cholera autoinducer-1), was previously identified as (S)-3-hydroxytridecan-4-one. Biosynthesis of CAI-1 requires the enzyme CqsA. Here, we determine the CqsA reaction mechanism, identify the CqsA substrates as (S)-2-aminobutyrate and decanoyl coenzyme A, and demonstrate that the product of the reaction is 3-aminotridecan-4-one, dubbed amino-CAI-1. CqsA produces amino-CAI-1 by a pyridoxal phosphate-dependent acyl-CoA transferase reaction. Amino-CAI-1 is converted to CAI-1 in a subsequent step via a CqsA-independent mechanism. Consistent with this, we find cells release ≥100 times more CAI-1 than amino-CAI-1. Nonetheless, V. cholerae responds to amino-CAI-1 as well as CAI-1, whereas other CAI-1 variants do not elicit a quorum-sensing response. Thus, both CAI-1 and amino-CAI-1 have potential as lead molecules in the development of an anticholera treatment.

  13. Using an Instructional Design Model for Developing a Multimedia CAI Courseware

    Directory of Open Access Journals (Sweden)

    Hsin-Yih Shyu

    1995-09-01

Full Text Available This article outlines a systematic instructional design model for developing multimedia computer-aided instruction (CAI) courseware. The model illustrates roles and tasks as two dimensions necessary in a CAI production team. Four major components (Analysis, Design, Development, and Revision/Evaluation), comprising 25 steps in total, are provided. Eight roles, each with its requisite skills, were identified. The model will be useful as a framework for developing multimedia CAI courseware for educators, instructional designers and CAI industry developers.

  14. In Situ Trace Element Analysis of an Allende Type B1 CAI: EK-459-5-1

    Science.gov (United States)

    Jeffcoat, C. R.; Kerekgyarto, A.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2014-01-01

    Variations in refractory major and trace element composition of calcium, aluminum-rich inclusions (CAIs) provide constraints on physical and chemical conditions and processes in the earliest stages of the Solar System. Previous work indicates that CAIs have experienced complex histories involving, in many cases, multiple episodes of condensation, evaporation, and partial melting. We have analyzed major and trace element abundances in two core to rim transects of the melilite mantle as well as interior major phases of a Type B1 CAI (EK-459-5-1) from Allende by electron probe micro-analyzer (EPMA) and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) to investigate the behavior of key trace elements with a primary focus on the REEs Tm and Yb.

  15. Computer-Assisted Instruction Under the Management of Individualized Instruction: A Classroom Management Approach of CAI

    Directory of Open Access Journals (Sweden)

    Sunny S. J. Lin

    1988-03-01

Full Text Available This article first reviews the development of Computer-Assisted Instruction (CAI) in Taiwan. The study describes the training of teachers from different levels of schools to design CAI courseware, and the planning of a CAI courseware bank holding 2,000 supplemental courseware items. A CAI classroom application system should be carefully established to prevent the easy abuse of a CAI courseware item as an instructional plan. The study also argues that steering CAI in our elementary and secondary education could rely on mastery learning as the instructional plan; in this case, CAI must limit its role to formative testing and remedial material only. In higher education, Keller's Personalized System of Instruction could be an effective classroom management system, with CAI offering study guides and formative tests only. Using these two instructional systems may enhance students' achievement and speed up the learning rate at the same time. Combining individualized instruction with CAI will be one of the most workable approaches in the current classroom. The author is setting up an experiment to verify their effectiveness and efficiency in the near future.

  16. A Pilot CAI Scheme for the Malaysian Secondary Education System.

    Science.gov (United States)

    Rao, A. Kanakaratnam; Rao, G. S.

    1982-01-01

    A multi-phase computer aided instruction (CAI) scheme for Malaysian Secondary Schools and Matriculation Centres attached to local universities is presented as an aid for improving instruction and for solving some problems presently faced by the Malaysian Secondary Education System. Some approaches for successful implementation of a CAI scheme are…

  17. The Relevance of AI Research to CAI.

    Science.gov (United States)

    Kearsley, Greg P.

    This article provides a tutorial introduction to Artificial Intelligence (AI) research for those involved in Computer Assisted Instruction (CAI). The general theme is that much of the current work in AI, particularly in the areas of natural language understanding systems, rule induction, programming languages, and socratic systems, has important…

  18. Natural gas diffusion model and diffusion computation in well Cai25 Bashan Group oil and gas reservoir

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

Natural gas diffusion through the cap rock is mainly by means of dissolving in water, so its concentration can be replaced by solubility, which varies with temperature, pressure and salinity in strata. Under certain geological conditions the maximal solubility is definite, so the diffusion computation can be handled approximately by a stable state equation. Furthermore, on the basis of the restoration of the paleo-burial history, the diffusion is calculated with the dynamic method, and the result is very close to the real diffusion value in the geological history.
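
    As a worked illustration of the stable-state approximation described above, a minimal sketch of Fick's first law with concentration replaced by solubility; every value below is an assumed placeholder, not data from well Cai25.

        # Steady-state diffusive flux through the cap rock: J = D * (c_base - c_top) / L
        D = 5.0e-10          # effective diffusion coefficient in cap rock, m^2/s (assumed)
        c_base = 2.5         # gas solubility at the cap-rock base, kg/m^3 (assumed)
        c_top = 0.0          # solubility sink at the cap-rock top, kg/m^3 (assumed)
        thickness = 200.0    # cap-rock thickness, m (assumed)

        flux = D * (c_base - c_top) / thickness      # kg per m^2 per second
        seconds_per_myr = 3.156e13
        print("diffusive loss:", flux * seconds_per_myr, "kg/m^2 per Myr")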

  19. Natural gas diffusion model and diffusion computation in well Cai25 Bashan Group oil and gas reservoir

    Institute of Scientific and Technical Information of China (English)

FANG Dequan

    2001-01-01

  20. Study on Teaching Strategies in Mathematics Education based on CAI

    Directory of Open Access Journals (Sweden)

    Wei Yan Feng

    2016-01-01

Full Text Available With the development of information technology and the popularization of the internet and mobile phones, the new media they represent are gradually influencing and changing people's study and life, becoming a centre of cultural information and social consensus. According to the China Internet Network Information Centre, young people are the main users of CAI (Computer-Assisted Instruction) and its most active group of customers. To fully understand the impact of the new media environment on students, this paper examines the higher mathematics education of college students under CAI and proposes CAI-based teaching strategies for the mathematics education of college students.

  1. USING COMPUTERS IN EDUCATION--SOME PROBLEMS AND SOLUTIONS.

    Science.gov (United States)

    SILBERMAN, HARRY F.

    POSSIBLE SOLUTIONS TO THE PROBLEM OF THE DESIGN OF COMPUTER-ASSISTED INSTRUCTION (CAI) PROGRAMS ARE TO COPY EXISTING METHODS, TO USE SCIENTIFIC METHODS, OR TO DESIGN PROGRAMS FITTED TO LOCAL NEEDS. THE BEST ANSWER TO THE PROBLEM OF INSTRUCTIONAL MANAGEMENT SYSTEMS NEEDED FOR CAI PROGRAMS IS COMPUTER ANALYSIS OF STUDENT PERFORMANCE DATA. TRAINING…

  2. The Graphics Terminal Display System; a Powerful General-Purpose CAI Package.

    Science.gov (United States)

Hornbeck, Frederick W.; Brock, Lynn

    The Graphic Terminal Display System (GTDS) was created to support research and development in computer-assisted instruction (CAI). The system uses an IBM 360/50 computer and interfaces with a large-screen graphics display terminal, a random-access slide projector, and a speech synthesizer. An authoring language, GRAIL, was developed for CAI, and…

  3. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fields.

  4. The Interplay between Different Forms of CAI and Students' Preferences of Learning Environment in the Secondary Science Class

    Science.gov (United States)

    Chang, Chun-Yen; Tsai, Chin-Chung

    2005-01-01

    This evaluation study investigated the effects of a teacher-centered versus student-centered computer-assisted instruction (CAI) on 10th graders' earth science student learning outcomes. This study also explored whether the effects of different forms of computer-assisted instruction (CAI) on student learning outcomes were influenced by student…

  5. A unified framework for producing CAI melting, Wark-Lovering rims and bowl-shaped CAIs

    Science.gov (United States)

    Liffman, Kurt; Cuello, Nicolas; Paterson, David A.

    2016-10-01

    Calcium-Aluminium inclusions (CAIs) formed in the Solar system, some 4567 million years ago. CAIs are almost always surrounded by Wark-Lovering rims (WLRs), which are a sequence of thin, mono/bi-mineralic layers of refractory minerals, with a total thickness in the range of 1-100 microns. Recently, some CAIs have been found that have tektite-like bowl-shapes. To form such shapes, the CAI must have travelled through a rarefied gas at hypersonic speeds. We show how CAIs may have been ejected from the inner solar accretion disc via the centrifugal interaction between the solar magnetosphere and the inner disc rim. They subsequently punched through the hot, inner disc rim wall at hypersonic speeds. This re-entry heating partially or completely evaporated the CAIs. Such evaporation could have significantly increased the metal abundances of the inner disc rim. High speed movement through the inner disc produced WLRs. To match the observed thickness of WLRs required metal abundances at the inner disc wall that are of order 10 times that of standard solar abundances. The CAIs cooled as they moved away from the protosun, the deduced CAI cooling rates are consistent with the CAI cooling rates obtained from experiment and observation. The speeds and gas densities required to form bowl-shaped CAIs are also consistent with the expected speeds and gas densities for larger, ˜1 cm, CAIs punching through an inner accretion disc wall.

  6. The Matriculation Science Curriculum of the USM in the Context of the PPI and CAI Modes of Instruction.

    Science.gov (United States)

    Cheng, Chuah Chong; Seng, Chin Pin

    1985-01-01

    Discusses philosophy, aims and objectives, and structure of the Matriculation Science Curriculum of the University Sains Malaysia. Includes comments on instructional strategies, individualized learning, programmed instruction, systems approach to computer-assisted instruction (CAI) implementation, CAI authoring system, and various program…

  7. Computational Music Analysis

    DEFF Research Database (Denmark)

This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today ... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering ... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.

  8. Computational Analysis of Behavior.

    Science.gov (United States)

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.

  10. Computational Music Analysis

    OpenAIRE

    2016-01-01

This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today in this intensely interdisciplinary field. A broad range of approaches are presented, employing techniques originating in disciplines such as linguistics, information theory, information retrieval, pattern recognition...

  11. CAIs in Semarkona (LL3.0)

    Science.gov (United States)

    Mishra, R. K.; Simon, J. I.; Ross, D. K.; Marhas, K. K.

    2016-01-01

Calcium, aluminum-rich inclusions (CAIs) are the first-forming solids of the Solar system. Their observed abundance, mean size, and mineralogy vary quite significantly between different groups of chondrites. These differences may reflect the dynamics and distinct cosmochemical conditions present in the region(s) of the protoplanetary disk from which each type likely accreted. Only about 11 such objects have been found in L and LL type ordinary chondrites, while another 57 have been found in H type ordinary chondrites, compared to thousands in carbonaceous chondrites. At issue is whether the rare CAIs contained in ordinary chondrites truly reflect a distinct population from the inclusions commonly found in other chondrite types. Semarkona (LL3.00) (fall, 691 g) is the most pristine chondrite available in our meteorite collections. Here we report the petrography and mineralogy of 3 CAIs from Semarkona.

  12. A Unified Framework for Producing CAI Melting, Wark-Lovering Rims and Bowl-Shaped CAIs

    CERN Document Server

    Liffman, Kurt; Paterson, David A

    2016-01-01

Calcium Aluminium Inclusions (CAIs) formed in the Solar System, some 4,567 million years ago. CAIs are almost always surrounded by Wark-Lovering Rims (WLRs), which are a sequence of thin, mono/bi-mineralic layers of refractory minerals, with a total thickness in the range of 1 to 100 microns. Recently, some CAIs have been found that have tektite-like bowl-shapes. To form such shapes, the CAI must have travelled through a rarefied gas at hypersonic speeds. We show how CAIs may have been ejected from the inner solar accretion disc via the centrifugal interaction between the solar magnetosphere and the inner disc rim. They subsequently punched through the hot, inner disc rim wall at hypersonic speeds. This re-entry heating partially or completely evaporated the CAIs. Such evaporation could have significantly increased the metal abundances of the inner disc rim. High speed movement through the inner disc produced WLRs. To match the observed thickness of WLRs required metal abundances at the inner disc wall that are of order 10 times that of standard solar abundances.

  13. New breast cancer prognostic factors identified by computer-aided image analysis of HE stained histopathology images.

    Science.gov (United States)

    Chen, Jia-Mei; Qu, Ai-Ping; Wang, Lin-Wei; Yuan, Jing-Ping; Yang, Fang; Xiang, Qing-Ming; Maskey, Ninu; Yang, Gui-Fang; Liu, Juan; Li, Yan

    2015-05-29

    Computer-aided image analysis (CAI) can help objectively quantify morphologic features of hematoxylin-eosin (HE) histopathology images and provide potentially useful prognostic information on breast cancer. We performed a CAI workflow on 1,150 HE images from 230 patients with invasive ductal carcinoma (IDC) of the breast. We used a pixel-wise support vector machine classifier for tumor nests (TNs)-stroma segmentation, and a marker-controlled watershed algorithm for nuclei segmentation. 730 morphologic parameters were extracted after segmentation, and 12 parameters identified by Kaplan-Meier analysis were significantly associated with 8-year disease free survival (P < 0.05 for all). Moreover, four image features including TNs feature (HR 1.327, 95%CI [1.001-1.759], P = 0.049), TNs cell nuclei feature (HR 0.729, 95%CI [0.537-0.989], P = 0.042), TNs cell density (HR 1.625, 95%CI [1.177-2.244], P = 0.003), and stromal cell structure feature (HR 1.596, 95%CI [1.142-2.229], P = 0.006) were identified by multivariate Cox proportional hazards model to be new independent prognostic factors. The results indicated that CAI can assist the pathologist in extracting prognostic information from HE histopathology images for IDC. The TNs feature, TNs cell nuclei feature, TNs cell density, and stromal cell structure feature could be new prognostic factors.
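
    The two segmentation steps named in the abstract can be sketched as follows; the raw-RGB features, marker detection and all parameter values are illustrative assumptions, not the authors' published settings.

        import numpy as np
        from sklearn import svm
        from scipy import ndimage as ndi
        from skimage import feature, filters, measure, segmentation

        def tn_stroma_mask(rgb, train_pixels, train_labels):
            """Pixel-wise SVM: classify every pixel of an HE image from its RGB values."""
            clf = svm.SVC(kernel="rbf").fit(train_pixels, train_labels)
            h, w, _ = rgb.shape
            return clf.predict(rgb.reshape(-1, 3)).reshape(h, w)

        def segment_nuclei(gray):
            """Marker-controlled watershed over a thresholded nuclei mask."""
            mask = gray < filters.threshold_otsu(gray)          # nuclei stain dark in HE
            distance = ndi.distance_transform_edt(mask)
            coords = feature.peak_local_max(distance, min_distance=5,
                                            labels=measure.label(mask))
            markers = np.zeros(gray.shape, dtype=int)
            markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
            nuclei = segmentation.watershed(-distance, markers, mask=mask)
            return measure.regionprops(nuclei)                  # per-nucleus morphology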

  14. Effective Computer Aided Instruction in Biomedical Science

    OpenAIRE

    Hause, Lawrence L.

    1985-01-01

    A menu-driven Computer Aided Instruction (CAI) package was integrated with word processing and effectively applied in five curricula at the Medical College of Wisconsin. Integration with word processing facilitates the ease of CAI development by instructors and was found to be the most important step in the development of CAI. CAI modules were developed and are currently used to reinforce lectures in medical pathology, laboratory quality control, computer programming and basic science reviews...

  15. Effectiveness of Cognitive Skills-Based Computer-Assisted Instruction for Students with Disabilities: A Synthesis

    Science.gov (United States)

    Weng, Pei-Lin; Maeda, Yukiko; Bouck, Emily C.

    2014-01-01

Computer-assisted instruction (CAI) for students with disabilities can be categorized into the following categories: visual, auditory, mobile, and cognitive skills-based CAI. Cognitive skills-based CAI differs from other types of CAI largely in terms of an emphasis on instructional design features. We conducted both a systematic review of…

  16. Analysis of computer programming languages

    International Nuclear Information System (INIS)

This research thesis aims to identify methods of syntax analysis which can be used for computer programming languages, while putting aside the computer devices which influence the choice of the programming language and of the methods of analysis and compilation. In the first part, the author proposes attempts at formalizing Chomsky grammar languages. In the second part, he studies analytic grammars, and then a compiler, or analytic grammar, for the Fortran language

  17. Computer analysis of railcar vibrations

    Science.gov (United States)

    Vlaminck, R. R.

    1975-01-01

    Computer models and techniques for calculating railcar vibrations are discussed along with criteria for vehicle ride optimization. The effect on vibration of car body structural dynamics, suspension system parameters, vehicle geometry, and wheel and rail excitation are presented. Ride quality vibration data collected on the state-of-the-art car and standard light rail vehicle is compared to computer predictions. The results show that computer analysis of the vehicle can be performed for relatively low cost in short periods of time. The analysis permits optimization of the design as it progresses and minimizes the possibility of excessive vibration on production vehicles.

  18. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks, and switch architectures and buffering strategies. The book provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for analyzing and modeling high-performance networks.

  19. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect, including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news

  20. Research on the Use of Computer-Assisted Instruction.

    Science.gov (United States)

    Craft, C. O.

    1982-01-01

    Reviews recent research studies related to computer assisted instruction (CAI). The studies concerned program effectiveness, teaching of psychomotor skills, tool availability, and factors affecting the adoption of CAI. (CT)

  1. Analysis of Zhang Cai's Narrative Prose from the Perspective of Biography

    Institute of Scientific and Technical Information of China (English)

    曾硕先

    2016-01-01

Zhang Cai was one of the two Zhangs who led the Fushe Association, the most influential organization in literary circles in the late Ming Dynasty. In thought and content, his narrative prose swept aside the disputes that had occupied the literary world since Wang Yangming, held to the study of the classics, and returned to the Cheng-Zhu school of Neo-Confucianism. As for narrative technique, he was good at exhibiting characters by placing them in special scenes, and at creating a solemn, serene and simple style. As Zhang Cai's representative prose genre, biographical writing offers a direct path to understanding the ancient-literature retro movement of the Fushe Association.

  2. Cai-Li Communication Protocol in Noisy Quantum Channel

    Institute of Scientific and Technical Information of China (English)

    L(U) Hua; YAN Xu-Dong; ZHANG Xian-Zhou

    2004-01-01

Since the original Cai-Li protocol [Chin. Phys. Lett. 21 (2004) 601] can be used only in an ideal quantum communication channel, we present a modified Cai-Li protocol that can be used in a noisy quantum channel by using Calderbank-Shor-Steane (CSS) codes to correct errors. We also give a tight bound on the connection between the information Eve eavesdrops with a measurement attack in line B → A and the detection probability, which shows that the Cai-Li protocol can be used as a quasi-secure direct quantum communication protocol.

  3. Development of an intelligent CAI system for a distributed processing environment

    International Nuclear Information System (INIS)

In order to operate a nuclear power plant optimally in both normal and abnormal situations, the operators are trained using an operator training simulator in addition to classroom instruction. Individual instruction using a CAI (Computer-Assisted Instruction) system has become popular as a method of learning plant information, such as plant dynamics, operational procedures, plant systems, plant facilities, etc. This paper outlines a proposed network-based intelligent CAI system (ICAI) incorporating multimedia PWR plant dynamics simulation, teaching aids and educational record management, using the following environment: existing standard workstations and graphic workstations with a live video processing function, the TCP/IP protocol of Unix through Ethernet, and the X Window System. (Z.S.) 3 figs., 2 refs

  4. Computer vision in microstructural analysis

    Science.gov (United States)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

The following is a laboratory experiment designed to be performed by advanced high-school and beginning-college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by the description of the system developed at Texas A&M University. This in turn will be followed by the description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.

  5. Computational Aeroacoustic Analysis System Development

    Science.gov (United States)

    Hadid, A.; Lin, W.; Ascoli, E.; Barson, S.; Sindir, M.

    2001-01-01

    Many industrial and commercial products operate in a dynamic flow environment and the aerodynamically generated noise has become a very important factor in the design of these products. In light of the importance in characterizing this dynamic environment, Rocketdyne has initiated a multiyear effort to develop an advanced general-purpose Computational Aeroacoustic Analysis System (CAAS) to address these issues. This system will provide a high fidelity predictive capability for aeroacoustic design and analysis. The numerical platform is able to provide high temporal and spatial accuracy that is required for aeroacoustic calculations through the development of a high order spectral element numerical algorithm. The analysis system is integrated with well-established CAE tools, such as a graphical user interface (GUI) through PATRAN, to provide cost-effective access to all of the necessary tools. These include preprocessing (geometry import, grid generation and boundary condition specification), code set up (problem specification, user parameter definition, etc.), and postprocessing. The purpose of the present paper is to assess the feasibility of such a system and to demonstrate the efficiency and accuracy of the numerical algorithm through numerical examples. Computations of vortex shedding noise were carried out in the context of a two-dimensional low Mach number turbulent flow past a square cylinder. The computational aeroacoustic approach that is used in CAAS relies on coupling a base flow solver to the acoustic solver throughout a computational cycle. The unsteady fluid motion, which is responsible for both the generation and propagation of acoustic waves, is calculated using a high order flow solver. The results of the flow field are then passed to the acoustic solver through an interpolator to map the field values into the acoustic grid. The acoustic field, which is governed by the linearized Euler equations, is then calculated using the flow results computed

  6. Computer aided safety analysis 1989

    International Nuclear Information System (INIS)

The meeting was conducted in a workshop style to encourage the involvement of all participants during the discussions. Forty-five (45) experts from 19 countries, plus 22 experts from the GDR, participated in the meeting. A list of participants can be found at the end of this volume. Forty-two (42) papers were presented and discussed during the meeting. Additionally, an open discussion was held on the possible directions of the IAEA programme on Computer Aided Safety Analysis. A summary of the conclusions of these discussions is presented in the publication. The remainder of this proceedings volume comprises the transcripts of selected technical papers (22) presented at the meeting. It is the intention of the IAEA that the publication of these proceedings will extend the benefits of the discussions held during the meeting to a larger audience throughout the world. The Technical Committee/Workshop on Computer Aided Safety Analysis was organized by the IAEA in cooperation with the National Board for Safety and Radiological Protection (SAAS) of the German Democratic Republic in Berlin. The purpose of the meeting was to provide an opportunity for discussions on experiences in the use of computer codes for safety analysis of nuclear power plants. In particular, it was intended to provide a forum for the exchange of information among experts using computer codes for safety analysis under the Technical Cooperation Programme on Safety of WWER Type Reactors (RER/9/004) and other experts throughout the world. A separate abstract was prepared for each of the 22 selected papers. Refs, figs, tabs and pictures

  7. The Product Mix of the Company Pekařství Cais

    OpenAIRE

    NOVÁKOVÁ, Iveta

    2011-01-01

The aim of my thesis was to describe the product mix of a chosen company. I chose the bakery of Vladimír Cais in Vlachovo Březí for this work. Another aim was to analyze the product portfolio by means of the Boston Matrix and to propose possible modifications of the product portfolio based on the results. A SWOT analysis and a product life cycle analysis were also compiled within the analytical part.

  8. A Mathematical Model for Project Planning and Cost Analysis in Computer Assisted Instruction.

    Science.gov (United States)

    Fitzgerald, William F.

    Computer-assisted instruction (CAI) has become sufficiently widespread to require attention to the relationships between its costs, administration and benefits. Despite difficulties in instituting them, quantifiable cost-effectiveness analyses offer several advantages. They allow educators to specify with precision anticipated instructional loads,…

  9. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, averages, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process, sparing the user the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, functionality rarely available in similar programs. Given its characteristic of quick prototyping and its simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
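
    A minimal sketch of one routine such a system provides, the classical (empirical) semivariogram, written in Python rather than the Delphi described above; the lag binning is an assumption.

        import numpy as np

        def empirical_semivariogram(coords, values, lag_edges):
            """coords: (n, 2) sample positions; values: (n,) measurements;
            lag_edges: bin edges for the separation distance h."""
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            sq = 0.5 * (values[:, None] - values[None, :]) ** 2
            i, j = np.triu_indices(len(values), k=1)     # count each pair once
            gamma = []
            for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
                sel = (d[i, j] >= lo) & (d[i, j] < hi)
                gamma.append(sq[i, j][sel].mean() if sel.any() else np.nan)
            return np.array(gamma)                       # gamma(h) per lag bin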

  10. Computational analysis of cerebral cortex

    Energy Technology Data Exchange (ETDEWEB)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)

    2010-08-15

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  11. Purification and Activity of Antibacterial Substances Derived from Soil Streptomyces sp. CaiF1

    Institute of Scientific and Technical Information of China (English)

    Hui YANG; Guixiang PENG; Jianmin ZENG; Zhiyuan TAN

    2012-01-01

[Objective] This study aimed to separate and purify antibacterial substances from soil Streptomyces sp. CaiF1, and to explore the activities of these substances. [Method] The antibacterial substances were separated and purified by ethyl acetate extraction, macroporous adsorptive resin, silica gel chromatography and preparative high-performance liquid chromatography (HPLC), and powdery mildew was taken as the indicator organism to study their activities. [Result] The antibacterial substances were purified, and stability analysis of the extracts from Streptomyces CaiF1 fermentation broth showed that they were very stable at pH 2.0-10.0 and at 100 °C, and changed very little under UV treatment for 24 h. The inhibition rate against powdery mildew was 69.7%. [Conclusion] The purified antibacterial substances showed good stability, which provides a theoretical foundation for their structural identification and future applications.

  12. Forensic Analysis of Compromised Computers

    Science.gov (United States)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
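
    The tool itself is a PERL script; purely as an illustration of the idea (walk a directory tree to a chosen depth and write per-file ownership and timestamp data, excluding directories, to a delimited text file for spreadsheet import), here is a rough Python equivalent. The field list and argument order are assumptions, not the tool's actual interface.

        import os
        import sys

        def dump_tree(root, out_path, max_depth):
            root = os.path.abspath(root)
            with open(out_path, "w") as out:
                out.write("path\tuid\tgid\tsize\tmtime\tatime\tctime\n")
                for dirpath, dirnames, filenames in os.walk(root):
                    depth = dirpath[len(root):].count(os.sep)
                    if depth >= max_depth:
                        dirnames[:] = []            # stop descending past max_depth
                    for name in filenames:          # files only, as in the original
                        p = os.path.join(dirpath, name)
                        st = os.lstat(p)
                        out.write(f"{p}\t{st.st_uid}\t{st.st_gid}\t{st.st_size}\t"
                                  f"{st.st_mtime}\t{st.st_atime}\t{st.st_ctime}\n")

        if __name__ == "__main__":
            dump_tree(sys.argv[1], sys.argv[2], int(sys.argv[3]))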

  13. COMPUTER ASSISTED INSTRUCTION AND ITS APPLICATION IN ENGLISH LEARNING

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

This paper briefly describes the development of computer-assisted instruction (CAI) abroad and in China, lists the advantages of CAI and deals with its application in English learning. Some suggestions about how to make better use of CAI in ELT are also given.

  14. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
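
    A minimal sketch of the solution-vector scheme described above, assuming a hypothetical Component class with name and update() members (PCTAP itself is C++): components are ordered once by inlet dependency, then updated in that order at every time step.

        from graphlib import TopologicalSorter

        def build_solution_vector(components, upstream):
            """upstream maps a component name to the names feeding its inlet."""
            graph = {c.name: upstream.get(c.name, ()) for c in components}
            order = TopologicalSorter(graph).static_order()   # feeders come first
            by_name = {c.name: c for c in components}
            return [by_name[n] for n in order]

        def run(solution_vector, dt, steps):
            for _ in range(steps):
                for comp in solution_vector:   # inlet-dependency order
                    comp.update(dt)            # executes the component's outlet function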

  15. Relationship between Pre-Service Music Teachers' Personality and Motivation for Computer-Assisted Instruction

    Science.gov (United States)

    Perkmen, Serkan; Cevik, Beste

    2010-01-01

    The main purpose of this study was to examine the relationship between pre-service music teachers' personalities and their motivation for computer-assisted music instruction (CAI). The "Big Five" Model of Personality served as the framework. Participants were 83 pre-service music teachers in Turkey. Correlation analysis revealed that three…

  16. Prof. Cai Shuming Receives 2005 Wetland Conservation Award

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

Prof. Cai Shuming, an expert in wetland studies from the CAS Institute of Geodesy & Geophysics, has been honored with a Ramsar Wetland Conservation Award in 2005. The announcement was made by the Standing Committee of the Ramsar Convention on June 10 in Gland, Switzerland.

  17. Phenotypic diversity and correlation between white-opaque switching and the CAI microsatellite locus in Candida albicans.

    Science.gov (United States)

    Hu, Jian; Guan, Guobo; Dai, Yu; Tao, Li; Zhang, Jianzhong; Li, Houmin; Huang, Guanghua

    2016-08-01

Candida albicans is a commensal fungal pathogen that is often found as part of the human microbial flora. The aim of the present study was to establish a relationship between diverse genotypes and phenotypes of clinical isolates of C. albicans. A total of 231 clinical isolates were collected and used for genotyping and phenotypic switching analysis. Based on the microsatellite locus (CAI) genotyping assay, 65 different genotypes were identified, and some dominant types were found in certain human niches. For example, the genotypes 30-44 and 30-45 were enriched in vaginal infection samples. C. albicans has a number of morphological forms including the single-celled yeast, multicellular filament, white, and opaque cell types. The relationship between the CAI genotype and the ability to undergo phenotypic switching was examined in the clinical isolates. We found that the strains with longer CAA/G repeats in both alleles of the CAI locus were more opaque-competent. We also discovered that some MTL heterozygous (a/alpha) isolates could undergo white-opaque switching when grown on regular culture medium (containing glucose as the sole carbon source). Our study establishes a link between phenotypic switching and genotypes of the CAI microsatellite locus in clinical isolates of C. albicans.
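
    As an illustration of the repeat measurement underlying a CAI genotype, a small sketch that counts the longest uninterrupted run of CAA/CAG triplets in an allele sequence; the sequence is invented.

        import re

        def cai_repeat_length(seq):
            """Longest uninterrupted (CAA|CAG)n run, in repeat units."""
            runs = re.findall(r"(?:CA[AG])+", seq.upper())
            return max((len(r) // 3 for r in runs), default=0)

        allele = "TTG" + "CAA" * 44 + "GTC"   # hypothetical allele with 44 repeats
        print(cai_repeat_length(allele))      # -> 44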

  18. Characterization of Meteorites by Focused Ion Beam Sectioning: Recent Applications to CAIs and Primitive Meteorite Matrices

    Science.gov (United States)

    Christoffersen, Roy; Keller, Lindsay P.; Han, Jangmi; Rahman, Zia; Berger, Eve L.

    2015-01-01

Focused ion beam (FIB) sectioning has revolutionized the preparation of meteorite samples for characterization by analytical transmission electron microscopy (TEM) and other techniques. Although FIB is not "non-destructive" in the purest sense, each extracted section amounts to no more than nanograms (approximately 500 cubic microns) removed intact from locations precisely controlled by SEM imaging and analysis. Physical alteration of surrounding material by ion damage, fracture or sputter contamination effects is localized to within a few micrometers around the lift-out point. This leaves adjacent material intact for coordinated geochemical analysis by SIMS, microdrill extraction/TIMS and other techniques. After lift-out, FIB sections can be quantitatively analyzed by electron microprobe prior to final thinning, by synchrotron x-ray techniques, and by the full range of state-of-the-art analytical field-emission scanning transmission electron microscope (FE-STEM) techniques once thinning is complete. Multiple meteorite studies supported by FIB/FE-STEM are currently underway at NASA-JSC, including coordinated analysis of refractory phase assemblages in CAIs and fine-grained matrices in carbonaceous chondrites. FIB sectioning of CAIs has uncovered epitaxial and other overgrowth relations between corundum-hibonite-spinel consistent with hibonite preceding corundum and/or spinel in non-equilibrium condensation sequences at combinations of higher gas pressures, dust-gas enrichments or significant nebular transport. For all of these cases, the ability of FIB to allow for coordination with spatially-associated isotopic data by SIMS provides immense value for constraining the formation scenarios of particular CAI assemblages. For carbonaceous chondrite matrix material, FIB has allowed us to obtain intact continuous sections of the immediate outer surface of Murchison (CM2) after it has been experimentally ion-processed to simulate solar wind space weathering. The surface

  19. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  20. Computational methods for global/local analysis

    Science.gov (United States)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  1. Adjustment computations spatial data analysis

    CERN Document Server

    Ghilani, Charles D

    2011-01-01

    The complete guide to adjusting for measurement error, expanded and updated. No measurement is ever exact. Adjustment Computations updates a classic, definitive text on surveying with the latest methodologies and tools for analyzing and adjusting errors, with a focus on least squares adjustments, the most rigorous methodology available and the one on which accuracy standards for surveys are based. This extensively updated Fifth Edition shares new information on advances in modern software and GNSS-acquired data. Expanded sections offer a greater number of computable problems and their worked solutions.
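
    To make the book's core technique concrete, here is a small weighted least squares adjustment of a hypothetical leveling network in Python with NumPy; the observations, weights, and network geometry are invented, while the normal-equation form x = (AᵀPA)⁻¹AᵀPl is the standard one such texts build on.

```python
import numpy as np

# hypothetical leveling network: datum A fixed at 0, unknown heights B and C;
# three observed height differences with weights = inverse variances
A = np.array([[ 1.0, 0.0],    # obs 1: B - A
              [-1.0, 1.0],    # obs 2: C - B
              [ 0.0, 1.0]])   # obs 3: C - A
l = np.array([2.345, 1.012, 3.366])   # observations (m)
P = np.diag([1.0, 2.0, 1.0])          # weight matrix

# weighted least squares: x = (A^T P A)^-1 A^T P l
x = np.linalg.solve(A.T @ P @ A, A.T @ P @ l)
v = A @ x - l                                   # residuals
s0_sq = (v @ P @ v) / (len(l) - len(x))         # reference variance
print("adjusted heights:", x, " residuals:", v, " s0^2:", s0_sq)
```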

  2. Error Analysis In Computational Elastodynamics

    Science.gov (United States)

    Mukherjee, Somenath; Jafarali, P.; Prathap, Gangan

    The Finite Element Method (FEM) is the mathematical tool of engineers and scientists for determining approximate solutions, in a discretised sense, of differential equations that are not always amenable to closed-form solutions. In this presentation, the mathematical aspects of this powerful computational tool as applied to the field of elastodynamics are highlighted, using the first principles of virtual work and energy conservation.

  3. An Analysis of the Channels in Agricultural Supply Chain Integrated with E-commerce: Evidence from "Cai Guan Jia"

    Institute of Scientific and Technical Information of China (English)

    王珂; 李震; 周建

    2014-01-01

    In order to analyze the impact of e-commerce on the distribution of agricultural products, this paper introduces the equilibrium theory of supply chain networks into the analysis of the agricultural supply chain. It first gives a brief introduction to the current operation of the agricultural e-commerce platform "Cai Guan Jia", and then focuses on the supply chain model in which the platform's online e-commerce channels coexist with traditional offline channels. On this basis, it applies the equilibrium theory of supply chain networks to study the impact of e-commerce participation on each partner in the chain. Finally, the paper presents numerical examples to illustrate that e-commerce channels can significantly reduce the distribution cost of agricultural products and promote the industrialization of agriculture.

  4. Computable Analysis with Applications to Dynamic Systems

    NARCIS (Netherlands)

    Collins, P.J.

    2010-01-01

    In this article we develop a theory of computation for continuous mathematics. The theory is based on earlier developments of computable analysis, especially that of the school of Weihrauch, and is presented as a model of intuitionistic type theory. Every effort has been made to keep the presentation …

  5. Silicon Isotopic Fractionation of CAI-like Vacuum Evaporation Residues

    Energy Technology Data Exchange (ETDEWEB)

    Knight, K; Kita, N; Mendybaev, R; Richter, F; Davis, A; Valley, J

    2009-06-18

    Calcium-, aluminum-rich inclusions (CAIs) are often enriched in the heavy isotopes of magnesium and silicon relative to bulk solar system materials. It is likely that these isotopic enrichments resulted from evaporative mass loss of magnesium and silicon from early solar system condensates while they were molten during one or more high-temperature reheating events. Quantitative interpretation of these enrichments requires laboratory determinations of the evaporation kinetics and associated isotopic fractionation effects for these elements. The experimental data for the kinetics of evaporation of magnesium and silicon and the evaporative isotopic fractionation of magnesium are reasonably complete for Type B CAI liquids (Richter et al., 2002, 2007a). However, the isotopic fractionation factor for silicon evaporating from such liquids has not been as extensively studied. Here we report new ion microprobe silicon isotopic measurements of residual glass from partial evaporation of Type B CAI liquids into vacuum. The silicon isotopic fractionation is reported as a kinetic fractionation factor, α_Si, corresponding to the ratio of the silicon isotopic composition of the evaporation flux to that of the residual silicate liquid. For CAI-like melts, we find that α_Si = 0.98985 ± 0.00044 (2σ) for ²⁹Si/²⁸Si, with no resolvable variation with temperature over the range of the experiments, 1600-1900 °C. This value differs from what has been reported for evaporation of liquid Mg2SiO4 (Davis et al., 1990) and of a melt with CI chondritic proportions of the major elements (Wang et al., 2001). There appears to be some compositional control on α_Si, whereas no compositional effects have been reported for α_Mg. We use the values of α_Si and α_Mg to calculate the chemical compositions of the unevaporated precursors of a number of isotopically fractionated CAIs from CV chondrites whose …
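
    Assuming the standard Rayleigh distillation law R/R0 = f^(α-1) (a common assumption for such residues, not stated in the abstract itself), the reported α_Si implies the following heavy-isotope enrichment of the residual melt as evaporation proceeds; a small worked example:

```python
import numpy as np

alpha_si = 0.98985  # kinetic fractionation factor for 29Si/28Si (this record)

def delta29si_residue(f_remaining: float) -> float:
    """Heavy-isotope enrichment of the residual melt, in permil,
    from the Rayleigh law R/R0 = f**(alpha - 1)."""
    return (f_remaining ** (alpha_si - 1.0) - 1.0) * 1000.0

for f in (0.9, 0.5, 0.2):
    print(f"{1 - f:.0%} Si evaporated -> d29Si = {delta29si_residue(f):+.2f} permil")
```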

  6. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  7. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  8. The ethnoecology of Caiçara metapopulations (Atlantic Forest, Brazil): ecological concepts and questions

    OpenAIRE

    Begossi Alpina

    2006-01-01

    Abstract The Atlantic Forest is represented on the coast of Brazil by approximately 7.5% of remnants, much of it concentrated on the country's SE coast. Within these southeastern remnants, we still find the coastal Caiçaras, who descend from Native Indians and Portuguese colonizers. The maintenance of such populations, and their existence in spite of the deforestation that occurred on the Atlantic Forest coast, deserves special attention and analysis. In this study, I address, in particula...

  9. The Effects of Computer-Assisted Instruction Based on Top-Level Structure Method in English Reading and Writing Abilities of Thai EFL Students

    Science.gov (United States)

    Jinajai, Nattapong; Rattanavich, Saowalak

    2015-01-01

    This research aims to study the development of ninth grade students' reading and writing abilities and interests in learning English taught through computer-assisted instruction (CAI) based on the top-level structure (TLS) method. An experimental group time series design was used, and the data was analyzed by multivariate analysis of variance…

  10. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations
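
    As a toy illustration of the distributed parallel processing mentioned above, the sketch below farms a Monte Carlo estimate out to worker processes with Python's multiprocessing module; the pi estimate stands in for a long-running reactor calculation and is not taken from the record.

```python
import random
from multiprocessing import Pool

def mc_hits(args):
    """One worker: count random points falling inside the unit quarter circle."""
    n, seed = args
    rng = random.Random(seed)
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

if __name__ == "__main__":
    n_per_task, tasks = 100_000, 8
    with Pool() as pool:              # worker processes stand in for workstations
        hits = pool.map(mc_hits, [(n_per_task, seed) for seed in range(tasks)])
    print("pi estimate:", 4.0 * sum(hits) / (n_per_task * tasks))
```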

  11. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  12. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
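
    GRESS itself is a FORTRAN compiler extension, but the computer-calculus idea it relies on, propagating derivatives alongside values, can be sketched in a few lines of Python with dual numbers; this is a generic forward-mode illustration, not the GRESS implementation.

```python
from dataclasses import dataclass

def _lift(x):
    return x if isinstance(x, Dual) else Dual(float(x))

@dataclass
class Dual:
    """A value and its derivative, propagated together (forward-mode AD)."""
    val: float
    der: float = 0.0

    def __add__(self, other):
        other = _lift(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = _lift(other)  # product rule for the derivative part
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

# sensitivity of the response y = 3k^2 + 2k to the parameter k at k = 1.5
k = Dual(1.5, 1.0)            # seed dk/dk = 1
y = 3 * k * k + 2 * k
print(y.val, y.der)           # 9.75 11.0  (dy/dk = 6k + 2)
```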

  13. The Intelligent CAI System for Chemistry Based on Automated Reasoning

    Institute of Scientific and Technical Information of China (English)

    王晓京; 张景中

    1999-01-01

    A new type of intelligent CAI system for chemistry, based on automated reasoning with chemistry knowledge, is developed in this paper. The system has shown its ability to solve chemistry problems and to assist students and teachers in study and instruction through its automated reasoning functions. Its open knowledge base and its unique style of human-system interface give users more opportunities to acquire living knowledge through active participation. The automated reasoning based on basic chemistry knowledge also opens a new approach to information storage and management in ICAI systems for the sciences.

  14. Computer aided nonlinear electrical networks analysis

    Science.gov (United States)

    Slapnicar, P.

    1977-01-01

    Techniques used in simulating an electrical circuit with nonlinear elements for use in computer-aided circuit analysis programs are described. Elements of the circuit include capacitors, resistors, inductors, transistors, diodes, and voltage and current sources (constant or time varying). Simulation features are discussed for dc, ac, and/or transient circuit analysis. Calculations are based on the model approach of formulating the circuit equations. A particular solution of transient analysis for nonlinear storage elements is described.

  15. Computer Language Efficiency via Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Andrea Ellero

    2011-01-01

    Full Text Available The selection of the computer language to adopt is usually driven by intuition and expertise, since it is very difficult to compare languages taking into account all their characteristics. In this paper, we analyze the efficiency of programming languages through Data Envelopment Analysis. We collected the input data from The Computer Language Benchmarks Game: we consider a large set of languages in terms of computational time, memory usage, and source code size. Various benchmark problems are tackled. We analyze the results, first of all, considering programming languages individually. Then, we evaluate families of them sharing some characteristics, for example, being compiled or interpreted.
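
    A minimal sketch of the DEA computation such a study performs, using the input-oriented CCR multiplier model solved with scipy.optimize.linprog; the three languages and their input figures are invented placeholders, and each language is treated as producing one unit of output (the solved benchmark).

```python
import numpy as np
from scipy.optimize import linprog

# hypothetical benchmark inputs per language: [cpu time, memory, code size]
inputs = {"C":      [1.0, 1.0, 3.0],
          "Java":   [1.8, 6.0, 2.5],
          "Python": [30.0, 4.0, 1.0]}
X = np.array(list(inputs.values()))
n, m = X.shape   # number of DMUs (languages) and inputs

for o, name in enumerate(inputs):
    # variables z = [u, v_1..v_m]; maximize output weight u  <=>  minimize -u
    c = np.r_[-1.0, np.zeros(m)]
    A_eq = np.r_[0.0, X[o]].reshape(1, -1)   # normalization: v . x_o = 1
    A_ub = np.c_[np.ones(n), -X]             # u - v . x_j <= 0 for every DMU j
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    print(f"{name}: efficiency = {-res.fun:.3f}")
```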

  16. The computer in shell stability analysis

    Science.gov (United States)

    Almroth, B. O.; Starnes, J. H., Jr.

    1975-01-01

    Some examples in which the high-speed computer has been used to improve the static stability analysis capability for general shells are examined. The fundamental concepts of static stability are reviewed with emphasis on the differences between linear bifurcation buckling and nonlinear collapse. The analysis is limited to the stability of conservative systems. Three examples are considered. The problem of cylinders subjected to bending loads is used as an example to illustrate that a simple structure can have a sufficiently complicated nonlinear behavior to require a computer analysis for accurate results. An analysis of the problems involved in the modeling of stiffening elements in plate and shell structures illustrates the necessity that the analyst recognizes all important deformation modes. The stability analysis of the Skylab structure indicates the size of problems that can be solved with current state-of-the-art capability.

  17. Computational structural analysis and finite element methods

    CERN Document Server

    Kaveh, A

    2014-01-01

    Graph theory gained initial prominence in science and engineering through its strong links with matrix algebra and computer science. Moreover, the structure of the mathematics is well suited to that of engineering problems in analysis and design. The methods of analysis in this book employ matrix algebra, graph theory and meta-heuristic algorithms, which are ideally suited for modern computational mechanics. Efficient methods are presented that lead to highly sparse and banded structural matrices. The main features of the book include: application of graph theory for efficient analysis; extension of the force method to finite element analysis; application of meta-heuristic algorithms to ordering and decomposition (sparse matrix technology); efficient use of symmetry and regularity in the force method; and simultaneous analysis and design of structures.

  18. Beyond Word Processing: Rhetorical Invention with Computers.

    Science.gov (United States)

    Strickland, James

    In the area of composition, computer assisted instruction (CAI) must move beyond the limited concerns of the current-traditional rhetoric to address the larger issues of writing, become process-centered, and involve active writing rather than answering multiple-choice questions. Researchers cite four major types of interactive CAI, the last of…

  19. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  20. CAI in New York City: Report on the First Year's Operations

    Science.gov (United States)

    Butler, Cornelius F.

    1969-01-01

    "The nation's largest CAI operation in a public school system concluded its first full year of operation in June, 1969. The results indicate a very definite success for education's most closely watched use of technology. Three major criteria for success of such a project are 1) acceptance of CAI by the schools and their pupils, 2) per pupil costs…

  1. Structural basis of Na+-independent and cooperative substrate/product antiport in CaiT

    NARCIS (Netherlands)

    Schulze, Sabrina; Köster, Stefan; Geldmacher, Ulrike; Terwisscha van Scheltinga, Anke C.; Kühlbrandt, Werner

    2010-01-01

    Transport of solutes across biological membranes is performed by specialized secondary transport proteins in the lipid bilayer, and is essential for life. Here we report the structures of the sodium-independent carnitine/butyrobetaine antiporter CaiT from Proteus mirabilis (PmCaiT) at 2.3-Å and from

  2. Brief Introduction to the Foundation of CAI Shidong Award for Plasma Physics

    Institute of Scientific and Technical Information of China (English)

    SHENG Zhengming

    2010-01-01

    The late Academician Professor CAI Shidong was an outstanding plasma physicist who made seminal contributions to both fundamental plasma theories and controlled thermonuclear fusion energy research. Professor CAI was also one of the pioneers of China's plasma physics research. In 1973, Professor CAI decided to leave the U.S. and return to China in order to help push forward plasma physics research in China. He formed a research group consisting of young scientists and carried out high-level work in this important physics discipline. He worked tirelessly, set an example by his own deeds, and made outstanding contributions to plasma physics research, to educating younger generations of plasma physicists, and to establishing collaborations with plasma scientists in other Asian and African developing nations. In short, Professor CAI devoted the best years of his life to China's plasma physics research.

  3. Risk analysis enhancement via computer applications

    International Nuclear Information System (INIS)

    Since the development of Reliability Centered Maintenance (RCM) by the airline industry, there have been various alternative approaches to applying this methodology to the nuclear power industry. Some of the alternatives were developed in order to shift the focus of analyses onto plant-specific concerns, but the great majority of alternatives were developed in an attempt to reduce the effort required to conduct an RCM analysis on as large a scale as a nuclear power station. Computer applications have not only reduced the amount of analysis time but have also produced more consistent results, provided an effective working RCM analysis tool, and made it possible to automate a Living Program. During the development of an RCM program at South Carolina Electric and Gas' V.C. Summer Nuclear Station (VCSNS), computer applications were developed. 6 figs, 1 tab

  4. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study in which a computer graphics application is examined with the intention of speeding up its execution. The application is specified as a C program and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based on information obtained from software profiling, and the resulting design is validated through cosimulation. The achieved speed-up is estimated based on an analysis of profiling information from different sets of input data and various architectural options.

  5. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, and generator base power settings.

  6. The impact of computer-based interactive instruction (CBII) in improving the teaching-learning process in introductory college physics

    Science.gov (United States)

    Jawad, Afif A.

    Institutes are incorporating computer-assisted instruction (CAI) into their classrooms in an effort to enhance learning. The implementation of computers into the classroom parallels education's role of keeping abreast of societal demands. The number of microcomputers in schools has increased tremendously. Computer-based interactive instruction (CBII) software is available for the language arts, mathematics, science, social studies, etc. Traditional instruction supplemented with CAI seems to be more effective than traditional instruction alone. Although there is a large quantity of research regarding specific aspects of learning through computers, there seems to be a lack of information regarding the impact of computers upon student success. The goal of this study is to determine how much CAI is implemented in higher education in the USA. Instructors from 38 states were surveyed to compare the institutes that use computer-based interactive instruction with the ones that do not and still apply traditional delivery methods. Based on the analysis of the data gathered during this study, it is concluded that the majority of instructors now use computers in one form or another. This study has determined that the computer is a major component in the teaching of introductory physics and, therefore, may be a suitable substitute for the traditional delivery system. Computers as an instructional delivery system are an alternative that may result in a higher level of student learning for many higher education courses.

  7. Probabilistic structural analysis computer code (NESSUS)

    Science.gov (United States)

    Shiao, Michael C.

    1988-01-01

    Probabilistic structural analysis has been developed to analyze the effects of fluctuating loads, variable material properties, and uncertain analytical models, especially for high-performance structures such as SSME turbopump blades. The computer code NESSUS (Numerical Evaluation of Stochastic Structure Under Stress) was developed to serve as a primary computational tool for characterizing the probabilistic structural response to stochastic environments by statistical description. The code consists of three major modules: NESSUS/PRE, NESSUS/FEM, and NESSUS/FPI. NESSUS/PRE is a preprocessor which decomposes the spatially correlated random variables into a set of uncorrelated random variables using a modal analysis method. NESSUS/FEM is a finite element module which provides structural sensitivities to all the random variables considered. NESSUS/FPI implements the Fast Probability Integration method, by which a cumulative distribution function or a probability density function is calculated.

  8. Computed tomographic analysis of urinary calculi

    Energy Technology Data Exchange (ETDEWEB)

    Newhouse, J.H.; Prien, E.L.; Amis, E.S. Jr.; Dretler, S.P.; Pfister, R.C.

    1984-03-01

    Excised urinary calculi were subjected to computed tomographic (CT) scanning in an attempt to determine whether CT attenuation values would allow accurate analysis of stone composition. The mean, maximum, and modal pixel densities of the calculi were recorded and compared; the resulting values reflected considerable heterogeneity in stone density. Although uric acid and cystine calculi could be identified by their discrete ranges on one or more of these criteria, calcium-containing stones of various compositions, including struvite, could not be distinguished reliably. CT analysis of stone density is not likely to be more accurate than standard radiography in characterizing stone composition in vivo.
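
    The region-of-interest statistics the study relies on (mean, standard deviation, minimum/maximum, and modal pixel value) are straightforward to compute; the sketch below does so on a synthetic slice with NumPy, where the image, ROI, and density values are illustrative only and not from the study.

```python
import numpy as np

def roi_stats(ct_slice: np.ndarray, mask: np.ndarray) -> dict:
    """Mean, spread, extremes, and modal pixel value inside a region of interest."""
    px = ct_slice[mask]
    hist, edges = np.histogram(px, bins=50)
    mode = 0.5 * (edges[hist.argmax()] + edges[hist.argmax() + 1])
    return {"mean": px.mean(), "std": px.std(),
            "min": px.min(), "max": px.max(), "mode": mode}

# toy slice: water-density background with one dense, heterogeneous inclusion
rng = np.random.default_rng(0)
img = rng.normal(0.0, 10.0, (200, 200))
yy, xx = np.ogrid[:200, :200]
mask = (yy - 100) ** 2 + (xx - 100) ** 2 < 15 ** 2
img[mask] += rng.normal(600.0, 120.0, mask.sum())
print(roi_stats(img, mask))
```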

  9. How CAI Can Correctly Play a Role in Teaching

    Institute of Scientific and Technical Information of China (English)

    苏醒

    2011-01-01

    CAI (computer-assisted instruction) refers to using computers and computer technology to transmit information in teaching activities, in order to complete teaching tasks and achieve educational purposes. CAI can integrate animation, sound, text, and other media, which is not only helpful for teaching but also helps students form new ideas, new concepts, and new methods in the learning process; it is a powerful tool and form for developing students' potential, intelligence, and ability. However, many problems have appeared in actual use. The following is my view, based on personal teaching experience, of how large a role CAI actually plays in classroom teaching.

  10. Classification and Analysis of Computer Network Traffic

    DEFF Research Database (Denmark)

    Bujlow, Tomasz

    2014-01-01

    Traffic monitoring and analysis can be done for multiple different reasons: to investigate the usage of network resources, assess the performance of network applications, adjust Quality of Service (QoS) policies in the network, log the traffic to comply with the law, or create realistic traffic profiles of selected applications, which can serve as training data for MLAs. We assessed the usefulness of the C5.0 Machine Learning Algorithm (MLA) in the classification of computer network traffic. We showed that the application-layer payload is not needed to train the C5.0 classifier, and we compared various classification modes (decision trees, rulesets, boosting, softening thresholds) regarding the classification accuracy and the time required to create the classifier. We also showed how to use our VBS tool to obtain per-flow, per-application, and per-content statistics of traffic in computer networks.

  11. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works and how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author's website and SpringerLink.

  12. The use of CAI courseware in veterinary parasitology teaching

    Institute of Scientific and Technical Information of China (English)

    王建民; 姚龙泉; 刘明春; 何剑斌; 葛云侠

    2012-01-01

    Materials were collected through a variety of channels to prepare a computer-assisted instruction (CAI) courseware on veterinary parasitology suited to undergraduate students of animal medicine, turning originally dry lectures into vivid, visual courses. The courseware lays a good foundation for the students' future diagnosis and classification of parasitic diseases.

  13. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods, and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing, and mathematics. This study presents some of the latest results, practices, and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering, and classification in the context of the social sciences. It also covers various real-life examples such as t…

  14. FORTRAN computer program for seismic risk analysis

    Science.gov (United States)

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
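
    A compact sketch of the hazard integral such a program evaluates, with a truncated Gutenberg-Richter magnitude distribution and a generic lognormal attenuation function; all coefficients below are invented for illustration and are not those of the program.

```python
import numpy as np
from scipy.stats import norm

# single hypothetical source at fixed distance; all parameters are invented
nu, beta, m_min, m_max, r_km = 0.2, 2.0, 5.0, 8.0, 30.0

def ln_pga_median(m, r):
    """Toy attenuation function: median ln(PGA in g) vs magnitude and distance."""
    return -3.5 + 0.8 * m - 1.1 * np.log(r + 10.0)

def annual_exceedance(a_g, sigma=0.6, n=2000):
    """Rate of PGA > a_g: nu * integral of P(exceed | m) * f(m) dm."""
    m = np.linspace(m_min, m_max, n)
    f_m = beta * np.exp(-beta * (m - m_min)) / (1 - np.exp(-beta * (m_max - m_min)))
    p_exc = norm.sf((np.log(a_g) - ln_pga_median(m, r_km)) / sigma)
    return nu * np.trapz(p_exc * f_m, m)

for a in (0.05, 0.10, 0.20):
    lam = annual_exceedance(a)
    print(f"PGA > {a:.2f} g: {lam:.4f}/yr, 50-yr prob = {1 - np.exp(-50 * lam):.3f}")
```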

  15. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, and (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any such system to derive the basic equations for the stochastic perturbation technique, and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where the interplay of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
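
    Using the freely available SymPy in place of MAPLE, the second-order stochastic perturbation technique mentioned above can be derived symbolically in a few lines; the response function u = 1/b is a made-up example, not one from the paper.

```python
import sympy as sp

b, b0, sig = sp.symbols('b b0 sigma', positive=True)
u = 1 / b                     # response, e.g. displacement ~ 1/stiffness

# second-order stochastic perturbation of E[u] about the mean b0,
# and first-order approximation of Var[u]
du = sp.diff(u, b).subs(b, b0)
d2u = sp.diff(u, b, 2).subs(b, b0)
E_u = u.subs(b, b0) + sp.Rational(1, 2) * d2u * sig**2
Var_u = du**2 * sig**2

print(sp.simplify(E_u))       # 1/b0 + sigma**2/b0**3
print(sp.simplify(Var_u))     # sigma**2/b0**4
```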

  16. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under the coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. The summer study was directed toward Network Steering Committee operation and planning, plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  17. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. A computer virus model is established. Through analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
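
    A sketch of a compartment model of the kind the paper analyzes, with inflow of new computers, retirement of old ones, and cure via antivirus software, integrated with scipy.integrate.solve_ivp; the equations and parameters below are an assumed SIS-type illustration, not the paper's exact model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# assumed parameters: b new computers join, mu are retired,
# beta is the infection rate, gamma the cure rate via antivirus software
b, mu, beta, gamma = 2.0, 0.01, 0.0005, 0.1

def virus_model(t, y):
    S, I = y
    dS = b - beta * S * I + gamma * I - mu * S   # cured machines rejoin S
    dI = beta * S * I - gamma * I - mu * I
    return [dS, dI]

sol = solve_ivp(virus_model, (0, 365), [500.0, 1.0], dense_output=True)
for ti in np.linspace(0, 365, 6):
    S, I = sol.sol(ti)
    print(f"day {ti:5.0f}: susceptible = {S:7.1f}, infected = {I:7.1f}")
```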

  18. CAD/CAM/CAI Application for High-Precision Machining of Internal Combustion Engine Pistons

    Directory of Open Access Journals (Sweden)

    V. V. Postnov

    2014-07-01

    Full Text Available CAD/CAM/CAI application solutions for the machining of internal combustion engine pistons were analyzed. A low-volume production technology for internal combustion engine pistons was proposed. A fixture for a CNC turning center was designed.

  19. Experimental analysis of computer system dependability

    Science.gov (United States)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: the design phase, the prototype phase, and the operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
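
    As a toy example of the simulated fault injection approach reviewed here, the sketch below flips a random bit in the IEEE-754 representation of an operand and measures how often the fault visibly corrupts a simple computation; this is a generic illustration, unrelated to the FIAT/FERARI/FINE tools named above.

```python
import random
import struct

def flip_bit(x: float, bit: int) -> float:
    """Flip one bit in the IEEE-754 double representation of x."""
    (bits,) = struct.unpack('<Q', struct.pack('<d', x))
    (faulty,) = struct.unpack('<d', struct.pack('<Q', bits ^ (1 << bit)))
    return faulty

def manifestation_rate(trials: int = 10_000, tol: float = 1e-6) -> float:
    """Fraction of random single-bit operand faults that corrupt y = 3x."""
    failures = 0
    for _ in range(trials):
        x = random.uniform(1.0, 2.0)
        y = flip_bit(x, random.randrange(64)) * 3.0
        # NaN-safe check: count as failure unless provably within tolerance
        if not (abs(y - 3.0 * x) <= tol):
            failures += 1
    return failures / trials

random.seed(1)
print(f"fault manifestation rate: {manifestation_rate():.1%}")
```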

  20. Good relationships between computational image analysis and radiological physics

    International Nuclear Information System (INIS)

    Good relationships between computational image analysis and radiological physics have been constructed to increase the accuracy of medical diagnostic imaging and radiation therapy. Computational image analysis has been established on the basis of applied mathematics, physics, and engineering. This review paper introduces how computational image analysis is useful in radiation therapy with respect to radiological physics.

  1. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, while the KENO-IV code gives conservative results when the generalized geometry option is not used. (author)

  2. Design of CAI Courseware Based on a Virtual Reality Mechanism

    Institute of Scientific and Technical Information of China (English)

    管群

    2001-01-01

    In this paper, the application features and significance of VR technology in the educational field are summarized. In particular, the design mechanism of CAI courseware for individualized instruction is studied, and a virtual reality mechanism is used to realize a learning-while-doing environment for the user in CAI courseware for computer application majors. The design theory, the technical approach, some of the algorithm flowcharts, and the operation-exercise interface are given.

  3. Formation of refractory metal nuggets and their link to the history of CAIs

    Science.gov (United States)

    Schwander, D.; Kööp, L.; Berg, T.; Schönhense, G.; Heck, P. R.; Davis, A. M.; Ott, U.

    2015-11-01

    Ca, Al-rich inclusions (CAIs) often contain numerous refractory metal nuggets (RMNs), consisting of elements like Os, Ir, Mo, Pt and Ru. The nuggets are usually thought to have formed by equilibrium condensation from a gas of solar composition, simultaneously with or prior to oxide and silicate minerals. However, the exact mechanisms responsible for their extremely variable compositions, small sizes and associations with CAI minerals remain puzzling. Expanding on previous work on chemically separated RMNs, we have studied a large number of RMNs within their host CAIs from three different meteorite types, i.e., the highly primitive chondrite Acfer 094 (C2-ungrouped), Allende (CV3ox) and Murchison (CM2). Our results show several inconsistencies between the observed features and a direct condensation origin, including a lack of correlated abundance variations in the refractory metals that are expected from variations in condensation temperature. Instead, we show that most RMN features are consistent with RMN formation by precipitation from a CAI liquid enriched in refractory metals. This scenario is additionally supported by the common occurrence of RMNs in CAIs with clear melt crystallization textures as well as the occurrence of synthetic RMNs with highly variable compositions in run products from Schwander et al. (2015). In some cases, the sizes of meteoritic RMNs correlate with the sizes of their host minerals in CAIs, which indicates common cooling rates.

  4. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  5. Review of Computational Stirling Analysis Methods

    Science.gov (United States)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, they are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of understanding of Stirling losses may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-FI technique, is presented in detail.

  6. Computational Models for Analysis of Illicit Activities

    DEFF Research Database (Denmark)

    Nizamani, Sarwat

    Numerous illicit activities happen in our society which, from time to time, affect the population by harming individuals directly or indirectly. Researchers from different disciplines have contributed to developing strategies to analyze such activities, in order to help law enforcement agents devise policies to minimize them. These activities include cybercrimes, terrorist attacks, or violent actions in response to certain world issues. Besides such activities, there are several other related activities worth analyzing, for which computational models have been presented in this thesis, including models of a population globally sensitive to specific world issues; the models discuss the dynamics of the population in response to such issues. All the models presented in the thesis can be combined for a systematic analysis of illicit activities.

  7. Computed tomographic analysis of renal calculi

    Energy Technology Data Exchange (ETDEWEB)

    Hillman, B.J.; Drach, G.W.; Tracey, P.; Gaines, J.A.

    1984-03-01

    An in vitro study sought to determine the feasibility of using computed tomography (CT) to analyze the chemical composition of renal calculi and thus aid in selecting the best treatment method. Sixty-three coded calculi were scanned in a water bath. Region-of-interest measurements provided the mean, standard deviation, and minimum and maximum pixel values for each stone. These parameters were correlated with aspects of the stones' chemical composition. A multivariate analysis showed that the mean and standard deviation of the stones' pixel values were the best CT parameters for differentiating types of renal calculi. By using computerized mapping techniques, uric acid calculi could be perfectly differentiated from struvite and calcium oxalate calculi. The latter two types also were differentiable, but to a lesser extent. CT has a potential role as an adjunct to clinical and laboratory methods for determining the chemical composition of renal calculi in an effort to select optimal treatment.

  8. Computational based functional analysis of Bacillus phytases.

    Science.gov (United States)

    Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti

    2016-02-01

    Phytase is an enzyme which catalyzes the total hydrolysis of phytate to less phosphorylated myo-inositol derivatives and inorganic phosphate; it digests the indigestible phytate fraction present in seeds and grains and therefore provides digestible phosphorus, calcium, and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals to increase the bioavailability of phytic acid-bound phosphate, ultimately enhancing the nutritional value of diets. Bacillus phytase is well suited for use in animal feed because of its optimum pH and excellent thermal stability. The present study aims to perform an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens, exploring their physico-chemical properties using various bio-computational tools. All the proteins are acidic and thermostable and can be used as suitable candidates in the feed industry. PMID:26672917

  9. The Anatomy and Bulk Composition of CAI Rims in the Vigarano (CV3) Chondrite

    Science.gov (United States)

    Ruzicka, A.; Boynton, W. V.

    1993-07-01

    A striking feature of Ca,Al-rich inclusions (CAIs) in chondrites is the presence of mineralogical layers that typically form rim sequences up to 50 micrometers thick [1]. Many ideas regarding the origin of CAI rims have been proposed, but none are entirely satisfactory. The detailed mineralogy and bulk compositions of relatively unaltered CAI rims in the Vigarano (CV3) chondrite described here provide constraints on hypotheses of rim formation. Rim Mineralogy: CAIs in Vigarano consist of melilite (mel)- and spinel (sp)-rich varieties, both of which are rimmed [2]. Around mel-rich objects, the layer sequence is: CAI interior --> sp-rich layer (sometimes absent) --> mel/anorthite (anor) layer --> Ti-Al-rich clinopyroxene (Tpx) layer --> Al-diopside (Al-diop) layer --> olivine (ol) +/- Al-diop layer --> host matrix. The sequence around sp-rich objects differs from this in that the mel/anor layer is absent. Both the sp-rich layer around mel-cored CAIs and the cores of sp-rich CAIs in Vigarano are largely comprised of a fine-grained (…) assemblage. The mel/anor layer is sometimes monomineralic, consisting of mel alone, or bimineralic, consisting of both mel and anor. Where bimineralic, anor typically occurs in the outer part of the layer. In places, anor (An99-100) has partially altered to nepheline and voids. Rim mel is systematically less gehlenitic than mel in the CAI interiors, especially compared to mel in the interior adjacent to the rims. The Tpx layer (>2 and up to 15 wt% TiO2) and Al-diop layer (…) a sequence (sp + fo --> sp + fo + anor or mel or Tpx) that does not correspond to observed rim sequences. It thus appears that (1) the rim region did not form through crystallization of molten CAIs; and (2) rim layers did not originate solely by the crystallization of a melt layer present on a solid CAI core [4,5]. References: [1] Wark D. A. and Lovering J. F. (1977) Proc. LSC 8th, 95-112. [2] Ruzicka A. and Boynton W. V. (1991) Meteoritics, 26, 390-391. [3] Stolper E. (1982) GCA, 46, 2159…

  10. CAI and CAL in the Teaching Reform of Electromagnetism

    Institute of Scientific and Technical Information of China (English)

    骆斌

    2001-01-01

    This paper expounds the importance of computer-assisted instruction and learning (CAI/CAL) in teaching reform, discusses the basic functions of CAI/CAL and their main applications in the reform of electromagnetism teaching, and argues that in the 21st century the introduction of CAI/CAL into teaching is an important means of realizing the modernization of education.

  11. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
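
    A minimal discrete event simulation in the same spirit, using the SimPy library: Poisson request arrivals compete for a fixed pool of virtualized servers, and end-to-end latency is collected. The arrival rate, service time, and capacity are invented placeholders, not the authors' model.

```python
import random
import simpy

RNG = random.Random(42)
LATENCIES = []

def request(env, servers, service_mean):
    """One service request: queue for a virtualized server, then hold it."""
    arrive = env.now
    with servers.request() as slot:
        yield slot
        yield env.timeout(RNG.expovariate(1.0 / service_mean))
    LATENCIES.append(env.now - arrive)

def workload(env, servers, rate, service_mean):
    """Poisson arrival process standing in for enterprise demand."""
    while True:
        yield env.timeout(RNG.expovariate(rate))
        env.process(request(env, servers, service_mean))

env = simpy.Environment()
servers = simpy.Resource(env, capacity=4)   # provisioned (virtual) server pool
env.process(workload(env, servers, rate=3.0, service_mean=1.0))
env.run(until=1000)
print(f"served {len(LATENCIES)} requests, "
      f"mean latency {sum(LATENCIES) / len(LATENCIES):.2f} time units")
```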

  12. Incremental ALARA cost/benefit computer analysis

    International Nuclear Information System (INIS)

    Commonwealth Edison Company has developed and is testing an enhanced FORTRAN computer program to be used for cost/benefit analysis of radiation reduction projects at its six nuclear power facilities and corporate technical support groups. This paper describes a macro-driven IBM mainframe program comprised of two different types of analyses: an Abbreviated Program with fixed costs and base values, and an extended Engineering Version for a detailed, more thorough, and time-consuming approach. The extended Engineering Version breaks radiation exposure costs down into two components: health-related costs and replacement labor costs. According to user input, the program automatically adjusts these two cost components and applies the derived values to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment cost. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations.

  13. Computer Assisted Laboratory Instructions: Learning Outcomes Analysis

    OpenAIRE

    Abdulrasool, Salah Mahdi; Mishra, Rakesh

    2006-01-01

    For this, students in the mechanical engineering subject area were exposed to computer-assisted instructions to satisfy the following learning outcomes in the computer aided design/computer aided manufacturing module: (i) creation of drawing and design using computer aided design; (ii) using the data exchange format (DXF) to create a numerical control file; (iii) final setup check of the computerised numerical control machine; (iv) final manufacturing of the product using CNC; (v) quality evaluation. The t…

  14. Can cloud computing benefit health services? - a SWOT analysis.

    Science.gov (United States)

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare. PMID:21893777

  16. A computational design system for rapid CFD analysis

    Science.gov (United States)

    Ascoli, E. P.; Barson, S. L.; Decroix, M. E.; Sindir, Munir M.

    1992-01-01

    A computational design system (CDS) is described in which CFD analysis tools are integrated in a modular fashion. This CDS ties together four key areas of computational analysis: description of geometry, grid generation, computational codes, and postprocessing. Integration of improved computational fluid dynamics (CFD) analysis tools through the CDS has made a significant positive impact on the use of CFD for engineering design problems. Complex geometries are now analyzed on a frequent basis and with greater ease.

  17. Schlieren sequence analysis using computer vision

    Science.gov (United States)

    Smith, Nathanial Timothy

    Computer vision-based methods are proposed for extraction and measurement of flow structures of interest in schlieren video. As schlieren data has increased with faster frame rates, we are faced with thousands of images to analyze. This presents an opportunity to study global flow structures over time that may not be evident from surface measurements. A degree of automation is desirable to extract flow structures and features to give information on their behavior through the sequence. Using an interdisciplinary approach, the analysis of large schlieren data is recast as a computer vision problem. The double-cone schlieren sequence is used as a testbed for the methodology; it is unique in that it contains 5,000 images, complex phenomena, and is feature rich. Oblique structures such as shock waves and shear layers are common in schlieren images. A vision-based methodology is used to provide an estimate of oblique structure angles through the unsteady sequence. The methodology has been applied to a complex flowfield with multiple shocks. A converged detection success rate between 94% and 97% for these structures is obtained. The modified curvature scale space is used to define features at salient points on shock contours. A challenge in developing methods for feature extraction in schlieren images is the reconciliation of existing techniques with features of interest to an aerodynamicist. Domain-specific knowledge of physics must therefore be incorporated into the definition and detection phases. Known location and physically possible structure representations form a knowledge base that provides a unique feature definition and extraction. Model tip location and the motion of a shock intersection across several thousand frames are identified, localized, and tracked. Images are parsed into physically meaningful labels using segmentation. Using this representation, it is shown that in the double-cone flowfield, the dominant unsteady motion is associated with large scale
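
    The dissertation's own feature definitions are domain-specific, but the basic extraction of oblique structure angles can be sketched with a Canny edge detector plus Hough transform in OpenCV; this is a generic computer vision illustration, not the dissertation's method.

```python
import cv2
import numpy as np

def oblique_structure_angles(frame_gray: np.ndarray) -> list:
    """Angles (degrees from horizontal) of strong straight edges in one frame."""
    edges = cv2.Canny(frame_gray, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)
    if lines is None:
        return []
    # HoughLines returns (rho, theta), where theta is the line-normal angle,
    # so the line itself makes (90 - theta) degrees with the horizontal
    return sorted({round(90.0 - np.degrees(t) % 180.0, 1) for _, t in lines[:, 0]})

# synthetic test frame: one dark oblique line (~18 degrees) on a light background
img = np.full((200, 300), 255, np.uint8)
cv2.line(img, (0, 150), (299, 50), 0, 2)
print(oblique_structure_angles(img))
```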

  18. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Science.gov (United States)

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable, and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
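
    For reference, a commonly cited textbook form of the micropolar Navier-Stokes system couples the linear velocity u, angular velocity w, and pressure p as below; the coefficient names (viscosities ν, ν_r, microinertia j, angular viscosities c_1, c_2) are schematic, and this is not necessarily the exact formulation used in the dissertation:

```latex
\begin{aligned}
 u_t + (u\cdot\nabla)u - (\nu + \nu_r)\,\Delta u + \nabla p &= 2\nu_r\,\nabla\times w + f, \\
 \nabla\cdot u &= 0, \\
 j\,\bigl(w_t + (u\cdot\nabla)w\bigr) - c_1\,\Delta w - c_2\,\nabla(\nabla\cdot w) + 4\nu_r\,w &= 2\nu_r\,\nabla\times u + g.
\end{aligned}
```

    The coupling terms 2ν_r ∇×w and 2ν_r ∇×u are what make decoupled time-stepping schemes, such as the one described in the abstract, nontrivial to stabilize.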

  19. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and its applications in handling real life problems. The applications are mostly undertaken from real life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  20. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  1. The Use of Modular Computer-Based Lessons in a Modification of the Classical Introductory Course in Organic Chemistry.

    Science.gov (United States)

    Stotter, Philip L.; Culp, George H.

    An experimental course in organic chemistry utilized computer-assisted instructional (CAI) techniques. The CAI lessons provided tutorial drill and practice and simulated experiments and reactions. The Conversational Language for Instruction and Computing was used, along with a CDC 6400-6600 system; students scheduled and completed the lessons at…

  2. Mineralogy and Petrology of EK-459-5-1, A Type B1 CAI from Allende

    Science.gov (United States)

    Jeffcoat, C. R.; Kerekgyarto, A. G.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2015-01-01

    Calcium-aluminum-rich inclusions (CAIs) are a type of coarse-grained clast composed of Ca-, Al-, and Mg-rich silicates and oxides found in chondrite meteorites. Type B CAIs are found exclusively in the CV chondrite meteorites and are the most thoroughly studied type of inclusion found in chondritic meteorites. Type B1 CAIs are distinguished by a nearly monomineralic rim of melilite that surrounds an interior predominantly composed of melilite, fassaite (Ti- and Al-rich clinopyroxene), anorthite, and spinel, with varying amounts of other minor primary and secondary phases. The formation of Type B CAIs has received considerable attention in the course of CAI research, yet quantitative models, experimental results, and observations from Type B inclusions remain largely in disagreement. Recent experimental results and quantitative models have shown that the formation of B1 mantles could have occurred by the evaporative loss of Si and Mg during the crystallization of these objects. However, comparative studies suggest that the lower bulk SiO2 compositions in B1s result in more melilite crystallization before the onset of fassaite and anorthite crystallization, leading to the formation of thick melilite-rich rims in B1 inclusions. Detailed petrographic and cosmochemical studies of these inclusions will further our understanding of these complex objects.

  3. Computational Analysis of Pharmacokinetic Behavior of Ampicillin

    Directory of Open Access Journals (Sweden)

    Mária Ďurišová

    2016-07-01

    Full Text Available The objective of this study was to perform a computational analysis of the pharmacokinetic behavior of ampicillin, using data from the literature. A method based on the theory of dynamic systems was used for modeling purposes. The method has been introduced to pharmacokinetics with the aim of contributing to the knowledge base of the field by enabling researchers to develop mathematical models of various pharmacokinetic processes in an identical way, using identical model structures. A few examples of successful use of this modeling method in pharmacokinetics can be found in full-text articles available free of charge at the author's website, and in the example given in this study. The modeling method employed in this study can be used to develop a mathematical model of the pharmacokinetic behavior of any drug, on the condition that the pharmacokinetic behavior of the drug under study can be at least partially approximated using linear models.
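
    The dynamic-systems method itself is not specified in this record. As a generic illustration of the kind of linear pharmacokinetic modeling the abstract alludes to, the sketch below fits a one-compartment model with first-order elimination to concentration-time data; all numbers are invented for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

def one_compartment(t, c0, ke):
    """Concentration after an IV bolus: C(t) = C0 * exp(-ke * t)."""
    return c0 * np.exp(-ke * t)

# Hypothetical concentration-time observations (hours, mg/L) -- illustrative only
t_obs = np.array([0.5, 1.0, 2.0, 4.0, 6.0])
c_obs = np.array([18.0, 14.5, 9.6, 4.2, 1.9])

(c0, ke), _ = curve_fit(one_compartment, t_obs, c_obs, p0=(20.0, 0.3))
half_life = np.log(2) / ke
print(f"C0 = {c0:.1f} mg/L, ke = {ke:.3f} 1/h, t1/2 = {half_life:.2f} h")
```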

  4. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    Full Text Available A prominent feature of Parkinson's disease (PD) is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment of PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.
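
    Biochemical systems theory, which the abstract cites as its modeling framework, typically represents each metabolite pool with power-law production and degradation terms. In generic S-system form (the specific equations of the dopamine model are not given in the record):

```latex
\frac{dX_i}{dt} \;=\; \alpha_i \prod_{j=1}^{n} X_j^{\,g_{ij}} \;-\; \beta_i \prod_{j=1}^{n} X_j^{\,h_{ij}}, \qquad i = 1,\dots,n,
```

    where the X_j are concentrations, α_i and β_i are rate constants, and g_ij and h_ij are kinetic orders estimated from data or expert knowledge.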

  5. [Computational genome analysis of three marine algoviruses].

    Science.gov (United States)

    Stepanova, O A; Boĭko, A L; Shcherbatenko, I S

    2013-01-01

    Computational analysis of the genomic sequences of three new marine algoviruses, Tetraselmis viridis virus (strains TvV-S20 and TvV-SI1) and Dunaliella viridis virus (strain DvV-SI2), was conducted. Both considerable similarity and essential distinctions between the studied strains and the best-studied marine algoviruses of the family Phycodnaviridae were revealed. Our data show that the tested strains are new viruses with the following features: only they were isolated from the marine eukaryotic microalgae T. viridis and D. viridis; the coding sequences (CDSs) of their genomes are localized mainly on one of the DNA strands and form several clusters with short intergenic spaces; there are considerable variations in genome structure within viruses and their strains; the viral genomic DNA has a high GC-content (55.5-67.4%); their genes contain no well-known optimal contexts for translation start codons, nor contexts favoring read-through of terminal codons; and the vast majority of viral genes and proteins have no matches in gene banks. PMID:24479317

  6. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  7. Simplified computer codes for cask impact analysis

    International Nuclear Information System (INIS)

    In regard to the evaluation of the acceleration and deformation of casks, simplified computer codes make analyses economical and reduce input preparation and calculation time. The results obtained with the simplified codes are sufficiently accurate for practical use. (J.P.N.)

  8. Transonic wing analysis using advanced computational methods

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  9. A Comparative Analysis of Computer Literacy Education for Nurses

    OpenAIRE

    Hardin, Richard C.; Skiba, Diane J.

    1982-01-01

    Despite recent advances by nursing in the computer field, computer literacy is a rarity among nursing professionals. Our analysis of existing educational models in nursing (baccalaureate, staff development, continuing education, and vendor) shows that no single educational strategy is likely to be effective in achieving computer literacy for all nurses. A refinement of the computer literacy concept is proposed which divides the educational needs of nurses into specific objectives based on desi...

  10. Introduction to numerical analysis and scientific computing

    CERN Document Server

    Nassif, Nabil

    2013-01-01

    Computer Number Systems and Floating Point Arithmetic: Introduction; Conversion from Base 10 to Base 2; Conversion from Base 2 to Base 10; Normalized Floating Point Systems; Floating Point Operations; Computing in a Floating Point System. Finding Roots of Real Single-Valued Functions: Introduction; How to Locate the Roots of a Function; The Bisection Method; Newton's Method; The Secant Method. Solving Systems of Linear Equations by Gaussian Elimination: Mathematical Preliminaries; Computer Storage for Matrices; Data Structures; Back Substitution for Upper Triangular Systems; Gauss Reduction; LU Decomposition; Polynomia

  11. Development and Evaluation of Computer Assisted Instruction for Navy Electronics Training. Two, Inductance.

    Science.gov (United States)

    Hurlock, Richard E.

    A computer-assisted instruction (CAI) curriculum module covering the area of electrical inductance was developed and evaluated. This module was a part of a program in which a series of CAI modules are being developed and tested for a Navy training course in basic electronics. After the module was written, it was given three tryout tests.…

  12. A Comparison of Computer-Assisted Instruction and Tutorials in Hematology and Oncology.

    Science.gov (United States)

    Garrett, T. J.; And Others

    1987-01-01

    A study comparing the effectiveness of computer-assisted instruction (CAI) and small group instruction found no significant difference in medical student achievement in oncology but higher achievement through small-group instruction in hematology. Students did not view CAI as more effective, but saw it as a supplement to traditional methods. (MSE)

  13. Secondary School Students' Attitudes towards Mathematics Computer--Assisted Instruction Environment in Kenya

    Science.gov (United States)

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.

    2012-01-01

    This paper reports the results of research conducted in six classes (Form IV) with 205 students with a sample of 94 respondents. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) effectiveness of CAI in Mathematics instruction. The results indicated…

  14. The Effects of Trait Anxiety and Dogmatism on State Anxiety During Computer-Assisted Learning.

    Science.gov (United States)

    Rappaport, Edward

    In this study of the interaction between anxiety trait (A-trait), anxiety state (A-state), and dogmatism in computer-assisted instruction (CAI), subjects were selected on the basis of extreme scores on a measure of anxiety and on a measure of dogmatism. The subjects were presented with a CAI task consisting of difficult mathematical problems. The…

  15. Benefits of Computer-Assisted Instruction to Support Reading Acquisition in English Language Learners

    Science.gov (United States)

    Macaruso, Paul; Rodman, Alyson

    2011-01-01

    Young children who are English language learners (ELLs) face major challenges in learning to read English. This study examined whether computer-assisted instruction (CAI) can be beneficial to ELL kindergartners enrolled in bilingual classes. The CAI programs provided systematic and structured exercises in developing phonological awareness and…

  16. An Evaluation of Computer-Aided Instruction in an Introductory Biostatistics Course.

    Science.gov (United States)

    Forsythe, Alan B.; Freed, James R.

    1979-01-01

    Evaluates the effectiveness of computer assisted instruction for teaching biostatistics to first year students at the UCLA School of Dentistry. Results do not demonstrate the superiority of CAI but do suggest that CAI compares favorably to conventional lecture and programed instruction methods. (RAO)

  17. Critical Thinking Outcomes of Computer-Assisted Instruction versus Written Nursing Process.

    Science.gov (United States)

    Saucier, Bonnie L.; Stevens, Kathleen R.; Williams, Gail B.

    2000-01-01

    Nursing students (n=43) who used clinical case studies via computer-assisted instruction (CAI) were compared with 37 who used the written nursing process (WNP). California Critical Thinking Skills Test results did not show significant increases in critical thinking. The WNP method was more time consuming; the CAI group was more satisfied. Use of…

  18. ANACROM - A computer code for chromatogram analysis

    International Nuclear Information System (INIS)

    The computer code was developed for the automatic detection of peaks and the evaluation of chromatogram parameters such as center, height, area, full width at half maximum (FWHM), and the ratio FWHM/center for each peak. (Author)
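
    The ANACROM algorithms themselves are not given in the record, but the same peak parameters can be computed with standard signal-processing utilities. The sketch below uses SciPy on a synthetic chromatogram (the data and thresholds are assumptions):

```python
import numpy as np
from scipy.signal import find_peaks, peak_widths

# Synthetic chromatogram: two Gaussian peaks on a flat baseline (illustrative only)
x = np.linspace(0, 100, 2000)
y = 50 * np.exp(-((x - 30) / 2.0) ** 2) + 20 * np.exp(-((x - 70) / 3.0) ** 2)
dx = x[1] - x[0]

peaks, _ = find_peaks(y, height=5.0)
widths, _, lefts, rights = peak_widths(y, peaks, rel_height=0.5)  # width at half height

for i, p in enumerate(peaks):
    center, height = x[p], y[p]
    fwhm = widths[i] * dx
    left, right = int(lefts[i]), int(rights[i])
    # Crude area estimate between the half-height crossings
    area = float(y[left:right + 1].sum() * dx)
    print(f"center={center:.2f} height={height:.1f} area~{area:.1f} "
          f"FWHM={fwhm:.2f} FWHM/center={fwhm / center:.4f}")
```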

  19. Behavior computing modeling, analysis, mining and decision

    CERN Document Server

    2012-01-01

    Includes six case studies on behavior applications; presents new techniques for capturing behavior characteristics in social media; the first dedicated source of references for the theory and applications of behavior informatics and behavior computing.

  20. Schottky signal analysis: tune and chromaticity computation

    CERN Document Server

    Chanon, Ondine

    2016-01-01

    Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, after explaining the different components of a Schottky signal, an algorithm to compute the betatron tune is presented, followed by some ideas to compute machine chromaticity. The tests have been performed with offline and/or online LHC data.
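
    The report's actual algorithm operates on Schottky sidebands and is not reproduced in this record. As a toy illustration of the idea, the fractional betatron tune can be read off as the location of the dominant line in the spectrum of a sampled signal; the signal below is synthetic and the tune value is assumed.

```python
import numpy as np

# Synthetic turn-by-turn signal oscillating at a fractional tune of 0.31 (assumed)
n_turns = 4096
true_tune = 0.31
turns = np.arange(n_turns)
signal = np.cos(2 * np.pi * true_tune * turns) + 0.5 * np.random.randn(n_turns)

# Spectrum over tune values in [0, 0.5]; the betatron line is the strongest peak
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
tunes = np.fft.rfftfreq(n_turns, d=1.0)  # frequencies in units of 1/turn
estimated_tune = tunes[np.argmax(spectrum)]
print(f"estimated tune = {estimated_tune:.4f}")
```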

  1. Stable Magnesium Isotope Variation in Melilite Mantle of Allende Type B1 CAI EK 459-5-1

    Science.gov (United States)

    Kerekgyarto, A. G.; Jeffcoat, C. R.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2014-01-01

    Ca-Al-rich inclusions (CAIs) are the earliest formed crystalline material in our solar system and they record early Solar System processes. Here we present petrographic and delta Mg-25 data of melilite mantles in a Type B1 CAI that records early solar nebular processes.

  2. Revision of the Oriental leafhopper genus Destinoides Cai & He (Hemiptera: Cicadellidae: Ledrinae), with a new synonym and two new combinations.

    Science.gov (United States)

    Sun, Jing; Webb, Michael D; Zhang, Yalin

    2014-01-01

    The leafhopper genus Destinoides Cai & He is revised to include two species D. latifrons (Walker 1851, Ledra) n. comb. and D. conspicuus (Distant 1907, Petalocephala) n. comb. Destinoides fasciata Cai & He, 2000 is placed as a junior synonym of D. latifrons, syn. nov. These two species are redescribed and illustrated in detail and a key is given based on the males.

  3. The Analysis of Some Contemporary Computer Mikrosystems

    Directory of Open Access Journals (Sweden)

    Angelė Kaulakienė

    2011-04-01

    Full Text Available In every language a twofold process can be observed: (1) a huge surge of new terms, and (2) a large part of these new terms making their way into the common language. The nucleus of the vocabulary and the grammatical system of the common language constitute the essence of a language and its national originality. Because of such intensive development, terminological lexis may in the future become a basis of the common language, and it ought to be not a spontaneously formed sum of terminological lexis but an entirety of consciously created terms which meet the requirements of language, logic, and terminology. Computer terminology, by comparison with the terminology of other fields, is being created in a slightly unusual way. The first computation institutions in Lithuania were established in the early sixties, and a decade later there were a few computation centres and a number of key-operated and punch machines working. Together with the new computational technology many new devices, units, parts, phenomena, and characteristics appeared which needed naming. Specialists faced an obvious shortage of Lithuanian terms for computing equipment. In 1971 this gap was partly filled by „Rusų-lietuvių-anglų kalbų skaičiavimo technikos žodynas“ (Russian-Lithuanian-English dictionary of computing equipment), which for a long time (more than 20 years) was the only terminological dictionary of this field. Only during the nineties did a few dictionaries of different scope appear. Computer terminology from the ten dictionaries presently available shows that this 35-year period of computer terminology is a stage of its creation, the main features of which are reasonable synonymy (when both international and Lithuanian terms are used to name a concept) and variability. Such a state of Lithuanian computer terminology is predetermined by linguistic, interlinguistic, and sociolinguistic factors. At present in Lithuania terminological dictionaries of various fields are being given to

  4. Adapting computational text analysis to social science (and vice versa

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Full Text Available Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models trained on labeled data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.
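
    The unsupervised-versus-supervised contrast the abstract draws can be made concrete with a small sketch. The corpus and labels below are placeholders, and the choice of LDA and logistic regression is illustrative rather than anything the article prescribes.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

docs = ["the senate passed the budget bill",
        "the team won the championship game",
        "lawmakers debated the new tax bill",
        "the striker scored in the final game"]
labels = [0, 1, 0, 1]  # hypothetical: 0 = politics, 1 = sports

X = CountVectorizer().fit_transform(docs)

# Unsupervised exploration (common in social science): discover latent topics
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(X)

# Supervised training (common in computer science): learn labels from data
clf = LogisticRegression().fit(X, labels)
print(topics.round(2), clf.predict(X))
```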

  5. Wing analysis using a transonic potential flow computational method

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    The ability of the method to compute wing transonic performance was determined by comparing computed results with both experimental data and results computed by other theoretical procedures. Both pressure distributions and aerodynamic forces were evaluated. Comparisons indicated that the method is a significant improvement in transonic wing analysis capability. In particular, the computational method generally calculated the correct development of three-dimensional pressure distributions from subcritical to transonic conditions. Complicated, multiple shocked flows observed experimentally were reproduced computationally. The ability to identify the effects of design modifications was demonstrated both in terms of pressure distributions and shock drag characteristics.

  6. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    The development of a general purpose computer system, THESEUS, is described; its initial use has been magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, while others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted to a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and the problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  7. Cloud Computing for Rigorous Coupled-Wave Analysis

    Directory of Open Access Journals (Sweden)

    N. L. Kazanskiy

    2012-01-01

    Full Text Available Design and analysis of complex nanophotonic and nanoelectronic structures require significant computing resources. Cloud computing infrastructure allows distributed parallel applications to achieve greater scalability and fault tolerance. The problems of effective use of high-performance computing systems for modeling and simulation of subwavelength diffraction gratings are considered. Rigorous coupled-wave analysis (RCWA is adapted to cloud computing environment. In order to accomplish this, data flow of the RCWA is analyzed and CPU-intensive operations are converted to data-intensive operations. The generated data sets are structured in accordance with the requirements of MapReduce technology.

  8. Computational and Physical Analysis of Catalytic Compounds

    Science.gov (United States)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important for exploiting their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as Gamess and Chemcraft have been used to compute the efficiencies of the catalytic compounds and the bonding energy changes during optimization convergence. The results illustrate how the metal oxides stabilize and the steps this takes. The curve of computation step (N) versus energy (kcal/mol) shows that the energy of the titania converges at the 7th iteration, whereas the silica converges at the 9th iteration.

  9. Affect and Learning : a computational analysis

    NARCIS (Netherlands)

    Broekens, Douwe Joost

    2007-01-01

    In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation

  10. The Role of the CAI-1 Fatty Acid Tail in the Vibrio cholerae Quorum Sensing Response

    Science.gov (United States)

    Perez, Lark J.; Ng, Wai-Leung; Marano, Paul; Brook, Karolina; Bassler, Bonnie L.; Semmelhack, Martin F.

    2013-01-01

    Quorum sensing is a mechanism of chemical communication among bacteria that enables collective behaviors. In V. cholerae, the etiological agent of the disease cholera, quorum sensing controls group behaviors including virulence factor production and biofilm formation. The major V. cholerae quorum-sensing system consists of the extracellular signal molecule called CAI-1 and its cognate membrane bound receptor called CqsS. Here, the ligand binding activity of CqsS is probed with structural analogs of the natural signal. Enabled by our discovery of a structurally simplified analog of CAI-1, we prepared and analyzed a focused library. The molecules were designed to probe the effects of conformational and structural changes along the length of the fatty acid tail of CAI-1. Our results, combined with pharmacophore modeling, suggest a molecular basis for signal molecule recognition and receptor fidelity with respect to the fatty acid tail portion of CAI-1. These efforts provide novel probes to enhance discovery of anti-virulence agents for the treatment of V. cholerae. PMID:23092313

  11. Calcium-aluminum-rich inclusions with fractionation and unknown nuclear effects (FUN CAIs)

    DEFF Research Database (Denmark)

    Krot, Alexander N.; Nagashima, Kazuhide; Wasserburg, Gerald J.;

    2014-01-01

    We present a detailed characterization of the mineralogy, petrology, and oxygen isotopic compositions of twelve FUN CAIs, including C1 and EK1-4-1 from Allende (CV), that were previously shown to have large isotopic fractionation patterns for magnesium and oxygen, and large isotopic anomalies of ...

  12. Computational thermo-fluid analysis of a disk brake

    Science.gov (United States)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kuraishi, Takashi; Tabata, Shinichiro; Takagi, Hirokazu

    2016-06-01

    We present computational thermo-fluid analysis of a disk brake, including thermo-fluid analysis of the flow around the brake and heat conduction analysis of the disk. The computational challenges include proper representation of the small-scale thermo-fluid behavior, high-resolution representation of the thermo-fluid boundary layers near the spinning solid surfaces, and bringing the heat transfer coefficient (HTC) calculated in the thermo-fluid analysis of the flow to the heat conduction analysis of the spinning disk. The disk brake model used in the analysis closely represents the actual configuration, and this adds to the computational challenges. The components of the method we have developed for computational analysis of the class of problems with these types of challenges include the Space-Time Variational Multiscale method for coupled incompressible flow and thermal transport, the ST Slip Interface method for high-resolution representation of the thermo-fluid boundary layers near spinning solid surfaces, and a set of projection methods for bringing in the HTC calculated in the thermo-fluid analysis for different parts of the disk. With the HTC coming from the thermo-fluid analysis of the flow around the brake, we do the heat conduction analysis of the disk, from the start of the braking until the disk spinning stops, demonstrating how the method developed works in computational analysis of this complex and challenging problem.

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  14. High-Throughput Proteomic Approaches to the Elucidation of Potential Biomarkers of Chronic Allograft Injury (CAI

    Directory of Open Access Journals (Sweden)

    Hilary Cassidy

    2013-09-01

    Full Text Available This review focuses on the role of OMICs technologies, concentrating in particular on proteomics, in biomarker discovery in chronic allograft injury (CAI). CAI is the second most prevalent cause of allograft dysfunction and loss in the first decade post-transplantation, after death with functioning graft (DWFG). The term CAI, sometimes referred to as chronic allograft nephropathy (CAN), describes the deterioration of renal allograft function and structure as a result of immunological processes (chronic antibody-mediated rejection) and other non-immunological factors such as calcineurin inhibitor (CNI)-induced nephrotoxicity, hypertension and infection. Current methods for assessing allograft function are costly, insensitive and invasive; traditional kidney function measurements such as serum creatinine and glomerular filtration rate (GFR) display poor predictive abilities, while the current "gold standard", histological diagnosis with a renal biopsy, presents its own inherent risks to the overall health of the allograft. As early as two years post-transplantation, protocol biopsies have shown that more than 50% of allograft recipients have mild CAN; ten years post-transplantation more than 50% of allograft recipients have progressed to severe CAN, which is associated with diminishing graft function. Thus, there is a growing medical requirement for minimally invasive biomarkers capable of identifying the early stages of the disease, which would allow for timely intervention. Proteomics involves the study of the expression, localization, function and interaction of the proteome. Proteomic technologies may be powerful tools used to identify novel biomarkers which would predict CAI in susceptible individuals. In this paper we review the use of proteomics in the elucidation of novel predictive biomarkers of CAI in clinical, animal and in vitro studies.

  15. The Effect of Computer Assisted Instruction on Elementary Reading and Writing Achievement

    Directory of Open Access Journals (Sweden)

    H. Gülhan ORHAN KARSAK

    2014-01-01

    Full Text Available The research investigated the effect of computer assisted instruction (CAI) on elementary reading and writing achievement (ERWA). The sample consisted of 64 first graders (32 in the experimental group and 32 in the control group) in the 2006-2007 academic year. This quasi-experimental study had a posttest-only control group design and was conducted during the first semester. The experimental group was taught by CAI and the control group by traditional instruction. Data were gathered through a ‘Parent Questionnaire’, ‘Reading Concepts Scale’, ‘Achievement Test’, and ‘Reading and Handwriting Observation Form’, and analyzed by chi-square, frequency and t tests in SPSS 12.0. The main findings of the study were as follows: (1) CAI affected first graders’ handwriting, reading fluency and punctuation; (2) CAI did not affect their writing and reading comprehension; (3) CAI affected the ERWA of those who did not have a computer at home.
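
    As a minimal sketch of the kind of group comparison reported here (the study's actual data are not in the record; the scores below are invented), an independent-samples t test in SciPy looks like this:

```python
import numpy as np
from scipy import stats

# Hypothetical posttest scores for the CAI and traditional-instruction groups
cai_scores = np.array([78, 85, 92, 74, 88, 81, 90, 79])
traditional = np.array([70, 82, 75, 68, 80, 73, 77, 71])

t_stat, p_value = stats.ttest_ind(cai_scores, traditional)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 suggests a group difference
```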

  16. Computational Music Structure Analysis (Dagstuhl Seminar 16092)

    OpenAIRE

    Müller, Meinard; Chew, Elaine; Bello, Juan Pablo

    2016-01-01

    Music is a ubiquitous and vital part of the lives of billions of people worldwide. Musical creations and performances are among the most complex and intricate of our cultural artifacts, and the emotional power of music can touch us in surprising and profound ways. In view of the rapid and sustained growth of digital music sharing and distribution, the development of computational methods to help users find and organize music information has become an important field of research in both indust...

  17. Computational analysis of viscoelastic free surface flows

    OpenAIRE

    Edussuriya, Suchitra Samanthi

    2003-01-01

    The demand for increasingly small and lightweight products requires micro-scale components made of materials which are durable and light. Polymers have therefore become a popular choice, since they can be used to produce materials which meet industrial requirements. Many of these polymers are viscoelastic fluids. The reduction in the size of components makes physical experimentation difficult and costly. Therefore computational tools are being sought to replace old methods of testing. This ...

  18. Analysis of computed tomography of ovarian tumor

    Energy Technology Data Exchange (ETDEWEB)

    Omura, Makoto; Taniike, Keiko; Nishiguchi, Hiroyasu

    1987-07-01

    One hundred and twenty six patients with ovarian masses were studied with computed tomography (CT) and classified into five groups according to margin and inner structure. The incidence of malignancy was low for cystic ovarian masses with a smooth margin and high for solid ovarian masses with an irregular margin. Three cases (6.7 %) of malignant ovarian tumor demonstrated a completely cystic pattern. Ovarian teratomas contained a well-defined component of fat density.

  19. Computer-aided Analysis of Physiological Systems

    OpenAIRE

    Balázs Benyó

    2007-01-01

    This paper presents the recent biomedical engineering research activity of the Medical Informatics Laboratory at the Budapest University of Technology and Economics. The research projects are carried out in the fields as follows: Computer aided identification of physiological systems; Diabetic management and blood glucose control; Remote patient monitoring and diagnostic system; Automated system for analyzing cardiac ultrasound images; Single-channel hybrid ECG segmentation; Event recognition and ...

  20. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  1. Computational morphology a computational geometric approach to the analysis of form

    CERN Document Server

    Toussaint, GT

    1988-01-01

    Computational Geometry is a new discipline of computer science that deals with the design and analysis of algorithms for solving geometric problems. There are many areas of study in different disciplines which, while being of a geometric nature, have as their main component the extraction of a description of the shape or form of the input data. This notion is more imprecise and subjective than pure geometry. Such fields include cluster analysis in statistics, computer vision and pattern recognition, and the measurement of form and form-change in such areas as stereology and developmental biolo

  2. Computational Neural Networks: A New Paradigm for Spatial Analysis

    OpenAIRE

    Fischer, M.M.

    1996-01-01

    In this paper a systematic introduction to computational neural network models is given in order to help spatial analysts learn about this exciting new field. The power of computational neural networks vis-à-vis conventional modelling is illustrated for an application field with noisy data of limited record length: spatial interaction modelling of telecommunication data in Austria. The computational appeal of neural networks for solving some fundamental spatial analysis problems is summarized...

  3. Proceedings Seventh International Conference on Computability and Complexity in Analysis

    CERN Document Server

    Zheng, Xizhong; 10.4204/EPTCS.24

    2010-01-01

    This volume of the Electronic Proceedings in Theoretical Computer Science (EPTCS) contains extended abstracts of talks to be presented at the Seventh International Conference on Computability and Complexity in Analysis (CCA 2010) that will take place in Zhenjiang, China, June 21-25, 2010. This conference is the seventeenth event in the series of CCA annual meetings. The CCA conferences are aimed at promoting the study and advancement of the theory of computability and complexity over real-valued data and its application.

  4. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  5. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks such as DDoS attacks and network scanning.
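
    A minimal sketch of correlation-matrix-based anomaly detection of the sort the abstract mentions: compute a baseline correlation matrix over normal traffic windows and flag windows whose correlation structure drifts from it. The feature names, synthetic data, and deviation threshold are all assumptions.

```python
import numpy as np

def corr_distance(window, baseline_corr):
    """Frobenius distance between a window's correlation matrix and the baseline."""
    c = np.corrcoef(window, rowvar=False)
    return float(np.linalg.norm(c - baseline_corr))

rng = np.random.default_rng(0)
# Columns: hypothetical traffic features, e.g. packets/s, bytes/s, flow count
mixing = np.array([[1.0, 0.8, 0.1], [0.0, 1.0, 0.1], [0.0, 0.0, 1.0]])
normal = rng.normal(size=(500, 3)) @ mixing   # correlated "normal" traffic
baseline = np.corrcoef(normal, rowvar=False)

suspect = rng.normal(size=(100, 3))           # correlation structure broken (attack-like)
threshold = 0.5                               # assumed; tune on held-out normal windows
print("anomaly" if corr_distance(suspect, baseline) > threshold else "normal")
```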

  6. The Utility of Computer-Assisted Power Analysis Lab Instruction

    Science.gov (United States)

    Petrocelli, John V.

    2007-01-01

    Undergraduate students (N = 47), enrolled in 2 separate psychology research methods classes, evaluated a power analysis lab demonstration and homework assignment. Students attended 1 of 2 lectures that included a basic introduction to power analysis and sample size analysis. One lecture included a demonstration of how to use a computer-based power…

  7. Analysis of service-oriented computing systems

    OpenAIRE

    Ivanovic, Dragan

    2013-01-01

    Service-Oriented Computing (SOC) has established itself as a widely accepted paradigm for the development of flexible, distributed and adaptable software systems, in which service compositions carry out the more complex or higher-level tasks, frequently inter-organizational tasks, using atomic services or other service compositions. In such systems, the Quality of Service (QoS) properties, ...

  8. Computer-aided Analysis of Physiological Systems

    Directory of Open Access Journals (Sweden)

    Balázs Benyó

    2007-12-01

    Full Text Available This paper presents the recent biomedical engineering research activity of the Medical Informatics Laboratory at the Budapest University of Technology and Economics. The research projects are carried out in the fields as follows: Computer aided identification of physiological systems; Diabetic management and blood glucose control; Remote patient monitoring and diagnostic system; Automated system for analyzing cardiac ultrasound images; Single-channel hybrid ECG segmentation; Event recognition and state classification to detect brain ischemia by means of EEG signal processing; Detection of breathing disorders like apnea and hypopnea; Molecular biology studies with DNA-chips; Evaluation of the cry of normal hearing and hard of hearing infants.

  9. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professionals. November 14, 2008, Heerlen, The Netherlands: Open Universiteit Nederland.

  10. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  11. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  12. Process for computing geometric perturbations for probabilistic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fitch, Simeon H. K. (Charlottesville, VA); Riha, David S. (San Antonio, TX); Thacker, Ben H. (San Antonio, TX)

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  13. Atomic physics: computer calculations and theoretical analysis

    OpenAIRE

    Drukarev, E. G.

    2004-01-01

    It is demonstrated how theoretical analysis preceding numerical calculations helps to calculate the energy of the ground state of the helium atom, and makes it possible to avoid qualitative errors in calculations of the characteristics of double photoionization.
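
    A classic example of such analysis (a standard textbook result, not necessarily the calculation used in the paper) is the variational estimate of the helium ground state with a hydrogenic trial wavefunction of effective charge ζ, in Hartree units:

```latex
E(\zeta) \;=\; \zeta^{2} - 2Z\zeta + \tfrac{5}{8}\zeta, \qquad
\frac{dE}{d\zeta} = 0 \;\Rightarrow\; \zeta = Z - \tfrac{5}{16}, \qquad
E_{\min} = -\Bigl(Z - \tfrac{5}{16}\Bigr)^{2}.
```

    For Z = 2 this gives E_min ≈ -2.848 Hartree, close to the measured -2.904 Hartree, showing how the analysis pins down the form of the answer before any numerical work begins.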

  14. Two computer programs for the analysis of marine magnetic data

    Digital Repository Service at National Institute of Oceanography (India)

    Rao, M.M.M.; Lakshminarayana, S.; Murthy, K.S.R.; Subrahmanyam, A.S.

    Computers & Geosciences, Vol. 19, No. 5, pp. 657-672, 1993.

  15. Interactive Computer Lessons for Introductory Economics: Guided Inquiry-From Supply and Demand to Women in the Economy.

    Science.gov (United States)

    Miller, John; Weil, Gordon

    1986-01-01

    The interactive feature of computers is used to incorporate a guided inquiry method of learning introductory economics, extending the Computer Assisted Instruction (CAI) method beyond drills. (Author/JDH)

  16. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science treats conference publications as first-class research outputs. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their work with peers. Previous work on knowledge mapping focused on mapping all sciences or a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and workshop proceedings, which results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...

  17. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing, Software and Analysis challenge (CSA06) last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  19. Computer teaching process optimization strategy analysis of thinking ability

    Directory of Open Access Journals (Sweden)

    Luo Liang

    2016-01-01

    Full Text Available As is well known, the computer course is one of the basic courses for college students at universities, laying a theoretical foundation for subsequent professional learning. In recent years, countries and universities have attached great importance to computer teaching for young college students, with the purpose of improving students' thinking ability and ultimately promoting their ability to use computational thinking to analyze and solve the problems of daily life. Therefore, this article further discusses and analyzes how to optimize the cultivation of computational thinking in the process of computer teaching, and then explores strategies and methods to promote the cultivation of thinking ability and optimize the computer

  20. COMPUTER DATA ANALYSIS AND MODELING: COMPLEX STOCHASTIC DATA AND SYSTEMS

    OpenAIRE

    2010-01-01

    This collection of papers comprises the proceedings of the Ninth International Conference “Computer Data Analysis and Modeling: Complex Stochastic Data and Systems”, organized by the Belarusian State University and held in September 2010 in Minsk. The papers are devoted to the following topical problems: robust and nonparametric data analysis; statistical analysis of time series and forecasting; multivariate data analysis; design of experiments; statistical signal and image processing...

  1. Computational Analysis of Safety Injection Tank Performance

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Oan; Nietiadia, Yohanes Setiawan; Lee, Jeong Ik [KAIST, Daejeon (Korea, Republic of); Addad, Yacine; Yoon, Ho Joon [Khalifa University of Science Technology and Research, Abu Dhabi (United Arab Emirates)

    2015-10-15

    The APR 1400 is a large pressurized water reactor (PWR). Like many other water reactors, it has an emergency core cooling system (ECCS). One of the most important components in the ECCS is the safety injection tank (SIT). Inside the SIT, a fluidic device is installed, which passively controls the mass flow of the safety injection and eliminates the need for low pressure safety injection pumps. As more passive safety mechanisms are pursued, it has become more important to understand the flow structure and the loss mechanism within the fluidic device. Current computational fluid dynamics (CFD) calculations have had limited success in predicting the fluid flow accurately. This study proposes to find a more exact result using CFD and more realistic modeling. The SIT of the APR1400 was analyzed using MARS and CFD. A CFD calculation was executed first to obtain the form loss factor. Using the two form loss factors, one from the vendor and one from the calculation, MARS calculations were performed and compared with experiment. The accumulator model in MARS was quite accurate in predicting the water level. The pipe model showed some differences from the experimental data in the water level.
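
    For context, the form loss factor K mentioned above is conventionally defined through the pressure drop across the device; this is a standard hydraulics relation, not something specific to this paper:

```latex
\Delta P \;=\; K\,\frac{\rho v^{2}}{2}
\quad\Longrightarrow\quad
K \;=\; \frac{2\,\Delta P}{\rho v^{2}},
```

    where ΔP is the pressure drop computed by CFD, ρ is the coolant density, and v is a reference velocity through the device. A K value extracted this way can then be supplied to a system code such as MARS as a lumped loss coefficient.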

  2. Analysis of computational vulnerabilities in digital repositories

    Directory of Open Access Journals (Sweden)

    Valdete Fernandes Belarmino

    2015-04-01

    Full Text Available Objective. Demonstrates the results of research that aimed to analyze the computational vulnerabilities of digital repositories in public universities. Argues the relevance of information in contemporary societies as an invaluable resource, emphasizing scientific information as an essential element of scientific progress. Characterizes the emergence of digital repositories and highlights their use in the academic environment to preserve, promote, disseminate and encourage scientific production. Describes the main software for the construction of digital repositories. Method. The investigation identified and analyzed the vulnerabilities to which digital repositories are exposed, using penetration testing and discriminating the levels of risk and the types of vulnerabilities. Results. From a sample of 30 repositories, 20 could be examined; of these, 5% of the repositories have critical vulnerabilities, 85% high, 25% medium, and 100% low. Conclusions. This demonstrates the necessity of adapting actions for these environments that promote informational security, minimizing the incidence of external and/or internal attacks.

  3. Local spatial frequency analysis for computer vision

    Science.gov (United States)

    Krumm, John; Shafer, Steven A.

    1990-01-01

    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.
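
    To make the combined space/frequency representation concrete, here is a minimal Python sketch (not the authors' code): it windows a patch around an image point and takes its 2-D FFT, so the dominant local frequency can be compared between two regions of a synthetic texture.

        import numpy as np

        def local_spectrum(image, y, x, w=32):
            # Magnitude spectrum of a w-by-w window centred at (y, x): a
            # crude space/frequency representation. A Hanning window limits
            # leakage from the patch edges.
            half = w // 2
            patch = image[y - half:y + half, x - half:x + half]
            win = np.outer(np.hanning(w), np.hanning(w))
            return np.abs(np.fft.fftshift(np.fft.fft2(patch * win)))

        # Synthetic texture: horizontal stripes, twice the frequency on the
        # right half of the image.
        yy, xx = np.mgrid[0:128, 0:128]
        img = np.where(xx < 64, np.sin(2 * np.pi * yy / 16), np.sin(2 * np.pi * yy / 8))
        left = local_spectrum(img, 64, 32)
        right = local_spectrum(img, 64, 96)
        print(np.unravel_index(left.argmax(), left.shape),
              np.unravel_index(right.argmax(), right.shape))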

  4. Adaptive computational methods for aerothermal heating analysis

    Science.gov (United States)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  5. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    International Nuclear Information System (INIS)

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and of verification and validation data is a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the high temperature gas-cooled reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate, with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics models developed in this study will improve our understanding of these phenomena. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. Results for the density-driven stratified flow in the inlet pipe are compared in part with experimental results.

  6. Formation of Refractory Metal Alloys and Their Occurrence in CAIs

    Science.gov (United States)

    Schwander, D.; Berg, T.; Ott, U.; Schönhense, G.; Palme, H.

    2012-09-01

    At the conference we will give an overview of the current state of our research on refractory metal nuggets (RMNs) from Murchison, Allende and Acfer 094, including statistical analysis of their compositions and structures in relation to condensation calculations.

  7. Analysis of airways in computed tomography

    DEFF Research Database (Denmark)

    Petersen, Jens

    have become the standard with which to assess emphysema extent but airway abnormalities have so far been more challenging to quantify. Automated methods for analysis are indispensable as the visible airway tree in a CT scan can include several hundreds of individual branches. However, automation...... of scan on airway dimensions in subjects with and without COPD. The results show measured airway dimensions to be affected by differences in the level of inspiration and this dependency is again influenced by COPD. Inspiration level should therefore be accounted for when measuring airways, and airway...

  8. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  9. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    Science.gov (United States)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a cluster architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the SLURM software resource manager and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs, able to reach a computing power of 300 gigaflops (300x10^9 floating-point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage in UFS configuration, plus 6 TB for the users' area. AVES was designed and built to solve the growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB, and increasing every year). The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload on the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained with a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of on-line data storage. The AVES software package consists of about 50 specific programs. The overall computing time, compared to that of a single-processor personal computer, has thus been improved by up to a factor of 70.
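
    A minimal sketch of the job-farming pattern described above, assuming a SLURM cluster; the script and chunk names (run_osa.sh, analyse_chunk.sh) are hypothetical, not part of the AVES package.

        import subprocess

        # Split the analysis into N independent jobs with a SLURM job array.
        # "run_osa.sh" and "analyse_chunk.sh" are hypothetical names.
        N = 30
        with open("run_osa.sh", "w") as f:
            f.write("#!/bin/bash\n"
                    "#SBATCH --array=0-%d\n"
                    "# Each array task analyses one chunk of science windows.\n"
                    "./analyse_chunk.sh ${SLURM_ARRAY_TASK_ID}\n" % (N - 1))
        subprocess.run(["sbatch", "run_osa.sh"], check=True)  # needs SLURM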

  10. Limited subsolidus diffusion in type B1 CAI: Evidence from Ti distribution in spinel

    Science.gov (United States)

    Meeker, G. P.; Quick, J. E.; Paque, Julie M.

    1993-01-01

    Most models of calcium-aluminum-rich inclusions (CAIs) have focused on early stages of formation by equilibrium crystallization of a homogeneous liquid. Less is known about the subsolidus cooling history of CAIs. Chemical and isotopic heterogeneities on a scale of tens to hundreds of micrometers (e.g. MacPherson et al. (1989) and Podosek et al. (1991)) suggest fairly rapid cooling with a minimum of subsolidus diffusion. However, transmission electron microscopy indicates that solid-state diffusion may have been an important process at a smaller scale (Barber et al. 1984). If so, chemical evidence for diffusion could provide constraints on cooling times and temperatures. With this in mind, we have begun an investigation of the Ti distribution in spinels from two type B1 CAIs from Allende to determine whether post-crystallization diffusion was a significant process. The type B1 CAIs 3529Z and 5241 have been described by Podosek et al. (1991) and by El Goresy et al. (1985) and MacPherson et al. (1989). We have analyzed spinels in these inclusions using the electron microprobe. These spinels are generally euhedral, range in size from less than 10 to 15 microns, and are poikilitically enclosed by millimeter-sized pyroxene, melilite, and anorthite. Analyses were obtained from both the mantles and cores of the inclusions. Compositions of pyroxene in the vicinity of individual spinel grains were obtained by analyzing at least two points on opposite sides of the spinel and averaging the compositions. The pyroxene analyses were obtained within 15 microns of the spinel-pyroxene interface. No compositional gradients were observed within single spinel crystals. Ti concentrations in spinels included within pyroxene, melilite, and anorthite are presented.

  11. A Computational Discriminability Analysis on Twin Fingerprints

    Science.gov (United States)

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical twins and fraternal twins) recently collected from a twin population, using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of fingerprints of twins is significantly different from that between genuine prints of the same finger. Twins' fingerprints are discriminable, with a 1.5%~1.7% higher EER (equal error rate) than non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.
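
    For readers unfamiliar with the EER metric used here: it is the operating point where the false accept rate equals the false reject rate. A small Python sketch with synthetic similarity scores (not the study's data):

        import numpy as np

        def equal_error_rate(genuine, impostor):
            # EER: threshold where the false accept rate (impostor scores
            # at or above threshold) equals the false reject rate (genuine
            # scores below it). Higher score = more likely same finger.
            thresholds = np.sort(np.concatenate([genuine, impostor]))
            far = np.array([(impostor >= t).mean() for t in thresholds])
            frr = np.array([(genuine < t).mean() for t in thresholds])
            i = np.argmin(np.abs(far - frr))
            return (far[i] + frr[i]) / 2

        # Synthetic scores for illustration; not the study's data.
        rng = np.random.default_rng(0)
        genuine = rng.normal(0.8, 0.1, 1000)
        impostor = rng.normal(0.5, 0.1, 1000)
        print(f"EER ~ {equal_error_rate(genuine, impostor):.3f}")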

  12. A computer analysis of the Schreber Memoirs.

    Science.gov (United States)

    Klein, R H

    1976-06-01

    With the aid of a computerized system for content analysis, WORDS, the complete Schreber Memoirs was subjected to various multivariate reduction techniques in order to investigate the major content themes of this document. The findings included the prevalence of somatic concerns throughout the Memoirs, clear references to persecutory ideas and to Schreber's assumption of a redemptive role, complex encapsulated concerns about Schreber's relationship with God, a lack of any close relationship between sexuality and sexual transformation either to themes of castration or procreation, and the fact that neither sun, God, nor Flechsig was significantly associated with clusters concerning gender, sexuality, or castration. These findings are discussed in relation to psychodynamic interpretations furnished by prior investigators who employed different research methods.

  13. Numerical investigation of CAI Combustion in the Opposed- Piston Engine with Direct and Indirect Water Injection

    Science.gov (United States)

    Pyszczek, R.; Mazuro, P.; Teodorczyk, A.

    2016-09-01

    This paper is focused on CAI combustion control in a turbocharged 2-stroke Opposed-Piston (OP) engine. The barrel-type OP engine arrangement is of particular interest to the authors because of its robust design, high mechanical efficiency and relatively easy incorporation of a Variable Compression Ratio (VCR). Another advantage of such a design is that the combustion chamber is formed between two moving pistons - there is no additional cylinder head to be cooled, which directly results in increased thermal efficiency. Furthermore, engine operation in a Controlled Auto-Ignition (CAI) mode at high compression ratios (CR) raises the possibility of reaching even higher efficiencies and very low emissions. In order to control CAI combustion, measures such as VCR and water injection were considered for indirect ignition timing control. Numerical simulations of the scavenging and combustion processes were performed with the 3D CFD multipurpose AVL Fire solver. Numerous cases were calculated with different engine compression ratios and different amounts of directly and indirectly injected water. The influence of the VCR and water injection on the ignition timing and engine performance was determined, and their application in a real engine was discussed.

  14. HIV-1 Capsid Assembly Inhibitor (CAI) Peptide: Structural Preferences and Delivery into Human Embryonic Lung Cells and Lymphocytes

    OpenAIRE

    Braun, Klaus; Frank, Martin; Pipkorn, Rüdiger; Reed, Jennifer; Spring, Herbert; Debus, Jürgen; Didinger, Bernd; von der Lieth, Claus-Wilhelm; Wiessler, Manfred; Waldeck, Waldemar

    2008-01-01

    The Human immunodeficiency virus 1 derived capsid assembly inhibitor peptide (HIV-1 CAI-peptide) is a promising lead candidate for anti-HIV drug development. Its drawback, however, is that it cannot permeate cells directly. Here we report the transport of the pharmacologically active CAI-peptide into human lymphocytes and Human Embryonic Lung cells (HEL) using the BioShuttle platform. Generally, the transfer of pharmacologically active substances across membranes, demonstrated by confocal las...

  15. Hunting and use of terrestrial fauna used by Caiçaras from the Atlantic Forest coast (Brazil)

    OpenAIRE

    Alves Rômulo RN; Hanazaki Natalia; Begossi Alpina

    2009-01-01

    Abstract Background The Brazilian Atlantic Forest is considered one of the hotspots for conservation, comprising remnants of rain forest along the eastern Brazilian coast. Its native inhabitants on the southeastern coast include the Caiçaras (descendants of Amerindians and European colonizers), with a deep knowledge of the natural resources used for their livelihood. Methods We studied the use of terrestrial fauna in three Caiçara communities, through open-ended interviews with 116 nati...

  16. Computer-based modelling and analysis in engineering geology

    OpenAIRE

    Giles, David

    2014-01-01

    This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...

  17. Strategic Analysis of Autodesk and the Move to Cloud Computing

    OpenAIRE

    Kewley, Kathleen

    2012-01-01

    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  18. Benefits of Computer Based Content Analysis to Foresight

    OpenAIRE

    Kováříková, Ludmila; Grosová, Stanislava

    2014-01-01

    Purpose of the article: The present manuscript summarizes benefits of the use of computer-based content analysis in a generation phase of foresight initiatives. Possible advantages, disadvantages and limitations of the content analysis for the foresight projects are discussed as well. Methodology/methods: In order to specify the benefits and identify the limitations of the content analysis within the foresight, results of the generation phase of a particular foresight project perf...

  19. Structural Analysis Using Computer Based Methods

    Science.gov (United States)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. These data are also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were made to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  1. Computer programs for analysis of geophysical data

    International Nuclear Information System (INIS)

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the medium are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches to seismic investigation of a geological medium involve measurements of time-limited determinate signals from artificial or natural sources. However, continuous seismic oscillations, like endogenous microseisms, coda and scattered waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the medium or images of the noisy objects. Thus, in contrast to classic seismology, where narrow windows are used to get the best time resolution of seismic signals, our method requires long record lengths for the best spatial resolution.

  2. Computer-Assisted Learning Design for Reflective Practice Supporting Multiple Learning Styles for Education and Training in Pre-Hospital Emergency Care.

    Science.gov (United States)

    Jones, Indra; Cookson, John

    2001-01-01

    Students in paramedic education used a model combining computer-assisted instruction (CAI), reflective practice, and learning styles. Although reflective practice normally requires teacher-student interaction, CAI with reflective practice embedded enabled students to develop learning style competencies and achieve curricular outcomes. (SK)

  3. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  4. Large-scale temporal analysis of computer and information science

    Science.gov (United States)

    Soos, Sandor; Kampis, George; Gulyás, László

    2013-09-01

    The main aim of the project reported in this paper was twofold. One of the primary goals was to produce an extensive source of network data for bibliometric analyses of field dynamics in the case of Computer and Information Science. To this end, we rendered the raw material of the DBLP computer and infoscience bibliography into a comprehensive collection of dynamic network data, promptly available for further statistical analysis. The other goal was to demonstrate the value of our data source via its use in mapping Computer and Information Science (CIS). An analysis of the evolution of CIS was performed in terms of collaboration (co-authorship) network dynamics. Dynamic network analysis covered three quarters of a century (76 years, from 1936 to date). Network evolution was described both at the macro and the meso level (in terms of community characteristics). Results show that the development of CIS followed what appears to be a universal pattern of growing into a "mature" discipline.
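
    The co-authorship networks described above can be sketched in a few lines of Python; the records below are invented stand-ins for DBLP entries, and networkx is assumed to be installed.

        import itertools
        import networkx as nx  # third-party: pip install networkx

        # Invented stand-ins for bibliography records.
        records = [
            {"year": 1995, "authors": ["A", "B"]},
            {"year": 1995, "authors": ["B", "C", "D"]},
            {"year": 1996, "authors": ["A", "C"]},
        ]
        snapshots = {}  # one co-authorship graph per year
        for rec in records:
            g = snapshots.setdefault(rec["year"], nx.Graph())
            for u, v in itertools.combinations(rec["authors"], 2):
                w = g.get_edge_data(u, v, {"weight": 0})["weight"]
                g.add_edge(u, v, weight=w + 1)  # papers co-authored together

        for year, g in sorted(snapshots.items()):
            print(year, g.number_of_nodes(), g.number_of_edges())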

  5. Parallel computation of seismic analysis of high arch dam

    Institute of Scientific and Technical Information of China (English)

    Chen Houqun; Ma Huaifa; Tu Jin; Cheng Guangqing; Tang Juzhen

    2008-01-01

    Parallel computation programs are developed for three-dimensional meso-mechanics analysis of fully-graded dam concrete and seismic response analysis of high arch dams (ADs), based on the Parallel Finite Element Program Generator (PFEPG). The computational algorithms of the numerical simulation of the meso-structure of concrete specimens were studied. Taking into account damage evolution, static preload, strain rate effect, and the heterogeneity of the meso-structure of dam concrete, the fracture processes of damage evolution and configuration of the cracks can be directly simulated. In the seismic response analysis of ADs, all the following factors are involved, such as the nonlinear contact due to the opening and slipping of the contraction joints, energy dispersion of the far-field foundation, dynamic interactions of the dam-foundation-reservoir system, and the combining effects of seismic action with all static loads. The correctness, reliability and efficiency of the two parallel computational programs are verified with practical illustrations.

  6. COMPUTING

    CERN Document Server

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  8. First Experiences with LHC Grid Computing and Distributed Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fisk, Ian

    2010-12-01

    In this presentation the experiences of the LHC experiments using grid computing were presented, with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. At the end, the expected evolution and future plans are outlined.

  9. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  10. Computational Fluid Dynamics Analysis of Thoracic Aortic Dissection

    Science.gov (United States)

    Tang, Yik; Fan, Yi; Cheng, Stephen; Chow, Kwok

    2011-11-01

    Thoracic Aortic Dissection (TAD) is a cardiovascular disease with high mortality. An aortic dissection is formed when blood infiltrates the layers of the vascular wall, and a new artificial channel, the false lumen, is created. The expansion of the blood vessel due to the weakened wall enhances the risk of rupture. Computational fluid dynamics analysis is performed to study the hemodynamics of this pathological condition. Both idealized geometries and realistic patient configurations from computed tomography (CT) images are investigated. Physiological boundary conditions from in vivo measurements are employed. Flow configuration and biomechanical forces are studied. Quantitative analysis allows clinicians to assess the risk of rupture in making decisions regarding surgical intervention.

  11. Rigorous computer analysis of the Chow-Robbins game

    CERN Document Server

    Häggström, Olle

    2012-01-01

    Flip a coin repeatedly, and stop whenever you want. Your payoff is the proportion of heads, and you wish to maximize this payoff in expectation. This so-called Chow-Robbins game is amenable to computer analysis, but while simple-minded number crunching can show that it is best to continue in a given position, establishing rigorously that stopping is optimal seems at first sight to require "backward induction from infinity". We establish a simple upper bound on the expected payoff in a given position, allowing efficient and rigorous computer analysis of positions early in the game. In particular we confirm that with 5 heads and 3 tails, stopping is optimal.
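
    The easy direction mentioned above (showing that continuing beats stopping) can be reproduced with a short finite-horizon backward induction: forcing a stop at the horizon yields a rigorous lower bound on the value of continuing. A minimal Python sketch, with a deliberately small horizon:

        from functools import lru_cache

        N = 30  # finite horizon: positions with h + t >= N are forced to stop

        @lru_cache(maxsize=None)
        def lower_value(h, t):
            # Lower bound on the value of the position (h heads, t tails),
            # obtained by forbidding play beyond the horizon N.
            p = h / (h + t) if h + t > 0 else 0.0
            if h + t >= N:
                return p
            cont = 0.5 * lower_value(h + 1, t) + 0.5 * lower_value(h, t + 1)
            return max(p, cont)

        # With 2 heads and 3 tails the current proportion is 0.4; if the
        # bound exceeds 0.4, continuing is provably better than stopping.
        print(lower_value(2, 3), 2 / 5)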

  12. Computer vision approaches to medical image analysis. Revised papers

    International Nuclear Information System (INIS)

    This book constitutes the thoroughly refereed post-proceedings of the international workshop Computer Vision Approaches to Medical Image Analysis, CVAMIA 2006, held in Graz, Austria in May 2006 as a satellite event of the 9th European Conference on Computer Vision, ECCV 2006. The 10 revised full papers and 11 revised poster papers presented together with 1 invited talk were carefully reviewed and selected from 38 submissions. The papers are organized in topical sections on clinical applications, image registration, image segmentation and analysis, and the poster session. (orig.)

  13. COMPUTING

    CERN Document Server

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  14. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  15. Comparative Analysis of the Techniques Used by Men's Badminton Doubles Players Cai Yun/Fu Haifeng and Zheng Zaicheng/Lee Yong Dae

    Institute of Scientific and Technical Information of China (English)

    Huang Zhuo

    2012-01-01

    Using video observation, mathematical statistics and other methods, this article compares the technical characteristics of the Chinese men's doubles pair (Cai Yun/Fu Haifeng) and the South Korean men's doubles pair (Zheng Zaicheng/Lee Yong Dae) across six international badminton tournaments played in 2011. The research shows that opponents now prepare tactics specifically against Fu Haifeng's power smash, reducing their use of lifts and thereby weakening Fu Haifeng's attacking threat. At the net the two pairs score almost equally per match, but in net spins and tumbling net shots, the quality of net play, the variation of net-shot lines and placement, and the ability to anticipate the opponent's shots, the Chinese pair falls short of their opponents. Zheng Zaicheng/Lee Yong Dae show strong defensive counter-attacking awareness, with fast flat drives, varied shot trajectories and precise placement.

  16. Advances in computational design and analysis of airbreathing propulsion systems

    Science.gov (United States)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  17. Analysis of the computed tomography in the acute abdomen

    International Nuclear Information System (INIS)

    Introduction: This study aims to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal, prospective study of patients with a diagnosis of acute abdomen. Of 105 cases of acute abdomen, 28 patients were included in the study after application of the exclusion criteria. Results: Computed tomography changed the physicians' diagnostic hypothesis in 50% of the cases (p < 0.05); 78.57% of the patients had a surgical indication before computed tomography and 67.86% after it (p = 0.0546). An accurate diagnosis by computed tomography, compared with the anatomopathologic examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was repeated dividing the patients into surgical and nonsurgical groups, an accuracy of 89.28% was obtained (p < 0.0001). A reduction of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen managed without computed tomography. Conclusion: Computed tomography correlates well with the anatomopathologic findings and has great accuracy for surgical indication; it increases the physicians' confidence, reduces hospitalization time, reduces the number of surgeries and is cost-effective. (author)

  18. AKSATINT - SATELLITE INTERFERENCE ANALYSIS AND SIMULATION USING PERSONAL COMPUTERS

    Science.gov (United States)

    Kantak, A.

    1994-01-01

    In the late seventies, the number of communication satellites in service increased, and interference became an increasingly important consideration in designing satellite/ground station communications systems. Satellite Interference Analysis and Simulation Using Personal Computers, AKSATINT, models the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both the desired and the interfering satellites are considered to be in elliptical orbits. The simulation contains computation of orbital positions of both satellites using classical orbital elements, calculation of the satellite antenna look angles for both satellites and elevation angles at the desired-satellite ground-station antenna, and computation of the Doppler effect due to the motions of the satellites and the Earth's rotation. AKSATINT also computes the interference-to-signal-power ratio, taking into account losses suffered by the links. After computing the interference-to-signal-power ratio, the program computes the statistical quantities. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal-power ratio. The program includes a flowchart, a sample run, and results of that run. AKSATINT is expected to be of general use to system designers and frequency managers in selecting the proper frequency under an interference scenario. The AKSATINT program is written in BASIC. It was designed to operate on the IBM Personal Computer AT or compatibles, and has been implemented under MS DOS 3.2. AKSATINT was developed in 1987.
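
    Of the quantities listed, the Doppler shift is the simplest to reproduce. The first-order formula below is the standard one; the carrier frequency and range rate are invented for illustration (AKSATINT itself is written in BASIC; Python is used here only as a sketch).

        C = 299_792_458.0  # speed of light [m/s]

        def doppler_shifted(f_carrier_hz, range_rate_m_s):
            # First-order Doppler: range rate is the radial velocity between
            # ground station and satellite (positive when receding).
            return f_carrier_hz * (1.0 - range_rate_m_s / C)

        # Hypothetical 6 GHz uplink receding at 2.5 km/s: about -50 kHz.
        f0 = 6.0e9
        print(doppler_shifted(f0, 2500.0) - f0)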

  19. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence approaches constitute a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built for a class of computational intelligence approaches, represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated during the search process of the model. Moreover, we give four types of probabilistic convergence for the solution-set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  20. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Science.gov (United States)

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reduction in size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and in process data apparently are available. PMID:24274405

  1. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu/~white. This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  2. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  3. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  4. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  5. Interactive computer code for dynamic and soil structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mulliken, J.S.

    1995-12-01

    A new interactive computer code is presented in this paper for dynamic and soil-structure interaction (SSI) analyses. The computer program FETA (Finite Element Transient Analysis) is a self-contained interactive graphics environment for IBM PCs that is used for the development of structural and soil models as well as post-processing dynamic analysis output. Full 3-D isometric views of the soil-structure system, animation of displacements, frequency and time domain responses at nodes, and response spectra are all graphically available simply by pointing and clicking with a mouse. FETA's finite element solver performs 2-D and 3-D frequency and time domain soil-structure interaction analyses. The solver can be directly accessed from the graphical interface on a PC, or run on a number of other computer platforms.

  6. Finite element dynamic analysis on CDC STAR-100 computer

    Science.gov (United States)

    Noor, A. K.; Lambiotte, J. J., Jr.

    1978-01-01

    Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.
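
    The central difference scheme named above advances M u'' + K u = f(t) explicitly in time. A minimal Python sketch on a made-up 2-degree-of-freedom system (not the STAR-100 implementation):

        import numpy as np

        # Explicit central-difference stepping for M u'' + K u = f(t).
        # The 2-DOF system below is invented for illustration.
        M = np.diag([1.0, 1.0])
        K = np.array([[2.0, -1.0], [-1.0, 2.0]])
        f = lambda t: np.array([0.0, np.sin(t)])

        dt, steps = 0.01, 1000
        Minv = np.linalg.inv(M)
        u_prev = np.zeros(2)  # displacement at t - dt
        u = np.zeros(2)       # displacement at t
        for n in range(steps):
            u_next = 2 * u - u_prev + dt**2 * Minv @ (f(n * dt) - K @ u)
            u_prev, u = u, u_next
        print(u)  # displacement after steps * dt seconds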

  7. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from: P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  8. Computational Methods for the Analysis of Array Comparative Genomic Hybridization

    Directory of Open Access Journals (Sweden)

    Raj Chari

    2006-01-01

    Full Text Available Array comparative genomic hybridization (array CGH) is a technique for assaying the copy number status of cancer genomes. The widespread use of this technology has led to a rapid accumulation of high-throughput data, which in turn has prompted the development of computational strategies for the analysis of array CGH data. Here we explain the principles behind array image processing, data visualization and genomic profile analysis, review currently available software packages, and raise considerations for future software development.
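
    The first steps of the genomic profile analysis mentioned above, log-ratio computation and smoothing prior to segmentation, can be sketched as follows; the probe intensities are simulated, not real array CGH data.

        import numpy as np

        # Simulated probe intensities: a single-copy gain (ratio 2) in the
        # middle 100 probes of a 500-probe "chromosome".
        rng = np.random.default_rng(1)
        reference = rng.lognormal(8, 0.1, 500)
        tumour = reference * np.r_[np.ones(200), 2.0 * np.ones(100), np.ones(200)]
        tumour *= rng.lognormal(0, 0.05, 500)  # measurement noise

        log2_ratio = np.log2(tumour / reference)
        k = 15                                 # running-median window (odd)
        pad = k // 2
        padded = np.pad(log2_ratio, pad, mode="edge")
        smoothed = np.array([np.median(padded[i:i + k])
                             for i in range(len(log2_ratio))])
        print(smoothed[250])  # close to 1.0, i.e. log2 of the gain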

  9. Qualitative Research and Computer Analysis: New Challenges and Opportunities

    OpenAIRE

    Yuen, AHK

    2000-01-01

    The use of computers for Qualitative Data Analysis (QDA) in qualitative research has been growing rapidly in the last decade. QDA programs are software packages developed explicitly for the purpose of analyzing qualitative data. A range of different kinds of program is available for the handling and analysis of qualitative data, such as Atlas/ti, HyperRESEARCH, and NUD*IST. With the development of new technologies, the QDA software has advanced from the efficient code-and-retrieve ability to ...

  10. Computer automated movement detection for the analysis of behavior

    OpenAIRE

    Ramazani, Roseanna B.; Krishnan, Harish R.; Bergeson, Susan E.; Atkinson, Nigel S.

    2007-01-01

    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time-intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtractio...
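
    The digital subtraction step can be sketched in Python as follows; this is an illustration of frame differencing in general, not the authors' Perl/ImageMagick pipeline, and the file names are hypothetical (Pillow is assumed to be installed).

        import numpy as np
        from PIL import Image  # third-party: pip install pillow

        def movement_score(frame_a_path, frame_b_path, threshold=25):
            # Digital subtraction of two consecutive frames: the fraction
            # of pixels whose grey level changed by more than `threshold`
            # is a crude measure of movement between snapshots.
            a = np.asarray(Image.open(frame_a_path).convert("L"), dtype=np.int16)
            b = np.asarray(Image.open(frame_b_path).convert("L"), dtype=np.int16)
            return (np.abs(a - b) > threshold).mean()

        # Usage (file names hypothetical): a score near 0 means no movement.
        # print(movement_score("frame_0001.png", "frame_0002.png"))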

  11. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or c

  12. The NASA NASTRAN structural analysis computer program - New content

    Science.gov (United States)

    Weidman, D. J.

    1978-01-01

    Capabilities of a NASA-developed structural analysis computer program, NASTRAN, are evaluated with reference to finite-element modelling. Applications include the automotive industry as well as aerospace. It is noted that the range of sub-programs within NASTRAN has expanded, while keeping user cost low.

  13. Componential analysis of kinship terminology a computational perspective

    CERN Document Server

    Pericliev, Vladimir

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  14. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)]

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  15. Interactive computer system for analysis of dynamic renal studies

    International Nuclear Information System (INIS)

    An interactive computer system is described for a small minicomputer to be used in the evaluation of radionuclide scintiscanning studies of renal transplants and other dynamic kidney function studies. The package consists of programs for data acquisition, analysis, and report generation. As an added feature, the program dissociates the kidney view into total kidney, cortical, and medullary components

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  17. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    A mini-computer based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system

  18. Analysis of Computer-Mediated Communication: Using Formal Concept Analysis as a Visualizing Methodology.

    Science.gov (United States)

    Hara, Noriko

    2002-01-01

    Introduces the use of Formal Concept Analysis (FCA) as a methodology for visualizing data in computer-mediated communication. Bases FCA on mathematical lattice theory, offers visual maps (graphs) of conceptual hierarchies, and proposes the use of FCA combined with content analysis to analyze computer-mediated communication. (Author/LRW)
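
    To make the FCA machinery concrete, here is a minimal sketch (not from the article; the toy "context" relating messages to discussion features is invented) that enumerates all formal concepts of a small binary context by closing object subsets:

      from itertools import combinations

      # Toy formal context: each object (message) maps to the attributes it has.
      context = {
          "msg1": {"question", "technical"},
          "msg2": {"question", "social"},
          "msg3": {"technical", "answer"},
      }

      def intent(objects):
          """Attributes shared by every object in the set."""
          sets = [context[o] for o in objects]
          return set.intersection(*sets) if sets else {a for s in context.values() for a in s}

      def extent(attrs):
          """Objects possessing every attribute in the set."""
          return {o for o, s in context.items() if attrs <= s}

      # Every formal concept arises as (extent(intent(A)), intent(A)) for some object set A.
      concepts = set()
      for r in range(len(context) + 1):
          for combo in combinations(context, r):
              i = frozenset(intent(combo))
              concepts.add((frozenset(extent(i)), i))

      for e, i in sorted(concepts, key=lambda c: len(c[0])):
          print(sorted(e), "<->", sorted(i))

    Ordering these concepts by extent inclusion yields the lattice that FCA visualizes as a conceptual hierarchy.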

  19. Computational analysis of Ebolavirus data: prospects, promises and challenges.

    Science.gov (United States)

    Michaelis, Martin; Rossman, Jeremy S; Wass, Mark N

    2016-08-15

    The ongoing Ebola virus (also known as Zaire ebolavirus, a member of the genus Ebolavirus) outbreak in West Africa has so far resulted in >28,000 confirmed cases, compared with previous Ebolavirus outbreaks that affected a maximum of a few hundred individuals. Hence, Ebolaviruses pose a much greater threat than we may have expected (or hoped). An improved understanding of the virus biology is essential to develop therapeutic and preventive measures and to be better prepared for future outbreaks by members of the genus Ebolavirus. Computational investigations can complement wet-laboratory research on biosafety level 4 pathogens such as Ebolaviruses, for which wet experimental capacity is limited by the small number of appropriate containment laboratories. During the current West Africa outbreak, sequence data from many Ebola virus genomes became available, providing a rich resource for computational analysis. Here, we consider the studies that have already reported on the computational analysis of these data. A range of properties have been investigated, including Ebolavirus evolution and pathogenicity, prediction of microRNAs, and identification of Ebolavirus-specific signatures. However, the accuracy of the results remains to be confirmed by wet-laboratory experiments. Therefore, communication and exchange between computational and wet-laboratory researchers is necessary to make maximum use of computational analyses and to iteratively improve these approaches. PMID:27528741

  20. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Science.gov (United States)

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
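
    The percentage comparisons quoted above are straightforward to reproduce; the sketch below uses invented run times and costs (the paper's raw measurements are not reproduced here) to show how such relative differences are computed:

      # Hypothetical wall-clock times (hours) and costs (USD) for one E.coli assembly.
      emr_time, gce_time = 8.5, 4.0
      emr_cost, gce_cost = 12.0, 3.4

      # Relative wall-clock difference, expressed against the slower service (EMR).
      time_diff_pct = (emr_time - gce_time) / emr_time * 100

      # Cost premium of EMR over GCE, expressed against the cheaper service (GCE).
      cost_premium_pct = (emr_cost - gce_cost) / gce_cost * 100

      print(f"GCE faster by {time_diff_pct:.1f}%; EMR {cost_premium_pct:.1f}% more expensive")

    Note that the choice of denominator matters: the abstract's time differences are relative percentages, while its cost figures are premiums over the cheaper provider.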

  3. Computer Analysis Of ILO Standard Chest Radiographs Of Pneumoconiosis

    Science.gov (United States)

    Li, C. C.; Shu, David B. C.; Tai, H. T.; Hou, W.; Kunkle, G. A.; Wang, Y.; Hoy, R. J.

    1982-11-01

    This paper presents a study of computer analysis of the 1980 ILO standard chest radiographs of pneumoconiosis. Algorithms developed for the detection of individual small rounded and irregular opacities were tested and evaluated on these standard radiographs. The density, shape, and size distribution of the detected objects in the lung field, despite some false positives, can be used as indicators of the onset of pneumoconiosis. This approach is potentially useful in computer-assisted screening and early detection, where each worker's annual chest radiograph is compared with his or her own earlier normal radiograph.
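
    As an illustration of this kind of opacity detection (a hedged sketch on a synthetic image, not the authors' algorithm), the code below thresholds a lung-field patch and collects candidate objects with their sizes via connected-component labeling:

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(0)
      img = rng.normal(0.2, 0.05, size=(128, 128))   # stand-in for a digitized radiograph patch
      img[30:34, 40:44] += 0.5                       # synthetic small rounded opacity

      mask = img > 0.4                               # density threshold
      labels, n = ndimage.label(mask)                # connected components = candidate opacities
      sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
      print(n, "candidate opacities, pixel sizes:", sizes)

    Shape and size statistics of such candidates are what the paper proposes as indicators of early pneumoconiosis.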

  4. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  5. CAR: A MATLAB Package to Compute Correspondence Analysis with Rotations

    Directory of Open Access Journals (Sweden)

    Urbano Lorenzo-Seva

    2009-09-01

    Correspondence analysis (CA) is a popular method that can be used to analyse relationships between categorical variables. Like principal component analysis, CA solutions can be rotated both orthogonally and obliquely to simple structure without affecting the total amount of explained inertia. We describe a MATLAB package for computing CA. The package includes orthogonal and oblique rotation of axes. It is designed not only for advanced users of MATLAB but also for beginners. Analysis can be done using a user-friendly interface, or by using command lines. We illustrate the use of CAR with one example.
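
    The core CA computation that a package like CAR performs (before any rotation) is a singular value decomposition of the standardized residuals; a minimal numpy sketch with an invented contingency table follows:

      import numpy as np

      N = np.array([[20.0, 10.0,  5.0],    # toy contingency table (assumed data)
                    [ 5.0, 15.0, 10.0],
                    [10.0,  5.0, 20.0]])
      P = N / N.sum()                      # correspondence matrix
      r, c = P.sum(axis=1), P.sum(axis=0)  # row and column masses
      S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
      U, sv, Vt = np.linalg.svd(S, full_matrices=False)
      row_coords = (U * sv) / np.sqrt(r)[:, None]         # principal row coordinates
      print("total inertia:", (sv ** 2).sum())
      print(row_coords[:, :2])                            # first two dimensions

    Rotation, CAR's distinctive feature, then applies an orthogonal or oblique transformation to these axes without changing the total inertia.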

  6. Design and Implementation of Discrete Mathematics CAI Courseware

    Institute of Scientific and Technical Information of China (English)

    闫浮; 岳利明

    2001-01-01

    Discrete mathematics is a foundational course for computer science majors, yet it is abstract and difficult to understand. The discrete mathematics CAI courseware was developed to improve comprehension of this course. This article discusses the self-adapting behavior of the teaching software, starting from the hierarchical structure of its courseware components.

  7. The Clinical Experiences of Dr.CAI Gan in Treating Chronic Constipation

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zheng-li; ZHU Mei-ping; LIU Qun; LEI Yun-xia

    2009-01-01

    Prof. CAI Gan (蔡淦) is an academic leader in the TCM treatment of spleen and stomach diseases. He holds that liver depression, spleen deficiency, and poor nourishment of the intestines form the core pathogenesis of chronic constipation. He therefore often treats the disease by strengthening the spleen, relieving the depressed liver, nourishing yin, and moistening the intestines, while attaching great importance to syndrome differentiation and comprehensive regulation and treatment. As a result, good therapeutic effects are often achieved. The authors summarize his approach to treating chronic constipation in the following 10 methods, which are introduced below.

  9. HIV-1 Capsid Assembly Inhibitor (CAI) Peptide: Structural Preferences and Delivery into Human Embryonic Lung Cells and Lymphocytes

    Science.gov (United States)

    Braun, Klaus; Frank, Martin; Pipkorn, Rüdiger; Reed, Jennifer; Spring, Herbert; Debus, Jürgen; Didinger, Bernd; von der Lieth, Claus-Wilhelm; Wiessler, Manfred; Waldeck, Waldemar

    2008-01-01

    The Human immunodeficiency virus 1 derived capsid assembly inhibitor peptide (HIV-1 CAI-peptide) is a promising lead candidate for anti-HIV drug development. Its drawback, however, is that it cannot permeate cells directly. Here we report the transport of the pharmacologically active CAI-peptide into human lymphocytes and Human Embryonic Lung cells (HEL) using the BioShuttle platform. Generally, the transfer of pharmacologically active substances across membranes, demonstrated by confocal laser scanning microscopy (CLSM), could lead to a loss of function by changing the molecule's structure. Molecular dynamics (MD) simulations and circular dichroism (CD) studies suggest that the CAI-peptide has an intrinsic capacity to form a helical structure, which seems to be critical for the pharmacological effect as revealed by intensive docking calculations and comparison with control peptides. This coupling of the CAI-peptide to a BioShuttle-molecule additionally improved its solubility. Under the conditions described, the HIV-1 CAI peptide was transported into living cells and could be localized in the vicinity of the mitochondria. PMID:18695744

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  12. COMPUTING

    CERN Document Server

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  13. Practical Use of Computationally Frugal Model Analysis Methods.

    Science.gov (United States)

    Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2016-03-01

    Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics makes it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333
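
    A one-at-a-time scaled sensitivity computation is a typical member of the "frugal" class the authors advocate, needing just one model run per parameter beyond the baseline. A minimal sketch with an invented stand-in model:

      import numpy as np

      def model(p):
          k, s = p
          return k * np.exp(-s)    # stand-in for an expensive environmental model

      base = np.array([2.0, 0.5])  # nominal parameter values (assumed)
      y0 = model(base)
      for i, name in enumerate(["k", "s"]):
          p = base.copy()
          p[i] *= 1.01             # 1% one-at-a-time perturbation
          scaled_sens = ((model(p) - y0) / y0) / 0.01   # approximately d(ln y)/d(ln p)
          print(f"{name}: {scaled_sens:+.3f}")

    For this stand-in model the exact values are +1.0 for k and -0.5 for s, so the finite-difference estimates can be checked directly.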

  14. Emerging Trends and Statistical Analysis in Computational Modeling in Agriculture

    Directory of Open Access Journals (Sweden)

    Sunil Kumar

    2015-03-01

    In this paper the authors describe emerging trends in computational modelling used in the sphere of agriculture. Agricultural computational modelling, which uses intelligence techniques to compute agricultural output from minimal input data, is gaining momentum because it lessens the time spent on multi-locational field trials as well as labour and other inputs. Development of locally suitable integrated farming systems (IFS) is the utmost need of the day, particularly in India, where about 95% of farms are small and marginal holdings. Optimizing the size and number of the various enterprises in an IFS model for a particular agro-climate is an essential component of research to sustain agricultural productivity, not only to feed the country's burgeoning population but also to enhance nutritional security and farm returns for a quality life. The literature on emerging trends in computational modelling applied to agriculture is reviewed below in order to understand these trends, their mechanisms and behavior, and their applications. Computational modelling is increasingly effective for designing and analysing systems, and it is an important tool for analysing the effects of different climate and management scenarios on farming systems and their interactions. The authors also highlight applications of computational modelling in integrated farming systems, crops, weather, soil, climate, horticulture, and statistical methods used in agriculture, which can show the path for agricultural researchers and the rural farming community to replace some traditional techniques.

  15. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  16. Assessing computer waste generation in Chile using material flow analysis.

    Science.gov (United States)

    Steubing, Bernhard; Böni, Heinz; Schluep, Mathias; Silva, Uca; Ludwig, Christian

    2010-03-01

    The quantities of e-waste are expected to increase sharply in Chile. The purpose of this paper is to provide a quantitative data basis on generated e-waste quantities. A material flow analysis was carried out assessing the generation of e-waste from computer equipment (desktop and laptop PCs as well as CRT and LCD-monitors). Import and sales data were collected from the Chilean Customs database as well as from publications by the International Data Corporation. A survey was conducted to determine consumers' choices with respect to storage, re-use and disposal of computer equipment. The generation of e-waste was assessed in a baseline as well as upper and lower scenarios until 2020. The results for the baseline scenario show that about 10,000 and 20,000 tons of computer waste may be generated in the years 2010 and 2020, respectively. The cumulative e-waste generation will be four to five times higher in the upcoming decade (2010-2019) than during the current decade (2000-2009). By 2020, the shares of LCD-monitors and laptops will increase more rapidly replacing other e-waste including the CRT-monitors. The model also shows the principal flows of computer equipment from production and sale to recycling and disposal. The re-use of computer equipment plays an important role in Chile. An appropriate recycling scheme will have to be introduced to provide adequate solutions for the growing rate of e-waste generation.
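
    The core of such a material flow model is a convolution of past sales with a service-lifetime distribution; a minimal sketch with invented sales figures and an assumed lifetime distribution:

      import numpy as np

      sales = np.array([50, 60, 70, 85, 100, 120], dtype=float)  # kilotons sold in years 0..5 (assumed)
      lifetime_pmf = np.array([0.0, 0.1, 0.2, 0.4, 0.2, 0.1])    # P(discarded n years after sale) (assumed)

      # waste(t) = sum over k of sales(t - k) * P(lifetime = k)
      waste = np.convolve(sales, lifetime_pmf)[:len(sales)]
      for year, w in enumerate(waste):
          print(f"year {year}: {w:.1f} kt of e-waste generated")

    Storage and re-use, which the survey quantifies, effectively lengthen the lifetime distribution and so delay the waste peak.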

  17. Artificial Intelligence and Computer Assisted Instruction. CITE Report No. 4.

    Science.gov (United States)

    Elsom-Cook, Mark

    The purpose of the paper is to outline some of the major ways in which artificial intelligence research and techniques can affect usage of computers in an educational environment. The role of artificial intelligence is defined, and the difference between Computer Aided Instruction (CAI) and Intelligent Computer Aided Instruction (ICAI) is…

  18. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  19. Concise Remarks on Several Influences of Schiller on Cai Yuanpei's Aesthetic Thought

    Institute of Scientific and Technical Information of China (English)

    李慧国; 隆占玺

    2012-01-01

    In the early twentieth century, Cai Yuanpei held an important position in the history of Chinese education. He proposed "Substituting Aesthetic Education for Religion", a slogan that embodied his main aesthetic thought. In this paper, the author conducts an in-depth analysis of his aesthetic thought and examines the influence of the German aesthetician Schiller upon it, which helps us gain a deeper understanding of the connotations of Cai Yuanpei's aesthetics.

  20. Computational methods for efficient structural reliability and reliability sensitivity analysis

    Science.gov (United States)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
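
    The sketch below shows plain (non-adaptive) importance sampling, the building block that AIS refines, on an invented two-variable limit state g(x) = 3 - x1 - x2 with standard normal inputs:

      import numpy as np

      rng = np.random.default_rng(1)

      def g(x):                      # failure when g(x) < 0
          return 3.0 - x.sum(axis=-1)

      n = 100_000
      x = rng.normal(size=(n, 2))
      p_mc = np.mean(g(x) < 0)       # crude Monte Carlo estimate

      shift = np.array([1.5, 1.5])   # sampling density centered near the failure region (assumed design point)
      y = rng.normal(size=(n, 2)) + shift
      # Likelihood-ratio weights: standard normal pdf over shifted proposal pdf.
      w = np.exp(-0.5 * (y ** 2).sum(axis=1) + 0.5 * ((y - shift) ** 2).sum(axis=1))
      p_is = np.mean((g(y) < 0) * w)
      print(p_mc, p_is)              # both approach 1 - Phi(3 / sqrt(2)) ~ 0.017

    AIS goes further by updating the sampling density from the failure points found so far, and the same weighted samples yield the sensitivity coefficients the paper proposes.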

  1. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  2. Computers in activation analysis and gamma-ray spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Carpenter, B. S.; D'Agostino, M. D.; Yule, H. P. [eds.]

    1979-01-01

    Seventy-three papers are included under the following session headings: analytical and mathematical methods for data analysis; software systems for gamma-ray and x-ray spectrometry; gamma-ray spectra treatment, peak evaluation; least squares; IAEA intercomparison of methods for processing spectra; computer and calculator utilization in spectrometer systems; and applications in safeguards, fuel scanning, and environmental monitoring. Separate abstracts were prepared for 72 of those papers. (DLC)

  3. Ontology-based metrics computation for business process analysis

    OpenAIRE

    Carlos Pedrinaci; John Domingue

    2009-01-01

    Business Process Management (BPM) aims to support the whole life-cycle necessary to deploy and maintain business processes in organisations. Crucial within the BPM lifecycle is the analysis of deployed processes. Analysing business processes requires computing metrics that can help determining the health of business activities and thus the whole enterprise. However, the degree of automation currently achieved cannot support the level of reactivity and adaptation demanded by businesses. In thi...

  4. Computational Methods for Failure Analysis and Life Prediction

    Science.gov (United States)

    Noor, Ahmed K. (Compiler); Harris, Charles E. (Compiler); Housner, Jerrold M. (Compiler); Hopkins, Dale A. (Compiler)

    1993-01-01

    This conference publication contains the presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Failure Analysis and Life Prediction held at NASA Langley Research Center 14-15 Oct. 1992. The presentations focused on damage failure and life predictions of polymer-matrix composite structures. They covered some of the research activities at NASA Langley, NASA Lewis, Southwest Research Institute, industry, and universities. Both airframes and propulsion systems were considered.

  5. Analysis of diabetic retinopathy biomarker VEGF gene by computational approaches

    OpenAIRE

    Jayashree Sadasivam; Ramesh, N; Vijayalakshmi, K.; Vinni Viridi; Shiva prasad

    2012-01-01

    Diabetic retinopathy, the most common diabetic eye disease, is caused by changes in the blood vessels of the retina and remains a major cause of blindness. It is characterized by vascular permeability and increased tissue ischemia and angiogenesis. One of the biomarkers for diabetic retinopathy has been identified as the Vascular Endothelial Growth Factor (VEGF) gene by computational analysis. VEGF is a sub-family of growth factors within the platelet-derived growth factor family of cystine-knot growth factors...

  6. CAVASS: A Computer-Assisted Visualization and Analysis Software System

    OpenAIRE

    Grevera, George; Udupa, Jayaram; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Iwanaga, Tad; Mishra, Shipra

    2007-01-01

    The Medical Image Processing Group at the University of Pennsylvania has been developing (and distributing with source code) medical image analysis and visualization software systems for a long period of time. Our most recent system, 3DVIEWNIX, was first released in 1993. Since that time, a number of significant advancements have taken place with regard to computer platforms and operating systems, networking capability, the rise of parallel processing standards, and the development of open-so...

  7. Vector Field Visual Data Analysis Technologies for Petascale Computational Science

    Energy Technology Data Exchange (ETDEWEB)

    Garth, Christoph; Deines, Eduard; Joy, Kenneth I.; Bethel, E. Wes; Childs, Hank; Weber, Gunther; Ahern, Sean; Pugmire, Dave; Sanderson, Allen; Johnson, Chris

    2009-11-13

    State-of-the-art computational science simulations generate large-scale vector field data sets. Visualization and analysis is a key aspect of obtaining insight into these data sets and represents an important challenge. This article discusses possibilities and challenges of modern vector field visualization and focuses on methods and techniques developed in the SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) and deployed in the open-source visualization tool, VisIt.

  8. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    Science.gov (United States)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought-alloy test articles with programmed flaws were inspected using a 2 MeV linear-accelerator-based CT system. The performance of CT inspection on identically configured wrought and AM components with programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on the inspectability of objects with complex geometries.

  9. Variance analysis. Part II, The use of computers.

    Science.gov (United States)

    Finkler, S A

    1991-09-01

    This is the second in a two-part series on variance analysis. In the first article (JONA, July/August 1991), the author discussed flexible budgeting, including the calculation of price, quantity, volume, and acuity variances. In this second article, the author focuses on the use of computers by nurse managers to aid in the process of calculating, understanding, and justifying variances. PMID:1919788
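
    The flexible-budget decomposition the series builds on splits a total spending variance into price and quantity components; a minimal sketch with invented nursing-hours figures:

      def variances(actual_qty, actual_price, budget_qty, budget_price):
          """Textbook flexible-budget decomposition: price + quantity = total."""
          price_var = (actual_price - budget_price) * actual_qty
          qty_var = (actual_qty - budget_qty) * budget_price
          total_var = actual_qty * actual_price - budget_qty * budget_price
          return price_var, qty_var, total_var

      pv, qv, tv = variances(actual_qty=1150, actual_price=21.0,
                             budget_qty=1100, budget_price=20.0)
      print(pv, qv, tv)   # 1150.0 1000.0 2150.0, and pv + qv == tv

    A spreadsheet or small program of this kind is exactly the computer support for calculating and justifying variances that the article discusses.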

  10. Caiçaras, caboclos and natural resources: rules and scale patterns

    Directory of Open Access Journals (Sweden)

    Alpina Begossi

    1999-12-01

    One important question concerning the sustainability of local or native populations refers to their interaction with local and global institutions. We should expect that populations with the capacity to interact economically and politically with institutions show a better chance of ecological and cultural continuity, as well as of maintaining their systems of trade and subsistence. The level of ecological and social interaction of local populations, following concepts from ecology, occurs on different scales: for example, from the territories of individual fishermen on the Atlantic Forest coast to organizations of community Extractive Reserves in the Amazon. The scale of organization (individual/family/community) may influence the capacity to deal with institutions. This study analyses how Brazilian native populations, especially caiçaras of the Atlantic Forest coast and caboclos of the Amazon, have interacted with regional, national and global institutions concerning environmental demands. Concepts such as common management, natural capital, resilience and sustainability are useful for understanding these illustrative cases.

  11. [Demotion and promotion of CAI Jing and the Medicine School's establishment and abolition three times in the North Song dynasty].

    Science.gov (United States)

    Li, Yu-Qing

    2011-03-01

    CAI Jing was appointed prime minister in the Chongning period of Song Hui-tsung. The Medical School was moved from Taichang Temple to the Imperial College. It used a 3-year education system and divided graduates into three grades. Preferential policies promised top students official positions of the 8th or 9th rank, which attracted many intellectuals into the field of medicine. CAI Jing was demoted three times, in the fifth year of the Chongning period, the third year of the Daguan period, and the second year of the Zhenghe period, and was promoted again after each demotion. Influenced by the changes in CAI Jing's position and the related policies, the Medical School was likewise established and abolished three times. PMID:21624269

  12. The braça of the net: a caiçara measuring technique

    OpenAIRE

    Gilberto Chieus Jr.

    2009-01-01

    This article describes how the caiçaras of Ubatuba, a city on the northern coast of São Paulo state, measure their fishing nets. Before analyzing their measuring technique, we give a brief overview of caiçara culture and its transformations. We then present some historical moments in the construction of the metre. Finally, we describe how the caiçaras measure their nets, the problems that arose in Brazil with the introduction of the decimal metric system, and the resistance of certain civilizations that make use of other...

  13. Spacelab data analysis using the space plasma computer analysis network (SCAN) system

    Science.gov (United States)

    Green, J. L.

    1984-01-01

    The Space-plasma Computer Analysis Network (SCAN) currently connects a large number of U.S. Spacelab investigators into a common computer network. Used primarily by plasma physics researchers at present, SCAN provides access to Spacelab investigators in other areas of space science, to Spacelab and non-Spacelab correlative data bases, and to large Class VI computational facilities for modeling. SCAN links computers together at remote institutions used by space researchers, utilizing commercially available software for computer-to-computer communications. Started by NASA's Office of Space Science in mid-1980, SCAN presently contains ten system nodes located at major universities and space research laboratories, with fourteen new nodes projected for the near future. The Stanford University computer gateways allow SCAN users to connect onto the ARPANET and TELENET overseas networks.

  14. Trend Analysis of the Brazilian Scientific Production in Computer Science

    Directory of Open Access Journals (Sweden)

    TRUCOLO, C. C.

    2014-12-01

    The growth in the volume and diversity of scientific information brings new challenges to understanding the reasons, the process, and the real essence that propel this growth. This information can be used as the basis for developing strategies and public policies to improve education and innovation services. Trend analysis is one of the steps along this way. In this work, a trend analysis of the Brazilian scientific production of graduate programs in the computer science area is performed to identify the main subjects studied by these programs, both collectively and individually.

  15. Analysis and computation of microstructure in finite plasticity

    CERN Document Server

    Hackl, Klaus

    2015-01-01

    This book addresses the need for a fundamental understanding of the physical origin, the mathematical behavior, and the numerical treatment of models which include microstructure. Leading scientists present their efforts involving mathematical analysis, numerical analysis, computational mechanics, material modelling and experiment. The mathematical analyses are based on methods from the calculus of variations, while in the numerical implementation global optimization algorithms play a central role. The modeling covers all length scales, from the atomic structure up to macroscopic samples. The development of the models was guided by experiments on single crystals and polycrystals, and results are checked against experimental data.

  16. Shielding analysis methods available in the scale computational system

    International Nuclear Information System (INIS)

    Computational tools have been included in the SCALE system to allow shielding analysis to be performed using both discrete-ordinates and Monte Carlo techniques. One-dimensional discrete ordinates analyses are performed with the XSDRNPM-S module, and point dose rates outside the shield are calculated with the XSDOSE module. Multidimensional analyses are performed with the MORSE-SGC/S Monte Carlo module. This paper will review the above modules and the four Shielding Analysis Sequences (SAS) developed for the SCALE system. 7 refs., 8 figs

  17. Ca-Fe and Alkali-Halide Alteration of an Allende Type B CAI: Aqueous Alteration in Nebular or Asteroidal Settings

    Science.gov (United States)

    Ross, D. K.; Simon, J. I.; Simon, S. B.; Grossman, L.

    2012-01-01

    Ca-Fe and alkali-halide alteration of CAIs is often attributed to aqueous alteration by fluids circulating on asteroidal parent bodies after the various chondritic components have been assembled, although debate continues about the roles of asteroidal vs. nebular modification processes [1-7]. Here we report detailed observations of alteration products in a large Type B2 CAI, TS4 from Allende, one of the oxidized subgroup of CV3s, and propose a speculative model for aqueous alteration of CAIs in a nebular setting. Ca-Fe alteration in this CAI consists predominantly of end-member hedenbergite, end-member andradite, and compositionally variable, magnesian high-Ca pyroxene. These phases are strongly concentrated in an unusual "nodule" enclosed within the interior of the CAI (Fig. 1). The Ca, Fe-rich nodule superficially resembles a clast that pre-dated and was engulfed by the CAI, but closer inspection shows that relic spinel grains are enclosed in the nodule, and corroded CAI primary phases interfinger with the Fe-rich phases at the nodule's margins. This CAI also contains abundant sodalite and nepheline (alkali-halide) alteration that occurs around the rims of the CAI, but also penetrates more deeply into it. The two types of alteration (Ca-Fe and alkali-halide) are adjacent, and very fine-grained Fe-rich phases are associated with sodalite-rich regions. Both types of alteration appear to be replacive; if that is true, it would require substantial introduction of Fe, transport of elements (Ti, Al and Mg) out of the nodule, and introduction of Na and Cl into the alkali-halide-rich zones. Parts of the CAI have been extensively metasomatized.

  18. Computer vision analysis of image motion by variational methods

    CERN Document Server

    Mitiche, Amar

    2014-01-01

    This book presents a unified view of image motion analysis under the variational framework. Variational methods, rooted in physics and mechanics, but appearing in many other domains, such as statistics, control, and computer vision, address a problem from an optimization standpoint, i.e., they formulate it as the optimization of an objective function or functional. The methods of image motion analysis described in this book use the calculus of variations to minimize (or maximize) an objective functional which transcribes all of the constraints that characterize the desired motion variables. The book addresses the four core subjects of motion analysis: motion estimation, detection, tracking, and three-dimensional interpretation. Each topic is covered in a dedicated chapter. The presentation is prefaced by an introductory chapter which discusses the purpose of motion analysis. Further, a chapter is included which gives the basic tools and formulae related to curvature, Euler-Lagrange equations, unconstrained de...
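
    For reference, the Euler-Lagrange condition invoked throughout such variational formulations takes, in its simplest one-dimensional form, the following shape: a minimizer u of the functional E must satisfy

      E(u) = \int_a^b L\big(x, u(x), u'(x)\big)\, dx,
      \qquad
      \frac{\partial L}{\partial u} - \frac{d}{dx}\,\frac{\partial L}{\partial u'} = 0 .

    The motion functionals in the book are multi-dimensional analogues of E, with the constraints on the motion variables encoded in L.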

  19. Calcium and Titanium Isotope Fractionation in CAIs: Tracers of Condensation and Inheritance in the Early Solar Protoplanetary Disk

    Science.gov (United States)

    Simon, J. I.; Jordan, M. K.; Tappa, M. J.; Kohl, I. E.; Young, E. D.

    2016-01-01

    The chemical and isotopic compositions of calcium-aluminum-rich inclusions (CAIs) can be used to understand the conditions present in the protoplanetary disk where they formed. The isotopic compositions of these early-formed nebular materials are largely controlled by chemical volatility. The isotopic effects of evaporation/sublimation, which are well explained by both theory and experimental work, lead to enrichments in the heavy isotopes that are often exhibited by the moderately refractory elements Mg and Si. Less well understood are the isotopic effects of condensation, which limits our ability to determine whether a CAI is a primary condensate and/or retains any evidence of its primordial formation history.

  20. A Lumped Computational Model for Sodium Sulfur Battery Analysis

    Science.gov (United States)

    Wu, Fan

    Due to the cost of materials and time-consuming testing procedures, the development of new batteries is a slow and expensive practice. The purpose of this study is to develop a computational model and assess the capabilities of such a model designed to aid in the design process and control of sodium-sulfur batteries. To this end, a transient lumped computational model derived from an integral analysis of the transport of species, energy, and charge throughout the battery has been developed. The computational processes are coupled with the use of Faraday's law, and solutions for the species concentrations, electrical potential, and current are produced in a time-marching fashion. Properties required for solving the governing equations are calculated and updated as a function of time based on the composition of each control volume. The proposed model is validated against multi-dimensional simulations and experimental results from the literature, and simulation results using the proposed model are presented and analyzed. The computational and electrochemical models used to solve the equations of the lumped model are compared with similar ones found in the literature. The results obtained from the current model compare favorably with those from experiments and other models.
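
    A time-marching step of the kind described couples Faraday's law to the species balance; a minimal sketch in which the current, electron number, and initial sulfur inventory are all assumed values, not the paper's parameters:

      # Explicit-Euler discharge of the active species under constant current.
      F = 96485.0        # Faraday constant, C/mol
      I = 10.0           # applied current, A (assumed)
      n_e = 2            # electrons transferred per mole reacted (assumed)
      S = 5.0            # initial moles of active sulfur (assumed)

      dt, t = 1.0, 0.0
      while S > 0.0 and t < 3600.0:
          S -= I / (n_e * F) * dt   # Faraday's law: moles converted this step
          t += dt
      print(f"sulfur remaining after {t:.0f} s: {S:.4f} mol")

    The full lumped model advances concentrations, potential, and current together in the same fashion, updating properties from each control volume's composition at every step.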

  1. Developing Multimedia CAI Courseware in Visual Basic: Image Processing

    Institute of Scientific and Technical Information of China (English)

    曲双为

    2001-01-01

    This paper introduces techniques for handling images in multimedia computer-aided instruction (CAI) courseware. From a programming perspective, it covers some of the image-processing techniques used in developing multimedia CAI courseware on the Microsoft Visual Basic 6.0 platform, a programming tool popular among many programmers at present.

  2. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  3. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  4. The computer aided education and training system for accident management

    International Nuclear Information System (INIS)

    The education and training system for accident management was developed by the Japanese BWR group and Hitachi Ltd. It is composed of two systems: a computer-aided instruction (CAI) education system and an education and training system with computer simulations. Both systems are designed to be executed on personal computers. The outlines of the CAI education system and of the education and training system with the simulator are reported below. These systems provide plant operators and technical support center staff with effective education and training for accident management. (author)

  5. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource for developmental biology. Clonal analysis and fate mappings are popular experiments for studying tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and their spatial evolution to be followed over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data, we are able to choose and characterize the tissue movement map that best matches the experimental data. Our computational analysis produces for the first time a two-dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulation taking tissue movement into account and to investigate PD patterning hypotheses.

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  7. Prof. CAI Ronggen and Cooperators Awarded for studies on dynamical and thermodynamical properties and the intrinsic relation between them in gravity

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Profs. CAI Ronggen (Cai, Rong-Gen), WANG Bin (Wang, Bin) and ZHANG Yuanzhong (Zhang, Yuan-Zhong) of the CAS Institute of Theoretical Physics and their collaborators were awarded a second prize of the State Natural Science Award for their systematic research on the dynamical and thermodynamical properties, and the relations between them, in gravity.

  8. Computer code for general analysis of radon risks (GARR)

    International Nuclear Information System (INIS)

    This document presents a computer model for the general analysis of radon risks that allows the user to specify a large number of possible models with a small number of simple commands. The model is written in a version of BASIC which conforms closely to the American National Standards Institute (ANSI) definition of minimal BASIC and thus is readily modified for use on a wide variety of computers and, in particular, microcomputers. Model capabilities include generation of single-year life tables from 5-year abridged data; calculation of multiple-decrement life tables for lung cancer for the general population, smokers, and nonsmokers; and a cohort lung cancer risk calculation that allows specification of the level and duration of radon exposure, the form of the risk model, and the specific population assumed at risk. 36 references, 8 figures, 7 tables

  9. Computer-aided Symbolic Analysis for the Active Network

    Institute of Scientific and Technical Information of China (English)

    徐望人; 徐静波

    2004-01-01

    The totally coded method (TCM) obeys the same law governing the gain calculation for a signal flow graph as Mason's formula does, but the algorithm is carried out entirely in the domain of code operations. Being a pure code algorithm, it is more efficient because no graph searching is required. The code series (CS), which are organized from the node association table, carry complete information, so that both the content and the sign of each gain term can be determined via the coded method. The principle of this method is straightforward and well suited to computer programming, enhancing the capability of computer-aided analysis for active networks such as operational amplifier networks.
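
    The equivalence between Mason's formula and a direct solution of the node equations is easy to check on a small graph; the sketch below (an invented four-branch example, not the TCM algorithm itself) computes the same gain both ways:

      import numpy as np

      # Signal-flow graph: x2 = a*x1 + c*x3 ;  x3 = b*x2 + d*x1
      a, b, c, d = 2.0, 0.5, 0.25, 1.0

      # Mason's gain formula by hand: forward paths a*b and d, one loop b*c.
      mason_gain = (a * b + d) / (1.0 - b * c)

      # The same gain from the node equations, solved with x1 = 1 as the source.
      A = np.array([[0.0, c],
                    [b, 0.0]])        # couplings among dependent nodes (x2, x3)
      u = np.array([a, d])            # injections from the source node x1
      x = np.linalg.solve(np.eye(2) - A, u)
      print(mason_gain, x[1])         # both print the transfer gain x3/x1 = 2.2857...

    TCM reaches the same result purely by operating on the code series, with no graph search.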

  10. The Radiological Safety Analysis Computer Program (RSAC-5) user's manual

    International Nuclear Information System (INIS)

    The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods
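
    Decay and ingrowth during transport, as modeled by RSAC-5, follow the Bateman equations; a minimal sketch for a two-member chain with assumed, hypothetical parent/daughter half-lives of 8.02 and 30 days:

      import numpy as np

      lam_a = np.log(2) / 8.02    # parent decay constant, 1/day (assumed half-life)
      lam_b = np.log(2) / 30.0    # daughter decay constant, 1/day (assumed half-life)
      N0 = 1.0e12                 # initial parent atoms (assumed)

      t = np.linspace(0.0, 60.0, 7)   # days
      Na = N0 * np.exp(-lam_a * t)    # parent decay
      # Bateman solution for daughter ingrowth (valid for lam_a != lam_b).
      Nb = N0 * lam_a / (lam_b - lam_a) * (np.exp(-lam_a * t) - np.exp(-lam_b * t))
      for ti, na, nb in zip(t, Na, Nb):
          print(f"t={ti:5.1f} d   parent={na:.3e}   daughter={nb:.3e}")

    RSAC-5 applies the same mathematics chain by chain to full fission product inventories before evaluating the exposure pathways.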

  11. COMPUTER SIMULATION: COMPARATIVE ANALYSIS OF SOFTWARES ARENA® AND PROMODEL®

    Directory of Open Access Journals (Sweden)

    Luiz Enéias Zanetti Cardoso

    2016-04-01

    Full Text Available Computer simulation is not exclusive to the areas of Logistics and Production; its implementation is limited only by the technical expertise of the professionals involved. Although not yet widespread, its use is projected to grow, given the numerous application possibilities when the reality at hand is properly modeled. This article presents a comparative, qualitative analysis of two computer simulation packages, the Arena® 14.0 Student version and the ProModel® RunTime Silver Demo version, according to the following criteria: desktop, access to commands, ease of developing models in the software, and accessories. The main features of each simulation package can be observed, as well as the differences between their interfaces; both were confirmed as excellent tools to support management processes.

  12. Linking CAI abundance to polarimetric response in a population of ancient asteroids

    Science.gov (United States)

    Devogele, Maxime; Tanga, Paolo; Bendjoya, Philippe; Rivet, Jean-Pierre; Surdej, Jean; Bus, Schelte J.; Sunshine, Jessica M.; Cellino, Alberto; Campins, Humberto; Licandro, Javier; Pinilla-Alonso, Noemi; Carry, Benoit

    2016-10-01

    Polarimetry constitutes one of the fundamental tools for characterizing the surface texture and composition of airless Solar System bodies. In 2006, polarimetric observations led to the discovery of a new type of asteroid, which displays a peculiar polarimetric response. These asteroids are collectively known as "Barbarians", after (234) Barbara, the first one discovered. The most commonly accepted explanation for this peculiar polarization response is the presence of a high percentage of fluffy-type Calcium-Aluminium-rich Inclusions (CAIs), whose optical properties could produce the observed polarization. The reflectance spectra of these asteroids also exhibit an absorption feature in the near-infrared around 2.1-2.2 microns that is characteristic of this peculiar group. Based on these results, we organized a systematic polarimetric and near-infrared observational campaign of known or candidate Barbarian asteroids. These campaigns include members of the families of 1040 Klumpkea, 2085 Henan and 729 Watsonia, which are known to contain Barbarian and/or L-type asteroids also suspected to show such polarimetric behaviour. We made use of the ToPo polarimeter at the 1m telescope of the Centre pédagogique Planète et Univers (C2PU, Observatoire de la Côte d'Azur, France). The spectroscopic observations in the near-infrared were obtained with the SpeX instrument at NASA's InfraRed Telescope Facility (IRTF). By combining polarimetry and spectroscopy we find a correlation between the abundance of CAIs and the inversion angle of the phase-polarization curve of Barbarian asteroids. This is the first time that a direct link has been established between a specific polarimetric response and the surface composition of asteroids. In addition, we find considerable variety in CAI abundance from one object to the other, consistent with a wide range of possible albedos. Since these asteroids constitute a reservoir of primitive Solar System material, understanding their origin can

  13. Computational analysis of microRNA function in heart development

    Institute of Scientific and Technical Information of China (English)

    Ganqiang Liu; Min Ding; Jiajia Chen; Jin yan Huang; Haiyun Wang; Qing Jing; Bairong Shen

    2010-01-01

    Emerging evidence suggests that specific spatio-temporal microRNA (miRNA) expression is required for heart development. In recent years, hundreds of miRNAs have been discovered. In contrast, functional annotations are available only for a very small fraction of these regulatory molecules. In order to provide a global perspective for biologists who study the relationship between differentially expressed miRNAs and heart development, we employed computational analysis to uncover the specific cellular processes and biological pathways targeted by miRNAs in mouse heart development. Here, we utilized Gene Ontology (GO) categories, KEGG Pathway, and GeneGo Pathway Maps as a gene functional annotation system for miRNA target enrichment analysis. The target genes of miRNAs were found to be enriched in functional categories and pathway maps in which miRNAs could play important roles during heart development. Meanwhile, we developed miRHrt (http://sysbio.suda.edu.cn/mirhrt/), a database aiming to provide a comprehensive resource on miRNA function in regulating heart development. These computational analysis results effectively illustrated the correlation of differentially expressed miRNAs with cellular functions and heart development. We hope that the identified novel heart-development-associated pathways and the database presented here will facilitate further understanding of the roles and mechanisms of miRNAs in heart development.
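
    A minimal sketch of the kind of enrichment test that typically underlies the GO/pathway annotation step described above: a hypergeometric test asking whether predicted miRNA target genes are over-represented in a functional category. All counts are hypothetical; the paper does not specify its exact test.

```python
# Hypothetical counts for a hypergeometric enrichment test: are predicted
# miRNA targets over-represented in one GO category?
from scipy.stats import hypergeom

N = 20000   # annotated genes in the background
K = 300     # background genes belonging to the category
n = 500     # predicted miRNA target genes
k = 25      # target genes that fall in the category

# P(X >= k): chance of observing at least k category hits among n targets.
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p = {p_value:.3g}")
```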

  14. Analysis and computational dissection of molecular signature multiplicity.

    Directory of Open Access Journals (Sweden)

    Alexander Statnikov

    2010-05-01

    Full Text Available Molecular signatures are computational or mathematical models created to diagnose disease and other phenotypes and to predict clinical outcomes and response to treatment. It is widely recognized that molecular signatures constitute one of the most important translational and basic science developments enabled by recent high-throughput molecular assays. A perplexing phenomenon that characterizes high-throughput data analysis is the ubiquitous multiplicity of molecular signatures. Multiplicity is a special form of data analysis instability in which different analysis methods used on the same data, or different samples from the same population, lead to different but apparently maximally predictive signatures. This phenomenon has far-reaching implications for biological discovery and development of next generation patient diagnostics and personalized treatments. Currently the causes and interpretation of signature multiplicity are unknown, and several, often contradictory, conjectures have been made to explain it. We present a formal characterization of signature multiplicity and a new efficient algorithm that offers theoretical guarantees for extracting the set of maximally predictive and non-redundant signatures independent of distribution. The new algorithm identifies exactly the set of optimal signatures in controlled experiments and yields signatures with significantly better predictivity and reproducibility than previous algorithms in human microarray gene expression datasets. Our results shed light on the causes of signature multiplicity, provide computational tools for studying it empirically and introduce a framework for in silico bioequivalence of this important new class of diagnostic and personalized medicine modalities.

  15. Computing the surveillance error grid analysis: procedure and examples.

    Science.gov (United States)

    Kovatchev, Boris P; Wakeman, Christian A; Breton, Marc D; Kost, Gerald J; Louie, Richard F; Tran, Nam K; Klonoff, David C

    2014-07-01

    The surveillance error grid (SEG) analysis is a tool for analysis and visualization of blood glucose monitoring (BGM) errors, based on the opinions of 206 diabetes clinicians who rated 4 distinct treatment scenarios. Resulting from this large-scale inquiry is a matrix of 337,561 risk ratings, 1 for each pair of (reference, BGM) readings ranging from 20 to 580 mg/dl. The computation of the SEG is therefore complex and in need of automation. The SEG software introduced in this article automates the task of assigning a degree of risk to each data point for a set of measured and reference blood glucose values so that the data can be distributed into 8 risk zones. The software's 2 main purposes are to (1) distribute a set of BGM data into 8 risk zones ranging from none to extreme and (2) present the data in a color-coded display to promote visualization. Besides aggregating the data into 8 zones corresponding to levels of risk, the SEG computes the number and percentage of data pairs in each zone and the number/percentage of data pairs above/below the diagonal line in each zone, which are associated with BGM errors creating risks for hypo- or hyperglycemia, respectively. To illustrate the action of the SEG software, we first present computer-simulated data stratified along error levels defined by ISO 15197:2013. This allows the SEG to be linked to this established standard. Further illustration of the SEG procedure is done with a series of previously published data, which reflect the performance of BGM devices and test strips under various environmental conditions. We conclude that the SEG software is a useful addition to the SEG analysis presented in this journal, developed to assess the magnitude of clinical risk from analytically inaccurate data in a variety of high-impact situations such as intensive care and disaster settings. PMID:25562887
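
    The zone-assignment step lends itself to a short sketch. The following is not the published SEG software: `risk_matrix` stands in for the clinician-derived matrix of risk ratings (337,561 entries, consistent with a 581 x 581 grid indexed by integer mg/dl values), and the eight zone boundaries are illustrative placeholders, not the published ones.

```python
# A sketch of the SEG zone-assignment step; `risk_matrix` and ZONE_EDGES
# are placeholders, not the published ratings or boundaries.
import numpy as np

ZONE_EDGES = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])  # hypothetical

def seg_zones(reference, measured, risk_matrix):
    """Map each (reference, measured) BG pair in mg/dl to a risk zone 0..7."""
    ref_idx = np.clip(np.asarray(reference, dtype=int), 20, 580)
    meas_idx = np.clip(np.asarray(measured, dtype=int), 20, 580)
    risk = risk_matrix[ref_idx, meas_idx]      # per-pair clinician risk rating
    return np.digitize(risk, ZONE_EDGES)       # 8 zones, none .. extreme

def zone_percentages(zones):
    """Number of pairs in each of the 8 zones, as percentages."""
    counts = np.bincount(zones, minlength=8)
    return 100.0 * counts / zones.size
```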

  16. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  17. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it presents all well-known algorithms in detail, elucidating their merits and weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  18. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structure analysis of biomacromolecules. The problem of reconstructing a picture from identical samples polluted by colored noise is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.
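
    A minimal sketch of the correlation-based alignment step whose error the paper analyzes: estimating the shift between a noisy copy of a signal and a reference via FFT cross-correlation, reduced to one dimension with simulated data for brevity.

```python
# Estimate the alignment shift between a noisy signal and a reference via
# circular FFT cross-correlation (1-D stand-in for 2-D particle images).
import numpy as np

rng = np.random.default_rng(1)
reference = np.exp(-0.5 * ((np.arange(128) - 64) / 6.0) ** 2)
true_shift = 9
noisy = np.roll(reference, true_shift) + 0.3 * rng.normal(size=128)

# Cross-correlation via FFT; the argmax estimates the alignment shift.
xcorr = np.fft.ifft(np.fft.fft(noisy) * np.conj(np.fft.fft(reference))).real
estimated = np.argmax(xcorr)
print("estimated shift:", estimated if estimated < 64 else estimated - 128)
```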

  19. Meshing analysis of toroidal drive by computer algebra system

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Presents a meshing analysis based on a Computer Algebra System, which makes it easier to deduce complex formulas while visualizing the expressions of the more complicated surface equations. With this approach, the contact line, meshing borderlines and undercut borderlines of the toroidal drive are deduced, and the results obtained are consistent with those discussed in literature [1]. It is concluded that the absolute value of the induced normal curvature is usually small (less than 0.12, for example); it increases as parameters ψ2, V and R increase, decreases as parameter r increases, hardly varies with W2, and its variation with a and i21 is not definite.

  20. Equivalent Bar Conceptions for Computer Analysis of Pantographic Foldable Structures

    Institute of Scientific and Technical Information of China (English)

    陈务军; 付功义; 何艳丽; 董石麟

    2003-01-01

    An equivalent bar conception is first developed for the computer analysis of pantographic foldable structures. The uniplet of two three-node beam elements is treated as a six-bar assembly by means of the least-norm least-squares solution of the elastic strain energy equality. The equilibrium equation is developed for the equivalent models, and the internal forces are subsequently formulated for back-calculation. The procedure has proved practical in engineering applications, and some interesting concepts are proposed. Finally, three numerical tests are presented.

  1. Computational issue in the analysis of adaptive control systems

    Science.gov (United States)

    Kosut, Robert L.

    1989-01-01

    Adaptive systems under slow parameter adaptation can be analyzed by the method of averaging. This provides a means to assess stability (and instability) properties of most adaptive systems, either continuous-time or (more importantly for practice) discrete-time, as well as providing an estimate of the region of attraction. Although the method of averaging is conceptually straightforward, even simple examples are well beyond hand calculations. Specific software tools are proposed which can provide the basis for a user-friendly environment to perform the necessary computations involved in the averaging analysis.

  2. Analysis of Network Performance for Computer Communication Systems with Benchmark

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper introduces a performance-evaluation approach for computer communication systems based on simulation and measurement technology, and discusses its evaluation models. The results of our experiments show that the outcome of practical measurements on an Ethernet LAN fits well with the theoretical analysis. The approach presented can be used to conveniently define various kinds of artificially simulated load models, to build many kinds of network application environments in a flexible way, and to exploit fully both the generality and high precision of traditional simulation technology and the realism, reliability, and adaptability of measurement technology.

  3. Computer analysis of general linear networks using digraphs.

    Science.gov (United States)

    Mcclenahan, J. O.; Chan, S.-P.

    1972-01-01

    Investigation of the application of digraphs in analyzing general electronic networks, and development of a computer program based on a particular digraph method developed by Chen. The Chen digraph method is a topological method for solution of networks and serves as a shortcut when hand calculations are required. The advantage offered by this method of analysis is that the results are in symbolic form. It is limited, however, by the size of network that may be handled. Usually hand calculations become too tedious for networks larger than about five nodes, depending on how many elements the network contains. Direct determinant expansion for a five-node network is a very tedious process also.
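
    Since the value of the digraph approach is that results come out in symbolic form, a small illustration may help. The sketch below is not Chen's digraph algorithm; it merely shows the same end product, a symbolic transfer function, obtained by solving the node equations of a hypothetical two-node network with SymPy.

```python
# Symbolic network analysis in spirit only: solve Kirchhoff current-law
# node equations of a small ladder network so the result stays symbolic.
import sympy as sp

V1, V2, Vs = sp.symbols("V1 V2 Vs")
G1, G2, G3, s, C = sp.symbols("G1 G2 G3 s C", positive=True)

# Node equations for a hypothetical two-node RC ladder driven by source Vs.
eqs = [
    sp.Eq(G1 * (V1 - Vs) + G2 * V1 + G3 * (V1 - V2), 0),
    sp.Eq(G3 * (V2 - V1) + s * C * V2, 0),
]
sol = sp.solve(eqs, [V1, V2], dict=True)[0]
transfer = sp.simplify(sol[V2] / Vs)   # symbolic transfer function V2/Vs
print(transfer)
```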

  4. Programming Probabilistic Structural Analysis for Parallel Processing Computer

    Science.gov (United States)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.

    1991-01-01

    The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.

  5. G-computation demonstration in causal mediation analysis.

    Science.gov (United States)

    Wang, Aolin; Arah, Onyebuchi A

    2015-10-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings. PMID:26537707

  6. G-computation demonstration in causal mediation analysis

    International Nuclear Information System (INIS)

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings
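
    A minimal sketch of the parametric g-computation steps described above, under stated assumptions: a binary exposure A, a continuous mediator M and outcome Y with no confounding, linear models for both, and illustrative variable names that are not from the paper. Nested potential outcomes are simulated and contrasted to obtain natural direct and indirect effects; in practice the whole procedure would be bootstrapped for confidence intervals.

```python
# Parametric g-computation sketch for natural direct/indirect effects.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
A = rng.integers(0, 2, n)                       # exposure
M = 0.5 * A + rng.normal(size=n)                # mediator
Y = 1.0 * A + 2.0 * M + rng.normal(size=n)      # outcome

m_model = sm.OLS(M, sm.add_constant(A)).fit()
y_model = sm.OLS(Y, sm.add_constant(np.column_stack([A, M]))).fit()

def simulate_y(a_y, a_m):
    """Mean of Y(a_y, M(a_m)): outcome under exposure a_y with the mediator
    drawn from its distribution under exposure a_m (Monte Carlo)."""
    m_mean = m_model.predict(np.array([[1.0, a_m]]))
    m_draw = m_mean + rng.normal(size=n) * np.sqrt(m_model.scale)
    exog = np.column_stack([np.ones(n), np.full(n, a_y), m_draw])
    return y_model.predict(exog).mean()

nde = simulate_y(1, 0) - simulate_y(0, 0)   # natural direct effect
nie = simulate_y(1, 1) - simulate_y(1, 0)   # natural indirect effect
print(f"NDE = {nde:.2f}, NIE = {nie:.2f}")  # both expected near 1.0 here
```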

  7. From Corporate Social Responsibility, through Entrepreneurial Orientation, to Knowledge Sharing: A Study in Cai Luong (Renovated Theatre) Theatre Companies

    Science.gov (United States)

    Tuan, Luu Trong

    2015-01-01

    Purpose: This paper aims to examine the role of antecedents such as corporate social responsibility (CSR) and entrepreneurial orientation in the chain effect to knowledge sharing among members of Cai Luong theatre companies in the Vietnamese context. Knowledge sharing contributes to the depth of the knowledge pool of both the individuals and the…

  8. Hunting and use of terrestrial fauna used by Caiçaras from the Atlantic Forest coast (Brazil

    Directory of Open Access Journals (Sweden)

    Alves Rômulo RN

    2009-11-01

    Full Text Available Abstract Background The Brazilian Atlantic Forest is considered one of the hotspots for conservation, comprising remnants of rain forest along the eastern Brazilian coast. Its native inhabitants on the Southeastern coast include the Caiçaras (descendants of Amerindians and European colonizers), who hold deep knowledge of the natural resources used for their livelihood. Methods We studied the use of terrestrial fauna in three Caiçara communities, through open-ended interviews with 116 native residents. Data were checked through systematic observations and collection of zoological material. Results The Caiçaras depend on terrestrial fauna especially for food and medicine. The main species used are Didelphis spp., Dasyprocta azarae, Dasypus novemcinctus, and small birds (several species of Turdidae). In contrast with the high dependency of native Amazonians on terrestrial fauna resources, the Caiçaras do not show a constant dependency on these resources. Nevertheless, the occasional hunting of native animals represents a complementary source of animal protein. Conclusion Indigenous or local knowledge of native resources is important in order to promote local development in a sustainable way, and can help to conserve biodiversity, particularly if the resource is sporadically used and not commercially exploited.

  9. On native Danish learners' challenges in distinguishing /tai/, /cai/ and /zai/

    DEFF Research Database (Denmark)

    Sloos, Marjoleine; Zhang, Chun

    2015-01-01

    With a growing interest in learning Chinese globally, there is a growing interest among phonologists and language instructors in understanding how nonnative Chinese learners perceive the Chinese sound inventory. We experimentally investigated Danish (L1) speakers' perception of three Mandarin … results show that beginner learners perform at chance level regarding the distinction between t and z and between c and z. The reason is that in Danish, which has an aspiration contrast between plosives (like Chinese), /th/ is variably pronounced as affricated /ts/ and many speakers are unaware of this … optional variation. This inhibits the distinction between Chinese t and z (pronounced as /th ts/). Further, Danish has no affricates, which makes the distinction between different affricates based on the aspiration contrast (like cai-zai) particularly difficult.

  10. The Impact of Different Forms of Multimedia CAI on Students' Science Achievement.

    Science.gov (United States)

    Chang, Chun-Yen

    2002-01-01

    Describes a study that explored the effects of teacher-centered versus student-centered multimedia computer-assisted instruction on the science achievements of tenth-grade students in Taiwan. Results of an analysis of covariance on pretest-posttest scores showed the teacher-centered approach was more effective in promoting students' science…

  11. Dietary Changes over Time in a Caiçara Community from the Brazilian Atlantic Forest

    Directory of Open Access Journals (Sweden)

    Priscila L. MacCord

    2006-12-01

    Full Text Available Because they are occurring at an accelerated pace, changes in the livelihoods of local coastal communities, including nutritional aspects, have been a subject of interest in human ecology. The aim of this study is to explore the dietary changes, particularly in the consumption of animal protein, that have taken place in Puruba Beach, a rural community of caiçaras on the São Paulo Coast, Brazil, over the 10-yr period from 1992–1993 to 2002–2003. Data were collected during six months in 1992–1993 and during the same months in 2002–2003 using the 24-hr recall method. We found an increasing dependence on external products in the most recent period, along with a reduction in fish consumption and in the number of fish species eaten. These changes, possibly associated with other nonmeasured factors such as overfishing and unplanned tourism, may cause food delocalization and a reduction in the use of natural resources. Although the consequences for conservation efforts in the Atlantic Forest and the survival of the caiçaras must still be evaluated, these local inhabitants may be finding a way to reconcile both the old and the new dietary patterns by keeping their houses in the community while looking for sources of income other than natural resources. The prospect shown here may reveal facets that can influence the maintenance of this and other communities undergoing similar processes by, for example, shedding some light on the ecological and economical processes that may occur within their environment and in turn affect the conservation of the resources upon which the local inhabitants depend.

  12. Analysis of CERN computing infrastructure and monitoring data

    Science.gov (United States)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a large multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal to bring data sources from different services and on different abstraction levels together and to implement a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single service boundaries and the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats, selecting an efficient storage format for map reduce and external access, and will describe the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between CPU/wall fraction, latency/throughput constraints of network and disk and the effective job throughput. In this contribution we will first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.

  13. Computer-aided strength analysis of the modernized freight wagon

    Science.gov (United States)

    Płaczek, M.; Wróbel, A.; Baier, A.

    2015-11-01

    In the paper, results of a computer-aided strength analysis of a modernized freight wagon, based on the Finite Element Method, are presented. A CAD model of the considered freight wagon was created and its strength was analysed in accordance with the norms describing how such freight wagons are tested. The model of the analysed freight wagon was then modernized by adding composite panels covering the inner surface of the vehicle body. The strength analysis was carried out once again and the obtained results were compared. This work was carried out in order to verify the influence of the composite panels on the strength of the freight car body and to estimate the possibility of reducing the thickness of the steel shell of the box in order to reduce the weight of the freight wagon.

  14. Chest x-ray analysis by computer: final technical report

    International Nuclear Information System (INIS)

    The purpose of this study was to evaluate and demonstrate the feasibility of the automated analysis of chest x-rays for the classification of pneumoconiosis films according to the U.I.C.C./Cincinnati standard films. Toward this end, computer programs simulating the proposed systems were prepared. Using these programs, the authors then examined three sets of chest radiographs to determine the extent of pneumoconiosis present. The results of the examinations of these x-rays clearly indicated the feasibility of the proposed system. Based on the outcome of these examinations, a complete set of hardware and software specifications were established for a system which can be used for the large scale automatic analysis of chest x-rays

  15. Modern wing flutter analysis by computational fluid dynamics methods

    Science.gov (United States)

    Cunningham, Herbert J.; Batina, John T.; Bennett, Robert M.

    1988-01-01

    The application and assessment of the recently developed CAP-TSD transonic small-disturbance code for flutter prediction is described. The CAP-TSD code has been developed for aeroelastic analysis of complete aircraft configurations and was previously applied to the calculation of steady and unsteady pressures with favorable results. Generalized aerodynamic forces and flutter characteristics are calculated and compared with linear theory results and with experimental data for a 45 deg sweptback wing. These results are in good agreement with the experimental flutter data which is the first step toward validating CAP-TSD for general transonic aeroelastic applications. The paper presents these results and comparisons along with general remarks regarding modern wing flutter analysis by computational fluid dynamics methods.

  16. PERFORMANCE ANALYSIS OF SOFT COMPUTING TECHNIQUES FOR CLASSIFYING CARDIAC ARRHYTHMIA

    Directory of Open Access Journals (Sweden)

    R GANESH KUMAR

    2014-01-01

    Full Text Available Cardiovascular diseases kill more people than any other disease group. Arrhythmia is a common term for cardiac rhythms deviating from normal sinus rhythm. Many heart diseases are detected through electrocardiogram (ECG) analysis. Manual analysis of ECG is time consuming and error prone; an automated system for detecting arrhythmia in ECG signals therefore gains importance. Features are extracted from the time-series ECG data by computing the distance between R waves, taking the extracted RR interval of each beat as the feature, and applying the Discrete Cosine Transform (DCT). The frequency-domain features are classified using Classification and Regression Trees (CART), a Radial Basis Function network (RBF), a Support Vector Machine (SVM) and a Multilayer Perceptron Neural Network (MLP-NN). Experiments were conducted on the MIT-BIH arrhythmia database.
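
    A minimal sketch of the feature/classifier pipeline described above, keeping the first DCT coefficients of RR-interval windows and feeding them to an RBF-kernel SVM. The window length, number of coefficients, and random placeholder data (standing in for beats prepared from the MIT-BIH database) are all assumptions.

```python
# DCT features from RR-interval windows, classified with an RBF SVM.
import numpy as np
from scipy.fft import dct
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def dct_features(rr_windows, n_coeffs=8):
    """Keep the first few DCT coefficients of each RR-interval window."""
    return dct(rr_windows, axis=1, norm="ortho")[:, :n_coeffs]

rr_windows = np.random.rand(200, 32)          # placeholder for real RR data
labels = np.random.randint(0, 2, size=200)    # placeholder arrhythmia labels

X = dct_features(rr_windows)
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```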

  17. Electronic Forms-Based Computing for Evidentiary Analysis

    Directory of Open Access Journals (Sweden)

    Andy Luse

    2009-09-01

    Full Text Available The paperwork associated with evidentiary collection and analysis is a highly repetitive and time-consuming process which often involves duplication of work and can frequently result in documentary errors. Electronic entry of evidence-related information can facilitate greater accuracy and less time spent on data entry. This manuscript describes a general framework for the implementation of an electronic tablet-based system for evidentiary processing. This framework is then utilized in the design and implementation of an electronic tablet-based evidentiary input prototype system developed for use by forensic laboratories which serves as a verification of the proposed framework. The manuscript concludes with a discussion of implications and recommendations for the implementation and use of tablet-based computing for evidence analysis.

  18. A computer program (MACPUMP) for interactive aquifer-test analysis

    Science.gov (United States)

    Day-Lewis, F. D.; Person, M.A.; Konikow, L.F.

    1995-01-01

    This report introduces MACPUMP (Version 1.0), an aquifer-test-analysis package for use with Macintosh computers. The report outlines the input-data format, describes the solutions encoded in the program, explains the menu items, and offers a tutorial illustrating the use of the program. The package reads list-directed aquifer-test data from a file, plots the data to the screen, generates and plots type curves for several different test conditions, and allows mouse-controlled curve matching. MACPUMP features pull-down menus, a simple text viewer for displaying data files, and optional on-line help windows. This version includes the analytical solutions for nonleaky and leaky confined aquifers, using both type curves and straight-line methods, and for the analysis of single-well slug tests using type curves. An executable version of the code and sample input data sets are included on an accompanying floppy disk.
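
    One of the encoded solutions, the Theis solution for a nonleaky confined aquifer, is compact enough to sketch. The code below is not MACPUMP itself; it evaluates the drawdown s(r, t) = Q/(4 pi T) W(u) with u = r^2 S / (4 T t), where W is the exponential integral well function.

```python
# Theis type-curve solution for a nonleaky confined aquifer.
import numpy as np
from scipy.special import exp1  # well function W(u) = exp1(u)

def theis_drawdown(r_m, t_s, Q_m3s, T_m2s, S):
    """Drawdown s(r, t) = Q/(4*pi*T) * W(u), with u = r^2 S / (4 T t)."""
    u = (r_m ** 2) * S / (4.0 * T_m2s * t_s)
    return Q_m3s / (4.0 * np.pi * T_m2s) * exp1(u)

# Example: drawdown 30 m from a well pumping 0.01 m^3/s after one day.
print(theis_drawdown(r_m=30.0, t_s=86400.0, Q_m3s=0.01, T_m2s=1e-3, S=1e-4))
```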

  19. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  20. Analysis on Phase Transformation (ATP) Using Computational Thermal Principles (CTP)

    Institute of Scientific and Technical Information of China (English)

    N.Alagurmurthi; K.Palaniradja; V. Soundararajan

    2004-01-01

    Computer analysis based on computational thermal principles to predict transformation kinetics in steels at varying temperatures is of great practical importance in different areas of heat treatment. Using the theory of transient-state heat conduction with convective boundary conditions, an efficient program named "ATP" (Analysis on Phase Transformation) has been developed to determine the temperature distribution under different quenching conditions for different geometries such as plates, cylinders and spheres. In addition, the microstructures and the corresponding hardness developed during quenching are predicted using the Time Temperature Transformation (TTT) diagram incorporated in the analysis. To validate the work, dilation curves, Heisler charts and time-temperature history curves have been generated. This paper deals with the basic objective of the program (ATP): the determination of temperature, microstructure and hardness distributions; it also includes an online prediction of austenite-pearlite and austenite-martensite transformations in steels along with the corresponding retained fractions. The quenching of a cylinder in gases, liquids and liquid metals is analyzed to show the non-linear effect of cylinder diameter on the temperature and microstructures. Further, a typical 1080 steel cylinder quenched in water is considered for predicting and comparing the program results with experimental values; the approach can be extended to other grades of steel. The numerical results of the program are found to be in good agreement with the experimental data obtained. Finally, the quenching process analysis described in the study appears to be a promising tool for the design of heat-treatment process parameters for steels.
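
    A minimal sketch of the transient-state heat-conduction calculation with a convective boundary that underlies the quench-temperature predictions described above: an explicit finite-difference scheme for an infinite cylinder cooled in a quenchant. All material and process parameters are illustrative, not values from the paper.

```python
# Explicit finite-difference quench of an infinite cylinder with a
# convective surface; illustrative parameters only.
import numpy as np

alpha, k = 1.2e-5, 40.0        # diffusivity (m^2/s), conductivity (W/m K)
h, T_inf = 3000.0, 60.0        # convection coeff (W/m^2 K), quenchant T (C)
R, n = 0.025, 50               # cylinder radius (m), radial nodes
dr = R / (n - 1)
dt = 0.2 * dr**2 / alpha       # stable explicit time step
r = np.linspace(0.0, R, n)
T = np.full(n, 850.0)          # austenitizing temperature (C)

for _ in range(int(5.0 / dt)):  # simulate 5 s of quenching
    Tn = T.copy()
    # interior nodes: dT/dt = alpha * (T'' + T'/r)
    Tn[1:-1] = T[1:-1] + alpha * dt * (
        (T[2:] - 2 * T[1:-1] + T[:-2]) / dr**2
        + (T[2:] - T[:-2]) / (2 * dr * r[1:-1]))
    Tn[0] = T[0] + 4 * alpha * dt * (T[1] - T[0]) / dr**2  # axis symmetry
    # surface node: quasi-steady balance k*(T[-2]-T[-1])/dr = h*(T[-1]-T_inf)
    Tn[-1] = (k / dr * T[-2] + h * T_inf) / (k / dr + h)
    T = Tn
print(f"surface {T[-1]:.0f} C, centre {T[0]:.0f} C after 5 s")
```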

  1. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are widely used and are, in general, time- and labour-intensive, prone to human error, and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software tool, AngioTool, which allows quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points per unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
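
    A minimal sketch of how a "branching index" of the kind described above can be computed from a binary vessel mask, assuming scikit-image and SciPy are available. This is not AngioTool's implementation; the branch-point criterion (a skeleton pixel with three or more skeleton neighbours) is one common heuristic.

```python
# Branch points per unit area from a 2-D binary vessel mask.
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def branching_index(vessel_mask):
    """Estimate branch points per unit area for a 2-D binary vessel mask."""
    skeleton = skeletonize(vessel_mask.astype(bool))
    # Count the 8-connected neighbours of every skeleton pixel.
    kernel = np.array([[1, 1, 1],
                       [1, 0, 1],
                       [1, 1, 1]])
    neighbours = convolve(skeleton.astype(int), kernel, mode="constant")
    # A skeleton pixel with 3 or more neighbours is treated as a branch point.
    branch_points = np.logical_and(skeleton, neighbours >= 3).sum()
    return branch_points / vessel_mask.size
```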

  2. The CDF Computing and Analysis System:First Experience

    Institute of Scientific and Technical Information of China (English)

    Rick Colombo; Fedor Ratnikov; et al.

    2001-01-01

    The Collider Detector at Fermilab (CDF) collaboration records and analyses proton anti-proton interactions with a center-of-mass energy of 2 TeV at the Tevatron. A new collider run, Run II, of the Tevatron started in April. During its more than two-year duration, the CDF experiment expects to record about 1 PetaByte of data. With its multi-purpose detector and center-of-mass energy at the frontier, the experimental program is large and versatile. The over 500 scientists of CDF will engage in searches for new particles, like the Higgs boson or supersymmetric particles, precision measurements of electroweak parameters, like the mass of the W boson, measurements of top quark parameters, and a large spectrum of B physics. The experiment has taken data and analysed them in previous runs. For Run II, however, the computing model was changed to incorporate new methodologies: the file format was switched, and both the data handling and analysis systems were redesigned to cope with the increased demands. This paper (4-036 at CHEP 2001) gives an overview of the CDF Run II computing system with emphasis on areas where the current system does not match initial estimates and projections. For the data handling and analysis system a more detailed description is given.

  3. Computational Particle Physics for Event Generators and Data Analysis

    CERN Document Server

    Perret-Gallix, Denis

    2013-01-01

    High-energy physics data analysis relies heavily on the comparison between experimental and simulated data, as stressed lately by the Higgs search at the LHC and the recent identification of a Higgs-like new boson. The first link in the full simulation chain is the event generation, both for background and for expected signals. Nowadays event generators are based on the automatic computation of the matrix element or amplitude for each process of interest. Moreover, recent analysis techniques based on the matrix element likelihood method assign probabilities for every event to belong to any of a given set of possible processes. This method, originally used for the top quark mass measurement, although computationally intensive, has shown its power at the LHC to extract the new boson signal from the background. Serving both needs, the automatic calculation of matrix elements is therefore more than ever of prime importance for particle physics. Initiated in the eighties, the techniques have matured for the lowest order calculations (tree-le...

  4. Computer-Assisted Intervention for Children with Low Numeracy Skills

    Science.gov (United States)

    Rasanen, Pekka; Salminen, Jonna; Wilson, Anna J.; Aunio, Pirjo; Dehaene, Stanislas

    2009-01-01

    We present results of a computer-assisted intervention (CAI) study on number skills in kindergarten children. Children with low numeracy skill (n = 30) were randomly allocated to two treatment groups. The first group played a computer game (The Number Race) which emphasized numerical comparison and was designed to train number sense, while the…

  5. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow; Michael Schatz; William Kalies; Thomas Wanner

    2010-05-24

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure

  6. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow, Rutgers University/Georgia Institute of Technology; Michael Schatz, Georgia Institute of Technology; William Kalies, Florida Atlantic University; Thomas Wanner, George Mason University

    2010-05-19

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep all data transfer routes in the CMS PhEDEx topology exercised. Since mid-February, a transfer volume of about 12 P...

  8. The future of computer-aided sperm analysis

    Science.gov (United States)

    Mortimer, Sharon T; van der Horst, Gerhard; Mortimer, David

    2015-01-01

    Computer-aided sperm analysis (CASA) technology was developed in the late 1980s for analyzing sperm movement characteristics or kinematics and has been highly successful in enabling this field of research. CASA has also been used with great success for measuring semen characteristics such as sperm concentration and proportions of progressive motility in many animal species, including wide application in domesticated animal production laboratories and reproductive toxicology. However, attempts to use CASA for human clinical semen analysis have largely met with poor success due to the inherent difficulties presented by many human semen samples caused by sperm clumping and heavy background debris that, until now, have precluded accurate digital image analysis. The authors review the improved capabilities of two modern CASA platforms (Hamilton Thorne CASA-II and Microptic SCA6) and consider their current and future applications with particular reference to directing our focus towards using this technology to assess functional rather than simple descriptive characteristics of spermatozoa. Specific requirements for validating CASA technology as a semi-automated system for human semen analysis are also provided, with particular reference to the accuracy and uncertainty of measurement expected of a robust medical laboratory test for implementation in clinical laboratories operating according to modern accreditation standards. PMID:25926614

  9. The future of computer-aided sperm analysis

    Directory of Open Access Journals (Sweden)

    Sharon T Mortimer

    2015-01-01

    Full Text Available Computer-aided sperm analysis (CASA technology was developed in the late 1980s for analyzing sperm movement characteristics or kinematics and has been highly successful in enabling this field of research. CASA has also been used with great success for measuring semen characteristics such as sperm concentration and proportions of progressive motility in many animal species, including wide application in domesticated animal production laboratories and reproductive toxicology. However, attempts to use CASA for human clinical semen analysis have largely met with poor success due to the inherent difficulties presented by many human semen samples caused by sperm clumping and heavy background debris that, until now, have precluded accurate digital image analysis. The authors review the improved capabilities of two modern CASA platforms (Hamilton Thorne CASA-II and Microptic SCA6 and consider their current and future applications with particular reference to directing our focus towards using this technology to assess functional rather than simple descriptive characteristics of spermatozoa. Specific requirements for validating CASA technology as a semi-automated system for human semen analysis are also provided, with particular reference to the accuracy and uncertainty of measurement expected of a robust medical laboratory test for implementation in clinical laboratories operating according to modern accreditation standards.
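
    Two of the classic kinematic measures CASA systems report, curvilinear velocity (VCL) and straight-line velocity (VSL), are easy to sketch. The function below assumes a hypothetical input format, one track as an array of (x, y) centroid positions in micrometres sampled at a fixed frame rate; it is not the code of either commercial platform.

```python
# VCL and VSL for a single tracked sperm head.
import numpy as np

def vcl_vsl(track_um, frame_rate_hz):
    """Return (VCL, VSL) in micrometres per second for one track of
    (x, y) positions sampled at frame_rate_hz."""
    duration_s = (len(track_um) - 1) / frame_rate_hz
    steps = np.diff(track_um, axis=0)
    path_length = np.linalg.norm(steps, axis=1).sum()   # total path traced
    straight_line = np.linalg.norm(track_um[-1] - track_um[0])
    return path_length / duration_s, straight_line / duration_s

# Example with a synthetic zig-zag track sampled at 60 Hz.
track = np.array([[0.0, 0.0], [2.0, 1.0], [4.0, -1.0], [6.0, 0.0]])
print(vcl_vsl(track, frame_rate_hz=60.0))
```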

  10. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    Science.gov (United States)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity where more and more complex flow problems can be tackled with this approach. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by a contra-rotating open rotor. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing how to apply the immersed boundary method to this moving-boundary problem, we provide a detailed validation of the aeroacoustic analysis approach employing the Launch Ascent and Vehicle Aerodynamics (LAVA) solver. Two free-stream Mach numbers, M=0.2 and M=0.78, are considered in this analysis, based on the nominal take-off and cruise flow conditions. The simulation data are compared to available experimental data and other computational results employing more conventional CFD methods. Spectral analysis is used to determine the dominant wave propagation pattern in the acoustic near-field.

  11. Computational analysis of bacterial RNA-Seq data.

    Science.gov (United States)

    McClure, Ryan; Balasubramanian, Divya; Sun, Yan; Bobrovskyy, Maksym; Sumby, Paul; Genco, Caroline A; Vanderpool, Carin K; Tjaden, Brian

    2013-08-01

    Recent advances in high-throughput RNA sequencing (RNA-seq) have enabled tremendous leaps forward in our understanding of bacterial transcriptomes. However, computational methods for analysis of bacterial transcriptome data have not kept pace with the large and growing data sets generated by RNA-seq technology. Here, we present new algorithms, specific to bacterial gene structures and transcriptomes, for analysis of RNA-seq data. The algorithms are implemented in an open source software system called Rockhopper that supports various stages of bacterial RNA-seq data analysis, including aligning sequencing reads to a genome, constructing transcriptome maps, quantifying transcript abundance, testing for differential gene expression, determining operon structures and visualizing results. We demonstrate the performance of Rockhopper using 2.1 billion sequenced reads from 75 RNA-seq experiments conducted with Escherichia coli, Neisseria gonorrhoeae, Salmonella enterica, Streptococcus pyogenes and Xenorhabdus nematophila. We find that the transcriptome maps generated by our algorithms are highly accurate when compared with focused experimental data from E. coli and N. gonorrhoeae, and we validate our system's ability to identify novel small RNAs, operons and transcription start sites. Our results suggest that Rockhopper can be used for efficient and accurate analysis of bacterial RNA-seq data, and that it can aid with elucidation of bacterial transcriptomes.

  12. Process model of creative idea generation for product conceptual design driven by CAI

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jianhui; TAN Runhua; ZHANG Peng; CAO Guozhong

    2013-01-01

    Generation of creative ideas is critical in the product conceptual design process. A major obstacle to idea generation is that designers do not make full use of knowledge from different fields. Computer-aided Innovation Systems (CAIs) based on the Theory of Inventive Problem Solving (TRIZ) provide a platform for applying knowledge from different fields. The principles of creative idea generation driven by Computer-aided Innovation (CAI) are put forward: the inventive problem is solved in the design scenario of a CAIs platform, and an extended solution space is set up from the Unexpected Discoveries (UXDs) implied in the source design in order to drive the generation of creative ideas. An integrated process model of creative idea generation for product conceptual design driven by CAI is then developed. Idea generation for a safety isolation butterfly valve was carried out using the process model, demonstrating its feasibility.

  13. Feasibility Analysis of Critical Factors Affecting Cloud Computing in Nigeria

    Directory of Open Access Journals (Sweden)

    Eustace Manayi Dogo

    2013-10-01

    Full Text Available Cloud computing is an evolving and new way of delivering computing services and resources over the internet, managed by third parties at remote sites. Cloud computing is based on existing technologies such as web services, Service Oriented Architecture (SOA), Web 3.0, grid computing and virtualization. Computing services include data storage, processing and software. Cloud computing is attracting considerable attention in Nigeria due to its perceived economic and operational benefits, and stakeholders believe that it will transform the IT industry in Nigeria. Despite all its promise, many challenges remain before cloud computing sees the light of day in Nigeria. This paper delivers an overview of cloud computing together with its advantages and disadvantages. Thereafter, the challenges and drivers affecting the adoption of cloud computing in Nigeria are outlined. Finally, recommendations for the adoption of cloud computing are discussed, with Nigeria as a case study.

  14. Application of Computer Integration Technology for Fire Safety Analysis

    Institute of Scientific and Technical Information of China (English)

    SHI Jianyong; LI Yinqing; CHEN Huchuan

    2008-01-01

    With the development of information technology, fire safety assessment of a whole structure or region based on computer simulation has become a hot topic. Traditionally, however, the relevant studies are performed separately for different objectives, making an overall evaluation difficult. A new multi-dimensional integration model and methodology for fire safety assessment are presented, and two newly developed integrated systems are introduced to demonstrate the function of integration simulation technology in this paper. The first is the analysis of the fire-resistant behavior of a whole structure under real fire loads. The second is the study of fire evaluation and emergency rescue on a campus based on geographic information system (GIS) technology. Some practical examples are presented to illustrate the advantages of computer integration technology for fire safety assessment and to emphasize some problems in the simulation. The results show that the multi-dimensional integration model offers a new way and platform for integrated fire safety assessment of a whole structure or region, and that the integrated software developed is a useful engineering tool for cost-saving and safe design.

  15. Molecular organization in liquid crystals: A comparative computational analysis

    International Nuclear Information System (INIS)

    A comparative computational analysis of molecular organization in four nematogenic acids (nOCAC), having two, four, six, and eight carbon atoms in the alkyl chain, is carried out with respect to translatory and orientational motions. The evaluation of the atomic charge and dipole moment at each atomic center is performed through the complete neglect of differential overlap (CNDO/2) method. The Rayleigh-Schroedinger perturbation theory, along with the multicentered-multipole expansion method, is employed to evaluate the long-range interactions, while a '6-exp' potential function is assumed for the short-range interactions. The total interaction-energy values obtained through these computations are used to calculate the probability of each configuration at the phase transition temperature via the Maxwell-Boltzmann formula. Further, the flexibility of various configurations is studied in terms of the variation of probability due to small departures from the most probable configuration. A comparative picture of molecular parameters, such as the total energy, binding energy, and total dipole moment, is given. An attempt is made to explain the nematogenic behavior of these liquid crystals in terms of their relative order and, thereby, to develop a molecular model for liquid crystallinity.
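
    As a minimal sketch of the final step described above, converting total interaction energies of candidate configurations into Maxwell-Boltzmann probabilities at the transition temperature (all energies and the temperature below are illustrative, not values from the study):

        import numpy as np

        # Hypothetical total interaction energies (kcal/mol) of candidate
        # molecular configurations, e.g. from CNDO/2-based pair calculations.
        energies = np.array([-9.8, -9.5, -9.1, -8.6])
        T = 400.0                     # assumed phase transition temperature (K)
        R = 1.987e-3                  # gas constant in kcal/(mol K)

        # Maxwell-Boltzmann weight of each configuration at temperature T.
        weights = np.exp(-(energies - energies.min()) / (R * T))
        probabilities = weights / weights.sum()

        for e, p in zip(energies, probabilities):
            print(f"E = {e:6.2f} kcal/mol  ->  p = {p:.3f}")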

  16. Underground tank vitrification: Field-scale experiments and computational analysis

    International Nuclear Information System (INIS)

    In situ vitrification (ISV) is a thermal waste remediation process developed by researchers at Pacific Northwest Laboratory for stabilization and treatment of soils contaminated with hazardous, radioactive, or mixed wastes. Many underground tanks containing radioactive and hazardous chemical wastes at U.S. Department of Energy sites will soon require remediation. Recent development activities have been pursued to determine if the ISV process is applicable to underground storage tanks. As envisioned, ISV will convert the tank, tank contents, and associated contaminated soil to a glass and crystalline block. Development activities include testing and demonstration on three scales, and computational modeling and evaluation. In this paper, the authors describe engineering solutions implemented on the field scale to mitigate unique problems posed by ISV of a confined underground structure, along with the associated computational analysis. The ISV process, as applied to underground storage tanks, is depicted. The process is similar to ISV of contaminated soils, except that the tank also melts and forms a metal ingot at the bottom of the melt.

  17. Shell stability analysis in a computer aided engineering (CAE) environment

    Science.gov (United States)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde, is described. The purpose of this project is to make the theoretical, numerical and practical knowledge accumulated over the last 25 years or so readily accessible to users interested in the analysis of buckling-sensitive structures. With this open-ended, hierarchical, interactive computer code the user can access from his workstation programs of successively increasing complexity. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects that various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects that different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from the workstation with one of the current generation of two-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.

  18. Multiresolution analysis over simple graphs for brain computer interfaces

    Science.gov (United States)

    Asensio-Cubero, J.; Gan, J. Q.; Palaniappan, R.

    2013-08-01

    Objective. Multiresolution analysis (MRA) offers a useful framework for signal analysis in the temporal and spectral domains, although commonly employed MRA methods may not be the best approach for brain computer interface (BCI) applications. This study aims to develop a new MRA system for extracting tempo-spatial-spectral features for BCI applications based on wavelet lifting over graphs. Approach. This paper proposes a new graph-based transform for wavelet lifting and a tailored simple graph representation for electroencephalography (EEG) data, which results in an MRA system where temporal, spectral and spatial characteristics are used to extract motor imagery features from EEG data. The transformed data is processed within a simple experimental framework to test the classification performance of the new method. Main Results. The proposed method can significantly improve the classification results obtained by various wavelet families using the same methodology. Preliminary results using common spatial patterns as feature extraction method show that we can achieve comparable classification accuracy to more sophisticated methodologies. From the analysis of the results we can obtain insights into the pattern development in the EEG data, which provide useful information for feature basis selection and thus for improving classification performance. Significance. Applying wavelet lifting over graphs is a new approach for handling BCI data. The inherent flexibility of the lifting scheme could lead to new approaches based on the hereby proposed method for further classification performance improvement.
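
    The split/predict/update pattern that the paper generalizes to graphs can be illustrated with the classical chain-graph case; the sketch below is one level of a simple Haar-like lifting step on a 1-D signal, not the paper's graph construction:

        import numpy as np

        def lifting_step(x):
            """One level of a simple (Haar-like) lifting transform on a 1-D signal.

            The paper generalizes this split/predict/update pattern to graphs
            built over EEG channels; here the 'graph' is just a chain of samples.
            """
            even, odd = x[0::2], x[1::2]
            detail = odd - even            # predict: even samples predict odd ones
            approx = even + 0.5 * detail   # update: preserve the signal mean
            return approx, detail

        rng = np.random.default_rng(0)
        signal = rng.standard_normal(16)   # stand-in for one EEG channel epoch
        approx, detail = lifting_step(signal)
        print(approx.shape, detail.shape)  # (8,) (8,) coarse and detail coefficients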

  19. Computer codes for the analysis of flask impact problems

    International Nuclear Information System (INIS)

    This review identifies typical features of the design of transportation flasks and considers some of the analytical tools required for the analysis of impact events. Because of the complexity of the physical problem, it is unlikely that a single code will adequately deal with all aspects of the impact incident. Candidate codes are identified on the basis of current understanding of their strengths and limitations. It is concluded that the HONDO-II, DYNA3D and ABAQUS codes, which are already mounted on UKAEA computers, will be suitable tools for use in the analysis of experiments conducted in the proposed AEEW programme and of general flask impact problems. Initial attention should be directed at the DYNA3D and ABAQUS codes, with HONDO-II being reserved for situations where the three-dimensional elements of DYNA3D may provide uneconomic simulations in planar or axisymmetric geometries. Attention is drawn to the importance of access to suitable mesh generators to create the nodal coordinate and element topology data required by these structural analysis codes. (author)

  20. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    Science.gov (United States)

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high-performance and moderate environmental persistence, hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect of acetylcholinesterase (AChE) induced by inhibitors, including OPs and carbamates, a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that there was a gradually weakened trend of yellow intensity with the increase of the concentration of dichlorvos. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had a good predictive ability between training sets and predictive sets. Real cabbage samples containing dichlorvos were detected by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). The experiments of accuracy, precision and repeatability revealed good performance for detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications in real samples for OPs and carbamates because of high selectivity and sensitivity. PMID:27396650
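
    A minimal sketch of the color-space step described above: converting a spot's RGB reading to CMYK and tracking the yellow channel. The spot readings are hypothetical; the real study feeds such densities into an ANN rather than printing them:

        import numpy as np

        def rgb_to_cmyk(rgb):
            """Convert normalized RGB (0-1) to CMYK, the color space used above."""
            r, g, b = rgb
            k = 1.0 - max(r, g, b)
            if k >= 1.0:                      # pure black
                return 0.0, 0.0, 0.0, 1.0
            c = (1.0 - r - k) / (1.0 - k)
            m = (1.0 - g - k) / (1.0 - k)
            y = (1.0 - b - k) / (1.0 - k)
            return c, m, y, k

        # Hypothetical mean RGB readings of the assay spot at increasing
        # dichlorvos concentrations; yellow density should fall as AChE
        # inhibition increases.
        spots = [(0.91, 0.84, 0.21), (0.90, 0.85, 0.38), (0.89, 0.86, 0.55)]
        for rgb in spots:
            c, m, y, k = rgb_to_cmyk(rgb)
            print(f"Y = {y:.2f}")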

  1. On the Analysis of Sudoku Puzzles by Computers

    OpenAIRE

    北本, 卓也

    2013-01-01

    This paper presents a method to analyze Sudoku puzzles by computers. Generally speaking, it is not so difficult to solve Sudoku puzzles by computers, and many programs to solve Sudoku puzzles are available. However, most of the programs use recursion and backtracking, which is significantly different from the methods used by a human. Hence, a human-like method to solve a Sudoku puzzle is unknown even if the solution of the puzzle is computed by computer programs. We created a computer program which...

  2. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on the Adaptive Neuro-Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross-validation results demonstrate the high efficiency of both approaches in comparison to known methods.
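
    ANFIS itself is beyond a few lines, but the shape of the prediction task can be sketched with a sliding-window linear autoregression as a stand-in; all SRT samples below are invented for illustration:

        import numpy as np

        # Hypothetical service response times (seconds), sampled periodically
        # by the monitoring system.
        srt = np.array([1.2, 1.3, 1.1, 1.4, 1.6, 1.5, 1.8, 2.1, 1.9, 2.3])

        # Sliding-window autoregression: predict srt[t] from the previous
        # `lag` samples (a linear stand-in for the ANFIS predictor).
        lag = 3
        X = np.array([srt[i:i + lag] for i in range(len(srt) - lag)])
        y = srt[lag:]
        coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

        next_srt = np.r_[srt[-lag:], 1.0] @ coef   # one-step-ahead forecast
        print(f"predicted next SRT: {next_srt:.2f} s")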

  3. Data analysis using the Gnu R system for statistical computation

    Energy Technology Data Exchange (ETDEWEB)

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
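
    The report's examples are in R; for consistency with the other sketches in this listing, here is a Python analogue of the chi-square minimization fit it describes, on toy data of the assumed form C(t) = A exp(-m t):

        import numpy as np
        from scipy.optimize import curve_fit

        # Toy 2-point correlator C(t) = A * exp(-m t) with noise, the kind of
        # lattice data the report's fitting packages are designed for.
        t = np.arange(1, 12)
        rng = np.random.default_rng(1)
        data = 1.5 * np.exp(-0.4 * t) * (1 + 0.02 * rng.standard_normal(t.size))
        sigma = 0.02 * data

        model = lambda t, A, m: A * np.exp(-m * t)
        popt, pcov = curve_fit(model, t, data, p0=(1.0, 0.5), sigma=sigma)

        chi2 = np.sum(((data - model(t, *popt)) / sigma) ** 2)
        print(f"A = {popt[0]:.3f}, m = {popt[1]:.3f}, "
              f"chi2/dof = {chi2 / (t.size - 2):.2f}")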

  4. Meta-Analysis and Computer-Mediated Communication.

    Science.gov (United States)

    Taylor, Alan M

    2016-04-01

    Because of the use of human participants and differing contextual variables, research in second language acquisition often produces conflicting results, leaving practitioners confused and unsure of the effectiveness of specific treatments. This article provides insight into a recent seminal meta-analysis on the effectiveness of computer-mediated communication, providing further statistical evidence of the importance of its results. The significance of the study is examined by looking at the p values included in the references, to demonstrate how results can easily be misconstrued by practitioners and researchers. Lin's conclusion regarding the research setting of the study reports is also evaluated. In doing so, other possible explanations of what may be influencing the results can be proposed. PMID:27154373

  5. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets, as well as modification and analysis. Graphical work stations provide the opportunity to work — not only with textual representations of Petri nets — but also directly with the graphical representations. This paper describes some of the different kinds of tools which are needed in the Petri net area. It describes some of the requirements which these tools must fulfil, in order to support the user in a natural and effective way. Finally some references are given to papers which describe examples of existing Petri net tools.
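
    To make concrete the kind of object such tools construct and analyze, a minimal place/transition net with the standard firing rule (a toy example, not from the paper):

        # A minimal place/transition net with the standard firing rule, the
        # kind of object construction and analysis tools must manipulate.
        net = {
            "t1": {"in": {"p1": 1}, "out": {"p2": 1}},
            "t2": {"in": {"p2": 2}, "out": {"p3": 1}},
        }
        marking = {"p1": 3, "p2": 1, "p3": 0}

        def enabled(t, m):
            return all(m[p] >= w for p, w in net[t]["in"].items())

        def fire(t, m):
            assert enabled(t, m), f"{t} is not enabled"
            m = dict(m)
            for p, w in net[t]["in"].items():
                m[p] -= w
            for p, w in net[t]["out"].items():
                m[p] += w
            return m

        marking = fire("t1", marking)      # p1: 2, p2: 2, p3: 0
        marking = fire("t2", marking)      # p1: 2, p2: 0, p3: 1
        print(marking)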

  6. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations, such as simplified query languages, out-of-date information or arbitrary result sorting. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web pages content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of a Web pages content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.

  7. Satellite interference analysis and simulation using personal computers

    Science.gov (United States)

    Kantak, Anil

    1988-03-01

    This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering satellite, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles generally related to the land mask of the receiving station site for both satellites. Formulas for considering Doppler effect due to the satellite motion as well as the Earth's rotation are developed. The effect of the interfering-satellite signal modulation and the Doppler effect on the power received are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference to the desired signal power ratio. Finally, a computer program suitable for microcomputers such as IBM AT is provided with the flowchart, a sample run, results of the run, and the program code.
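
    One ingredient of the analysis described above is the Doppler shift from the line-of-sight range rate; a minimal sketch with an assumed carrier frequency and illustrative range rates (not values from the report):

        C = 2.998e8   # speed of light, m/s

        def doppler_shift(f_carrier_hz, range_rate_ms):
            """First-order Doppler shift for a given carrier and range rate.

            range_rate_ms > 0 means the satellite is receding from the
            station; the values below are illustrative.
            """
            return -f_carrier_hz * range_rate_ms / C

        f = 4.0e9                          # assumed 4 GHz C-band downlink
        for v in (-3000.0, 0.0, 3000.0):   # m/s along the line of sight
            print(f"range rate {v:+7.1f} m/s -> "
                  f"shift {doppler_shift(f, v):+9.1f} Hz")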

  9. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of society, as we are faced with streams of data coming from numerous sensors, data feeds, and recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). The quest for thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge of developing models that are both accurate and user-friendly (interpretable). The volume aims to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  10. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Science.gov (United States)

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reductions in the size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, the basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes the capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for understanding the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect the accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system, whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible period of time; (3) software should enable grouping of individual sperm based on one or more attributes so outputs reflect subpopulations or clusters of similar sperm with unique

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  12. Summary of research in applied mathematics, numerical analysis, and computer sciences

    Science.gov (United States)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  13. A Comparative Analysis of some High Performance Computing Technologies

    Directory of Open Access Journals (Sweden)

    Minakshi Tripathy

    2014-10-01

    Full Text Available Computing is an evolutionary process. As part of this evolution, the computing requirements driven by applications have always outpaced the available technology. System designers have always sought faster and more efficient computing systems. During the past decade, many different computer systems supporting high performance computing have emerged. Their taxonomy is based on how their processors, memory and interconnect are laid out. Today's applications require high computational power as well as high communication performance. High performance computing provides an approach to parallel processing that yields supercomputer-level performance, solving incredibly large and complex problems. This trend makes it very promising to build high performance computing environments in a cost-effective way.

  14. The role of computed tomography in terminal ballistic analysis.

    Science.gov (United States)

    Rutty, G N; Boyce, P; Robinson, C E; Jeffery, A J; Morgan, B

    2008-01-01

    Terminal ballistics concerns the science of projectile behaviour within a target and includes wound ballistics, which considers what happens when a projectile strikes a living being. A number of soft tissue ballistic simulants have been used to assess the damage to tissue caused by projectiles. Standard assessment of these materials, such as ballistic soap or ordnance gelatine, requires the block to be opened or a mould to be made to visualize the wound track. This is time consuming and may affect the accuracy of the findings, especially if the block dries and alters shape during the process. Therefore, accurate numerical analysis of the permanent or temporary cavity is limited. Computed tomography (CT) potentially offers a quicker, non-invasive analysis tool for this task. Four commercially purchased ballistic glycerine soap blocks were used. Each had a single firearm discharged into it from a distance of approximately 15 cm, using both gunshot and shotgun projectiles. After discharge, each block was imaged by a modern 16-slice multi-detector CT scanner and analysed using 3-D reconstruction software. Using the anterior-posterior and lateral scout views and the multi-plane reconstructed images, it was possible to visualize the temporary cavity, as well as the fragmentation and dispersal pattern of the projectiles, the distance travelled and the angle of dispersal within the block of each projectile or fragment. A virtual cast of the temporary cavity can also be made. Multi-detector CT with 3-D analysis software is shown to create a reliable permanent record of the projectile path, allowing rapid analysis of different firearms and projectiles. PMID:17205351

  15. On Cai Yuanpei's Biographical Writing

    Institute of Scientific and Technical Information of China (English)

    赖勤芳

    2012-01-01

    Cai Yuanpei was not famous for biography, yet he produced a great many biographical writings in his life. During his formative education and period of free reading, he accumulated the attainments of biographical writing and gradually cultivated a sense of cultural identification grounded in classical Chinese biography. When he embarked on political revolution and educational innovation, he became more active in biographical writing, especially choosing revolutionaries, relatives and friends as his subjects. In his later years, at Hu Shi's suggestion, he wrote an autobiography in the style of annals. All in all, although Cai was not a committed biographer or researcher of biography, and lacked a more profound biographical theory and a fully self-conscious sense of the modern biographical style, he clearly helped promote the modern transformation of Chinese ideas about biography.

  16. Variance Analysis and Comparison in Computer-Aided Design

    Science.gov (United States)

    Ullrich, T.; Schiffer, T.; Schinko, C.; Fellner, D. W.

    2011-09-01

    The need to analyze and visualize differences of very similar objects arises in many research areas: mesh compression, scan alignment, nominal/actual value comparison, quality management, and surface reconstruction to name a few. In computer graphics, for example, differences of surfaces are used for analyzing mesh processing algorithms such as mesh compression. They are also used to validate reconstruction and fitting results of laser scanned surfaces. As laser scanning has become very important for the acquisition and preservation of artifacts, scanned representations are used for documentation as well as analysis of ancient objects. Detailed mesh comparisons can reveal smallest changes and damages. These analysis and documentation tasks are needed not only in the context of cultural heritage but also in engineering and manufacturing. Differences of surfaces are analyzed to check the quality of productions. Our contribution to this problem is a workflow, which compares a reference / nominal surface with an actual, laser-scanned data set. The reference surface is a procedural model whose accuracy and systematics describe the semantic properties of an object; whereas the laser-scanned object is a real-world data set without any additional semantic information.

  17. CAVASS: a computer-assisted visualization and analysis software system.

    Science.gov (United States)

    Grevera, George; Udupa, Jayaram; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Iwanaga, Tad; Mishra, Shipra

    2007-11-01

    The Medical Image Processing Group at the University of Pennsylvania has been developing (and distributing with source code) medical image analysis and visualization software systems for a long period of time. Our most recent system, 3DVIEWNIX, was first released in 1993. Since that time, a number of significant advancements have taken place with regard to computer platforms and operating systems, networking capability, the rise of parallel processing standards, and the development of open-source toolkits. CAVASS, developed by our group, is the next generation of 3DVIEWNIX. CAVASS is freely available and open source, and it is integrated with toolkits such as the Insight Toolkit and the Visualization Toolkit. CAVASS runs on Windows, Unix, Linux, and Mac but shares a single code base. Rather than requiring expensive multiprocessor systems, it seamlessly provides for parallel processing via inexpensive clusters of workstations for the more time-consuming algorithms. Most importantly, CAVASS is directed at the visualization, processing, and analysis of 3-dimensional and higher-dimensional medical imagery, so support for Digital Imaging and Communications in Medicine (DICOM) data and the efficient implementation of algorithms are given paramount importance. PMID:17786517

  18. Quantitative Computed Tomography and image analysis for advanced muscle assessment

    Directory of Open Access Journals (Sweden)

    Kyle Joseph Edmunds

    2016-06-01

    Full Text Available Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types with specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volume, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration.
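
    A minimal sketch of the HU-window composition analysis described above; the exact cut-offs and the synthetic voxel data are assumptions for illustration, not the review's calibrated ranges:

        import numpy as np

        # Hounsfield Unit windows in the spirit of the ranges described
        # above; the cut-offs are assumptions for illustration.
        TISSUE_WINDOWS = {
            "fat":                       (-200, -10),
            "loose connective/atrophic": (-9, 40),
            "normal muscle":             (41, 200),
        }

        def composition(hu_voxels):
            """Fraction of voxels falling in each HU window."""
            hu = np.asarray(hu_voxels)
            return {name: float(np.mean((hu >= lo) & (hu <= hi)))
                    for name, (lo, hi) in TISSUE_WINDOWS.items()}

        rng = np.random.default_rng(2)
        voxels = rng.normal(45, 60, size=10_000)   # synthetic mid-thigh slice
        print(composition(voxels))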

  19. A conversational system for the computer analysis of nucleic acid sequences.

    OpenAIRE

    Sege, R; Söll, D.; Ruddle, F H; Queen, C

    1981-01-01

    We present a conversational system for the computer analysis of nucleic acid and protein sequences based on the well-known Queen and Korn program (1). The system can be used by persons with only minimal knowledge of computers.

  20. Design Principles for Computer-Assisted Instruction in Histology Education: An Exploratory Study

    Science.gov (United States)

    Deniz, Hasan; Cakir, Hasan

    2006-01-01

    The purpose of this paper is to describe the development process and the key components of a computer-assisted histology material. Computer-assisted histology material is designed to supplement traditional histology education in a large Midwestern university. Usability information of the computer-assisted instruction (CAI) material was obtained…

  1. Analysis on Cloud Computing Information Security Problems and the Countermeasures

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Cloud computing is one of the most popular terms in the present IT industry, as well as one of its most prosperous technologies. This paper introduces the concept, principles and characteristics of cloud computing, analyzes the information security problems resulting from cloud computing, and puts forward corresponding solutions.

  2. Computational electromagnetic analysis of plasmonic effects in interdigital photodetectors

    Science.gov (United States)

    Hill, Avery M.; Nusir, Ahmad I.; Nguyen, Paul V.; Manasreh, Omar M.; Herzog, Joseph B.

    2014-09-01

    Plasmonic nanostructures have been shown to act as optical antennas that enhance optical devices. This study focuses on computational electromagnetic (CEM) analysis of GaAs photodetectors with gold interdigital electrodes. Experiments have shown that the photoresponse of these devices depends greatly on the electrode spacing and the polarization of the incident light: smaller electrode spacing and transverse polarization give rise to a larger photoresponse. This computational study simulates the optical properties of these devices to determine what plasmonic properties and optical enhancement they may have. The models solve Maxwell's equations with a finite element method (FEM) algorithm provided by the software COMSOL Multiphysics 4.4. The preliminary results gathered from the simulations follow the same trends seen in the experimental data: the spectral response increases as the electrode spacing decreases. The simulations also show that incident light with the electric field polarized transversely across the electrodes produces a larger photocurrent than longitudinal polarization; this dependency is similar to other plasmonic devices, and the simulation results compare well with the experimental data. This work will also model enhancement effects in nanostructure devices with dimensions smaller than those of the current samples, to lead the way for future nanoscale devices. Understanding the potential effects of decreased spacing opens the door to a new set of devices on a smaller scale, potentially with a higher level of enhancement. In addition, precise modeling and understanding of the effects of the parameters provide avenues to optimize the enhancement of these structures, making more efficient photodetectors. Similar structures could also potentially be used for enhanced photovoltaics.

  3. Genome Assembly and Computational Analysis Pipelines for Bacterial Pathogens

    KAUST Repository

    Rangkuti, Farania Gama Ardhina

    2011-06-01

    Pathogens lie behind the deadliest pandemics in history. To date, the AIDS pandemic has resulted in more than 25 million fatal cases, while tuberculosis and malaria annually claim more than 2 million lives. Comparative genomic analyses are needed to gain insights into the molecular mechanisms of pathogens, but the abundance of biological data dictates that such studies cannot be performed without the assistance of computational approaches. This explains the significant need for computational pipelines for genome assembly and analyses. The aim of this research is to develop such pipelines. This work utilizes various bioinformatics approaches to analyze the high-throughput genomic sequence data that has been obtained from several strains of bacterial pathogens. A pipeline has been compiled for quality control for sequencing and assembly, and several protocols have been developed to detect contaminations. Visualization has been generated of genomic data in various formats, in addition to alignment, homology detection and sequence variant detection. We have also implemented a metaheuristic algorithm that significantly improves bacterial genome assemblies compared to other known methods. Experiments on Mycobacterium tuberculosis H37Rv data showed that our method resulted in improvement of N50 value of up to 9697% while consistently maintaining high accuracy, covering around 98% of the published reference genome. Other improvement efforts were also implemented, consisting of iterative local assemblies and iterative correction of contiguated bases. Our result expedites the genomic analysis of virulent genes up to single base pair resolution. It is also applicable to virtually every pathogenic microorganism, propelling further research in the control of and protection from pathogen-associated diseases.
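
    The N50 metric cited above is simple to compute; a minimal sketch on toy contig lengths (real inputs would come from the assembler):

        def n50(contig_lengths):
            """N50: the length L such that contigs of length >= L cover at
            least half of the total assembly."""
            total = sum(contig_lengths)
            running = 0
            for length in sorted(contig_lengths, reverse=True):
                running += length
                if running * 2 >= total:
                    return length

        # Toy assembly; five contigs totalling 1500 bp.
        print(n50([100, 200, 300, 400, 500]))   # -> 400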

  4. Computer analysis and comparison of chess players' game-playing styles

    OpenAIRE

    Krevs, Urša

    2015-01-01

    Today's computer chess programs are very good at evaluating chess positions. Research has shown that we can rank chess players by the quality of their game play, using a computer chess program. In the master's thesis Computer analysis and comparison of chess players' game-playing styles, we focus on the content analysis of chess games using a computer chess program's evaluation and attributes we determined for each individual position. We defined meaningful attributes that can be used for com...

  5. Computer image analysis of toxic fatty degeneration in rat liver.

    Science.gov (United States)

    Stetkiewicz, J; Zieliński, K; Stetkiewicz, I; Koktysz, R

    1989-01-01

    Fatty degeneration of the liver is one of the most frequently observed pathological changes in the experimental estimation of the toxicity of chemical compounds. The intensity of this kind of damage is most often graded by means of a generally accepted scale of points, with the classification performed according to the subjective "feeling" of the pathologist. In modern pathological diagnostics, computer image analysis is used to perform an objective estimation of the degree of damage to various organs. In order to check the usefulness of this kind of method, comparative biochemical and morphometrical studies were undertaken in trichloroethylene (TRI)-induced fatty degeneration of the liver. TRI was administered to rats intragastrically, in single doses: 1/2, 1/3, 1/4, 1/6 and 1/18 DL50. Twenty-four hours after administration, the animals were sacrificed. The content of triglycerides in the liver was determined according to Folch et al. (1956). Simple lipids in the histochemical samples were detected by staining with the lipotropic dye Fat Red 7B. The area of fatty degeneration in the microscopic samples was estimated using an automatic image analyser, the IBAS 2000 (Kontron). The morphometrical data on the area of fatty degeneration in the liver showed a high degree of correlation with the content of triglycerides (r = 0.89) and with the dose of TRI (r = 0.96). The degree of correlation between the biochemical data and the dose of TRI was 0.88. The morphometrical studies performed have proved to be of great use in estimating the degree of fatty degeneration in the liver. This method enables precise, quantitative measurement of this sort of liver damage in material prepared for routine histopathological analysis. It requires, however, a specialized device for quantitative image analysis.
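
    A minimal sketch of the two steps described above, thresholding the stained area and correlating it with the biochemical and dose data; the image and the per-group numbers are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(3)
        image = rng.random((256, 256))              # stand-in for a stained section
        fat_fraction = float(np.mean(image > 0.8))  # pixels above the stain threshold
        print(f"fat area fraction: {fat_fraction:.3f}")

        # Hypothetical per-group results: TRI dose (fraction of DL50),
        # measured fat-area fraction, and liver triglycerides (mg/g).
        dose = np.array([1/18, 1/6, 1/4, 1/3, 1/2])
        area = np.array([0.02, 0.08, 0.13, 0.19, 0.28])
        tg = np.array([4.1, 9.5, 14.2, 19.8, 27.5])

        print(f"r(area, TG)   = {np.corrcoef(area, tg)[0, 1]:.2f}")
        print(f"r(area, dose) = {np.corrcoef(area, dose)[0, 1]:.2f}")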

  6. Computational Analysis of the Hypothalamic Control of Food Intake.

    Science.gov (United States)

    Tabe-Bordbar, Shayan; Anastasio, Thomas J

    2016-01-01

    Food-intake control is mediated by a heterogeneous network of different neural subtypes, distributed over various hypothalamic nuclei and other brain structures, in which each subtype can release more than one neurotransmitter or neurohormone. The complexity of the interactions of these subtypes poses a challenge to understanding their specific contributions to food-intake control, and apparent consistencies in the dataset can be contradicted by new findings. For example, the growing consensus that arcuate nucleus neurons expressing Agouti-related peptide (AgRP neurons) promote feeding, while those expressing pro-opiomelanocortin (POMC neurons) suppress feeding, is contradicted by findings that low AgRP neuron activity and high POMC neuron activity can be associated with high levels of food intake. Similarly, the growing consensus that GABAergic neurons in the lateral hypothalamus suppress feeding is contradicted by findings suggesting the opposite. Yet the complexity of the food-intake control network admits many different network behaviors. It is possible that anomalous associations between the responses of certain neural subtypes and feeding are actually consistent with known interactions, but their effect on feeding depends on the responses of the other neural subtypes in the network. We explored this possibility through computational analysis. We made a computer model of the interactions between the hypothalamic and other neural subtypes known to be involved in food-intake control, and optimized its parameters so that model behavior matched observed behavior over an extensive test battery. We then used specialized computational techniques to search the entire model state space, where each state represents a different configuration of the responses of the units (model neural subtypes) in the network. We found that the anomalous associations between the responses of certain hypothalamic neural subtypes and feeding are actually consistent with the known structure

  7. Engineering Graphics CAI Design Based on Delphi

    Institute of Scientific and Technical Information of China (English)

    蒋先刚; 钟化兰; 涂晓斌

    2001-01-01

    Introduces the design technologies and implementation methods of an engineering graphics courseware system developed in Delphi, focusing on the construction and software implementation of a descriptive geometry and engineering drawing CAI system. It also presents methods and techniques for using the Delphi TTreeView component to construct and manage the databases of the CAI system.

  8. Design of CAI Multimedia Courseware

    Institute of Scientific and Technical Information of China (English)

    任旻

    2012-01-01

    The paper mainly introduces the design and production of CAI multimedia courseware, including its basic content, interface (screen) design, audio, video, animation, and other elements.

  9. Cai Xiang's Calligraphy Dislocated by History

    Institute of Scientific and Technical Information of China (English)

    黄志强

    2012-01-01

    The "Shang Yi" (valuing expressive intent) trend of Song dynasty calligraphy took shape under the guidance of literary masters such as Ouyang Xiu, Cai Xiang and Su Shi. Cai Xiang's calligraphy and theory, as part of traditional Chinese calligraphic culture and art, played an important role in linking the past and the future. However, over a long historical period, scholars studying Cai Xiang's poetry and calligraphy have voiced dissent, some even holding that the "four great calligraphers of the Song" included Cai Jing rather than Cai Xiang, a fallacy from today's perspective. Drawing on historical records, this paper gives an overall review and evaluation of Cai Xiang's calligraphy from a contemporary viewpoint.

  10. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer, the ubiquitous portal of our work and personal lives. At this point, the computer is so common we hardly notice it in our view. It is difficult to envision that, not so long ago, it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  12. Computer-aided pulmonary image analysis in small animal models

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J. [Center for Infectious Disease Imaging (CIDI), Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Bagci, Ulas, E-mail: ulasbagci@gmail.com [Center for Research in Computer Vision (CRCV), University of Central Florida (UCF), Orlando, Florida 32816 (United States); Kramer-Marek, Gabriela [The Institute of Cancer Research, London SW7 3RP (United Kingdom); Luna, Brian [Microfluidic Laboratory Automation, University of California-Irvine, Irvine, California 92697-2715 (United States); Kubler, Andre [Department of Medicine, Imperial College London, London SW7 2AZ (United Kingdom); Dey, Bappaditya; Jain, Sanjay [Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Foster, Brent [Department of Biomedical Engineering, University of California-Davis, Davis, California 95817 (United States); Papadakis, Georgios Z. [Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Camp, Jeremy V. [Department of Microbiology and Immunology, University of Louisville, Louisville, Kentucky 40202 (United States); Jonsson, Colleen B. [National Institute for Mathematical and Biological Synthesis, University of Tennessee, Knoxville, Tennessee 37996 (United States); Bishai, William R. [Howard Hughes Medical Institute, Chevy Chase, Maryland 20815 and Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Udupa, Jayaram K. [Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

    2015-07-15

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.
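
    A minimal sketch of the pathology-flagging step described above: predict the expected lung volume from the approximated rib cage volume via a regression, then flag scans whose initial segmentation falls well short of it. The regression coefficients, tolerance and volumes are assumptions, not the paper's fitted values:

        def expected_lung_volume(rib_cage_volume, slope, intercept):
            """Expected total lung capacity from approximated rib cage volume,
            via the regression described above (coefficients are assumed)."""
            return slope * rib_cage_volume + intercept

        def severe_pathology(initial_seg_volume, rib_cage_volume,
                             slope=0.6, intercept=2.0, tolerance=0.25):
            """Flag a scan when the initial lung segmentation falls short of
            the regression-predicted volume by more than `tolerance`."""
            expected = expected_lung_volume(rib_cage_volume, slope, intercept)
            return (expected - initial_seg_volume) / expected > tolerance

        # Volumes in arbitrary units (e.g. mL), purely illustrative.
        print(severe_pathology(initial_seg_volume=21.0, rib_cage_volume=40.0))  # False
        print(severe_pathology(initial_seg_volume=10.0, rib_cage_volume=40.0))  # True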

  13. MMA, A Computer Code for Multi-Model Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.
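
    The model probabilities MMA derives from AIC-type criteria follow the standard Akaike-weight formula; a minimal sketch on hypothetical criterion values (the same arithmetic applies to AICc, BIC and KIC, though this is not MMA's code):

        import numpy as np

        def model_probabilities(criterion_values):
            """Model weights from an information criterion (AIC-style)."""
            ic = np.asarray(criterion_values, dtype=float)
            delta = ic - ic.min()              # differences to the best model
            weights = np.exp(-0.5 * delta)
            return weights / weights.sum()

        # Hypothetical AIC values for three calibrated alternative models.
        print(model_probabilities([230.4, 231.9, 238.0]))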

  14. Computed tomography-based finite element analysis to assess fracture risk and osteoporosis treatment

    OpenAIRE

    Imai, Kazuhiro

    2015-01-01

    Finite element analysis (FEA) is a computer technique for structural stress analysis developed in engineering mechanics. Over the past 40 years, FEA has been developed to investigate the structural behavior of human bones. As faster computers became available, better FEA using 3-dimensional computed tomography (CT) was developed. This CT-based finite element analysis (CT/FEA) has provided clinicians with useful data. In this review, the mechanism of CT/FEA, validation studies of CT/FEA to e...

  15. Automatic analysis of gamma spectra using a desk computer

    International Nuclear Information System (INIS)

    A code for the analysis of gamma spectra obtained with a Ge(Li) detector was developed for use with a desk computer (Hewlett-Packard Model 9810 A). The process is performed in a totally automatic way: the data are conveniently smoothed and the background is generated by a convolution equation. Calibration of the equipment with well-known standard sources gives the data needed to fit, by least squares, a third-degree equation relating energy to peak position. Criteria are given for determining whether a group of values constitutes a peak and whether it is a double line. All peaks are fitted to a Gaussian curve and, if necessary, decomposed into their components. Data entry is by punched tape, ASCII code. An alphanumeric printer provides (a) the position of the peak and its energy, (b) its resolution if it is larger than expected, and (c) the area of the peak with its statistical error determined by the method of Wasson. Optionally, the complete spectrum with the determined background can be plotted. (author)
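
    A minimal sketch of the energy calibration step described above, a third-degree least-squares fit of energy versus peak channel; the channel/energy pairs are invented for illustration:

        import numpy as np

        # Peak channel positions and known energies (keV) of standard
        # sources; illustrative values, not a real calibration.
        channels = np.array([240.1, 662.5, 1173.8, 1332.9, 2614.3])
        energies = np.array([121.8, 344.3, 604.7, 795.9, 1408.0])

        # Third-degree least-squares fit of energy vs. channel.
        coeffs = np.polyfit(channels, energies, deg=3)
        calibrate = np.poly1d(coeffs)

        print(f"E(channel 900) = {calibrate(900.0):.1f} keV")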

  16. Computational analysis on plug-in hybrid electric motorcycle chassis

    Science.gov (United States)

    Teoh, S. J.; Bakar, R. A.; Gan, L. M.

    2013-12-01

    Plug-in hybrid electric motorcycles (PHEM) are an alternative that promotes sustainability and lower emissions. However, the overall PHEM system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis applying the concept of a chopper is analysed for application to a PHEM. The chassis 3-dimensional (3D) model is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. Besides that, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement and factor of safety (FOS). The data are used to identify the critical points, thus indicating whether the chassis design is applicable or needs to be redesigned/modified to meet the required strength. Critical points are locations of highest stress, which might cause the chassis to fail; for a motorcycle chassis they occur at the joints at the triple tree and the rear absorber bracket. In conclusion, computational analysis predicts the stress distribution and provides a guideline to develop a safe prototype chassis.

  17. Computer-Assisted Qualitative Analysis

    Directory of Open Access Journals (Sweden)

    César A. Cisneros Puebla

    2003-01-01

    Full Text Available The aims of this article are twofold. On the one hand, it presents an approximation to the Hispano-American experience with Computer-Assisted Qualitative Data Analysis (CAQDAS), grouping, as a systematization exercise, the works carried out by several colleagues from related disciplines; although attempting to be exhaustive and thorough, as in any attempt at systematizing experiences, this exercise has clear gaps and omissions. On the other hand, its central objective is to introduce some theoretical reflections on the role played by CAQDAS in the development of qualitative research, building on that systematization, with a particular focus on data generation.

  18. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonate vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
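
    The quantity these methods estimate, the probability that load exceeds capacity for one failure mode, can be sketched with brute-force Monte Carlo; the distributions below are illustrative, and the advanced methods surveyed above (e.g. FORM/SORM, importance sampling) reach the same number far more cheaply:

        import numpy as np

        def failure_probability(n_samples=1_000_000, seed=0):
            """Monte Carlo estimate of P(load > capacity) for one failure mode."""
            rng = np.random.default_rng(seed)
            capacity = rng.normal(loc=100.0, scale=10.0, size=n_samples)  # strength
            load = rng.normal(loc=60.0, scale=15.0, size=n_samples)       # demand
            return np.mean(load > capacity)

        print(f"P_f ~ {failure_probability():.2e}")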

  19. Recent Developments in Complex Analysis and Computer Algebra

    CERN Document Server

    Kajiwara, Joji; Xu, Yongzhi

    1999-01-01

    This volume consists of papers presented in the special sessions on "Complex and Numerical Analysis", "Value Distribution Theory and Complex Domains", and "Use of Symbolic Computation in Mathematics Education" of the ISAAC'97 Congress held at the University of Delaware, during June 2-7, 1997. The ISAAC Congress coincided with a U.S.-Japan Seminar also held at the University of Delaware. The latter was supported by the National Science Foundation through Grant INT-9603029 and the Japan Society for the Promotion of Science through Grant MTCS-134. It was natural that the participants of both meetings should interact and consequently several persons attending the Congress also presented papers in the Seminar. The success of the ISAAC Congress and the U.S.-Japan Seminar has led to the ISAAC'99 Congress being held in Fukuoka, Japan during August 1999. Many of the same participants will return to this Seminar. Indeed, it appears that the spirit of the U.S.-Japan Seminar will be continued every second year as part of...

  20. Design of airborne wind turbine and computational fluid dynamics analysis

    Science.gov (United States)

    Anbreen, Faiqa

    Wind energy is a promising alternative to depleting non-renewable sources. The height of conventional wind turbines is a constraint on their efficiency. An airborne wind turbine can reach much higher altitudes and produce higher power due to the high wind velocity and energy density there. The focus of this thesis is to design a shrouded airborne wind turbine, capable of generating 70 kW to propel a leisure boat with a capacity of 8-10 passengers. The idea of designing an airborne turbine is to take advantage of the higher wind velocities in the atmosphere. The Solidworks model has been analyzed numerically using the Computational Fluid Dynamics (CFD) software StarCCM+. The Unsteady Reynolds-Averaged Navier-Stokes (URANS) simulation with the k-epsilon turbulence model has been selected to study the physical properties of the flow, with emphasis on the performance of the turbine and the increase in air velocity at the throat. The analysis has been done using two ambient velocities, 12 m/s and 6 m/s. At 12 m/s inlet velocity, the velocity of air at the turbine was recorded as 16 m/s and the power generated by the turbine is 61 kW. At an inlet velocity of 6 m/s, the velocity of air at the turbine increased to 10 m/s and the power generated by the turbine is 25 kW.
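
    For a rough sanity check of figures like these, the standard actuator-disc estimate P = 0.5·ρ·A·v³·Cp can be evaluated directly. The sketch below does so; the rotor diameter and power coefficient are assumed placeholders, since the abstract does not state them.

```python
import math

def wind_power_kw(velocity_ms, rotor_diameter_m, power_coeff=0.4, air_density=1.225):
    """Standard extractable-power estimate: P = 0.5 * rho * A * v^3 * Cp.

    Rotor diameter and Cp below are illustrative assumptions, not thesis data.
    """
    area = math.pi * (rotor_diameter_m / 2.0) ** 2
    return 0.5 * air_density * area * velocity_ms ** 3 * power_coeff / 1000.0

# Throat velocities reported for the 6 m/s and 12 m/s inlet cases:
for v in (10.0, 16.0):
    print(f"v = {v:4.1f} m/s -> P ~ {wind_power_kw(v, rotor_diameter_m=6.0):.0f} kW")
```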

  1. Reliability and safety analysis of redundant vehicle management computer system

    Institute of Scientific and Technical Information of China (English)

    Shi Jian; Meng Yixuan; Wang Shaoping; Bian Mengmeng; Yan Dungong

    2013-01-01

    Redundant techniques are widely adopted in vehicle management computers (VMCs) to ensure that the VMC has high reliability and safety. At the same time, they give the VMC special characteristics, e.g., failure correlation, event simultaneity, and failure self-recovery. Accordingly, reliability and safety analysis of a redundant VMC system (RVMCS) becomes more difficult. To address the difficulties of RVMCS reliability modeling, this paper adopts generalized stochastic Petri nets to establish the reliability and safety models of the RVMCS, and then analyzes RVMCS operating states and potential threats to the flight control system. It is verified by simulation that the reliability of the VMC is not the product of hardware reliability and software reliability, and that interactions between hardware and software faults can significantly reduce the real reliability of the VMC. Furthermore, failure-undetected states and false-alarm states inevitably exist in the RVMCS due to the limited fault-monitoring coverage and the false-alarm probability of the fault-monitoring devices (FMDs). An RVMCS operating in some failure-undetected states poses fatal threats to the safety of the flight control system, and an RVMCS operating in some false-alarm states has significantly reduced utility. The results obtained in this paper can guide the design of reliable VMCs and efficient FMDs, and the methods adopted here can also be used to analyze the reliability of other intelligent systems.
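
    The point about limited fault-monitoring coverage can be illustrated with a far simpler model than the paper's generalized stochastic Petri nets: a small continuous-time Markov chain of a duplex computer in which only a fraction c of faults is detected. The states, rates and coverage value below are illustrative assumptions, not the paper's model.

```python
import numpy as np
from scipy.linalg import expm

# States of a simplified duplex computer: 0 = both units up, 1 = failure
# detected and covered (running on spare), 2 = failure undetected, 3 = down.
# lam = unit failure rate, mu = repair rate, c = fault-monitoring coverage.
lam, mu, c = 1e-4, 1e-1, 0.95   # per hour; illustrative values only

Q = np.array([
    [-2 * lam,  2 * lam * c, 2 * lam * (1 - c), 0.0],
    [mu,       -(mu + lam),  0.0,               lam],
    [0.0,       0.0,        -lam,               lam],
    [0.0,       0.0,         0.0,               0.0],   # absorbing failure state
])

# Transient state probabilities after t hours: p(t) = p(0) @ expm(Q t).
p0 = np.array([1.0, 0.0, 0.0, 0.0])
for t in (100.0, 1000.0):
    p = p0 @ expm(Q * t)
    print(f"t={t:6.0f} h  P(undetected)={p[2]:.4f}  P(down)={p[3]:.4f}")
```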

  2. Analysis of activities for learning computer science unplugged

    OpenAIRE

    Zaviršek, Manca

    2013-01-01

    In this thesis I examine activities for learning computer science unplugged that are available on the Vidra website. Some activities are analyzed on the basis of the learning objectives of the Slovenian primary school curriculum for computer science and of the ACM K-12 Computer Science Curriculum. The main objective of the thesis is to estimate how well the activities match both curricula. Within the thesis I also analyze the goals of those activities in relation to the revised Bloom's taxonomy. By means...

  3. A citation analysis of top research papers of computer science

    OpenAIRE

    Hussain, Akhtar; Swain, Dillip-K.

    2011-01-01

    The study evaluates the top papers of Computer Science as reflected in Science Direct. Moreover, it aims to find out the authorship pattern, ranking of authors, ranking of country productivity, ranking of journals, and highly cited papers of Computer Science. The citation data have been collected from the quarterly list of the 25 hottest research articles in the subject field of Computer Science in the Science Direct database. In the present study, 20 issues of the alert service beginning fr...

  4. Computational analysis of irradiation facilities at the JSI TRIGA reactor.

    Science.gov (United States)

    Snoj, Luka; Zerovnik, Gašper; Trkov, Andrej

    2012-03-01

    Characterization and optimization of irradiation facilities in a research reactor is important for optimal performance. Nowadays this is commonly done with advanced Monte Carlo neutron transport computer codes such as MCNP. However, the computational model in such calculations should be verified and validated with experiments. In the paper we describe the irradiation facilities at the JSI TRIGA reactor and demonstrate their computational characterization to support experimental campaigns by providing information on the characteristics of the irradiation facilities. PMID:22154389

  5. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache - CEA, France)

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  6. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis; Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading. Computer Imaging Systems: Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading. Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis; Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read

  7. Computational modeling and impact analysis of textile composite structures

    Science.gov (United States)

    Hur, Hae-Kyu

    This study is devoted to the development of an integrated numerical model enabling one to investigate the static and dynamic behaviors and failures of 2-D textile composite as well as 3-D orthogonal woven composite structures weakened by cracks and subjected to static-, impact- and ballistic-type loads. As more complicated models of textile composite structures are introduced, homogenization schemes, geometrical modeling and crack propagation become more difficult problems to solve. To overcome these problems, this study presents effective mesh-generation schemes, homogenization modeling based on a repeating unit cell and sinusoidal functions, and also a cohesive element to study micro-crack shapes. The proposed research has two parts: (1) studying the behavior of textile composites under static loads, and (2) studying the dynamic responses of these textile composite structures subjected to transient/ballistic loading. In the first part, efficient homogenization schemes are suggested to show the influence of textile architectures on mechanical characteristics, considering the micro-modeling of the repeating unit cell. Furthermore, the structures of multi-layered or multi-phase composites combined with different laminae, such as a sub-laminate, are considered to find the mechanical characteristics. A simple progressive failure mechanism for the textile composites is also presented. In the second part, this study focuses on three main phenomena to solve the dynamic problems: micro-crack shapes, textile architectures and textile effective moduli. To obtain good solutions of the dynamic problems, this research attempts to use four approaches: (I) determination of governing equations via a three-level hierarchy: micro-mechanical unit cell analysis, layer-wise analysis accounting for transverse strains and stresses, and structural analysis based on anisotropic plate layers, (II) development of an efficient computational approach enabling one to perform transient

  8. Computer Models for IRIS Control System Transient Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gary D. Storrick; Bojan Petrovic; Luca Oriani

    2007-01-31

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled “Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor” focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design – such as the lack of a detailed secondary system or I&C system designs – makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I&C development process

  9. Computer Models for IRIS Control System Transient Analysis

    International Nuclear Information System (INIS)

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled 'Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor' focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design--such as the lack of a detailed secondary system or I and C system designs--makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I and C development process. Section

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  11. Analysis of high-tech methods of illegal remote computer data access

    OpenAIRE

    Polyakov, V. V.; Slobodyan, S. М.

    2007-01-01

    An analysis of high-tech methods of committing crimes in the sphere of computer information, carried out in practice from remote computers, is presented. The virtual traces left when such methods are used are identified, and specific proposals for investigating and preventing this type of computer intrusion are developed.

  12. The methods and computer structures for adaptive Fourier descriptive image analysis

    OpenAIRE

    V.Perzhu; A. Gurau

    1997-01-01

    New architectures for image-processing computer systems based on Fourier-descriptive (FD) analysis algorithms have been developed. A new method of organising computing processes on the basis of FD image features is proposed. The structures of two problem-oriented optical-electronic computer systems have been developed, and the time expenditures of these systems have been estimated.
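
    A minimal sketch of the Fourier-descriptive idea, in a generic textbook form rather than the paper's optical-electronic architecture: a closed boundary is treated as a complex signal, and its normalized FFT magnitudes serve as translation-, scale- and rotation-invariant shape features.

```python
import numpy as np

def fourier_descriptors(contour_xy, n_keep=16):
    """Translation/scale/rotation-invariant Fourier descriptors of a shape.

    contour_xy: (N, 2) boundary points of a closed shape, sampled in order
    (counter-clockwise assumed). A generic textbook formulation, not the
    paper's implementation.
    """
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # boundary as complex signal
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                 # drop DC term -> translation invariance
    mags = np.abs(coeffs)           # drop phase -> rotation/start-point invariance
    mags /= mags[1]                 # normalise by first harmonic -> scale invariance
    return mags[1:n_keep + 1]

# Example: for a unit circle, all harmonics beyond the first vanish.
t = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
circle = np.column_stack([np.cos(t), np.sin(t)])
print(np.round(fourier_descriptors(circle, n_keep=5), 4))   # [1. 0. 0. 0. 0.]
```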

  13. Computational Experiment for the Analysis of Functioning of Technological Process of Filtering of a Suspension

    Directory of Open Access Journals (Sweden)

    Gulnora Shermatova

    2012-02-01

    Full Text Available For the analysis of the functioning of the technological process of filtering a suspension, an adequate mathematical model is developed, a numerical algorithm is constructed, and computational experiments are carried out on a computer. The results of the computational experiments are illustrated as diagrams.

  14. MMA, A Computer Code for Multi-Model Analysis

    Science.gov (United States)

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will
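
    As an illustration of how such discrimination criteria turn into model probabilities, the sketch below computes Akaike weights from per-model AIC scores; this is a generic textbook computation, not MMA's own implementation, and the AIC values are hypothetical.

```python
import numpy as np

def akaike_weights(aic_values):
    """Convert AIC scores of competing calibrated models into relative weights.

    w_i = exp(-0.5 * (AIC_i - AIC_min)) / sum_j exp(-0.5 * (AIC_j - AIC_min))
    A generic textbook computation; the same form applies to AICc or BIC.
    """
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Three alternative calibrated models with hypothetical AIC scores:
weights = akaike_weights([214.2, 217.8, 213.5])
print(np.round(weights, 3))   # usable for ranking and model-averaged predictions
```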

  15. High throughput computing: a solution for scientific analysis

    Science.gov (United States)

    O'Donnell, M.

    2011-01-01

    Public land management agencies continually face resource management problems that are exacerbated by climate warming, land-use change, and other human activities. As the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) works with managers in U.S. Department of the Interior (DOI) agencies and other federal, state, and private entities, researchers are finding that the science needed to address these complex ecological questions across time and space produces substantial amounts of data. The additional data and the volume of computations needed to analyze it require expanded computing resources well beyond single- or even multiple-computer workstations. To meet this need for greater computational capacity, FORT investigated how to resolve the many computational shortfalls previously encountered when analyzing data for such projects. Our objectives included finding a solution that would:

  16. Interface design of VSOP'94 computer code for safety analysis

    International Nuclear Information System (INIS)

    Today, most software applications, including those in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system that simulates the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate the neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integral, estimation of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and the simulation of reactor safety. However, the existing VSOP is a conventional program, developed in Fortran 65, and presents several problems in use: it runs only on DEC Alpha mainframe platforms, provides text-based output, and is difficult to use, especially in data preparation and the interpretation of results. We developed GUI-VSOP, an interface program that facilitates the preparation of data, runs the VSOP code and reads the results in a more user-friendly way, and is usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based preprocessing interface aims to provide a convenient way of preparing data. The processing interface provides convenience in configuring input files and libraries and in compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in table and graphic forms. GUI-VSOP is expected to be useful in simplifying and speeding up the process and the analysis of safety aspects.

  17. Analysis of Drafting Effects in Swimming Using Computational Fluid Dynamics

    Science.gov (United States)

    Silva, António José; Rouboa, Abel; Moreira, António; Reis, Victor Machado; Alves, Francisco; Vilas-Boas, João Paulo; Marinho, Daniel Almeida

    2008-01-01

    The purpose of this study was to determine the effect of drafting distance on the drag coefficient in swimming. A k-epsilon turbulent model was implemented in the commercial code Fluent® and applied to the fluid flow around two swimmers in a drafting situation. Numerical simulations were conducted for various distances between swimmers (0.5-8.0 m) and swimming velocities (1.6-2.0 m.s-1). The drag coefficient (Cd) was computed for each of the distances and velocities. We found that the drag coefficient of the leading swimmer decreased as the flow velocity increased. The relative drag coefficient of the back swimmer was lowest (about 56% of the leading swimmer) at the smallest inter-swimmer distance (0.5 m). This value increased progressively until the distance between swimmers reached 6.0 m, where the relative drag coefficient of the back swimmer was about 84% of the leading swimmer. The results indicated that the Cd of the back swimmer was equal to that of the leading swimmer at distances ranging from 6.45 to 8.90 m. We conclude that these distances allow the swimmers to be in the same hydrodynamic conditions during training and competitions. Key points: the drag coefficient of the leading swimmer decreased as the flow velocity increased; the relative drag coefficient of the back swimmer was lowest (about 56% of the leading swimmer) at the smallest inter-swimmer distance (0.5 m); the drag coefficient values of both swimmers in drafting were equal at distances ranging between 6.45 m and 8.90 m, considering the different flow velocities; numerical simulation techniques could be a good approach to analyzing the fluid forces around objects in water, as happens in swimming. PMID:24150135
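
    The drag coefficient reported here follows from the standard definition Cd = 2F/(ρv²A) applied to the CFD drag force. A minimal sketch, with illustrative numbers rather than the paper's data:

```python
def drag_coefficient(drag_force_n, velocity_ms, frontal_area_m2, water_density=1000.0):
    """Cd = 2F / (rho * v^2 * A), the standard definition used to post-process
    CFD drag forces. The numbers below are illustrative, not the paper's data."""
    return 2.0 * drag_force_n / (water_density * velocity_ms ** 2 * frontal_area_m2)

cd_lead = drag_coefficient(drag_force_n=110.0, velocity_ms=1.8, frontal_area_m2=0.1)
cd_back = 0.56 * cd_lead   # relative value reported at 0.5 m spacing
print(f"lead Cd = {cd_lead:.3f}, drafting Cd = {cd_back:.3f}")
```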

  18. Computer assisted sound analysis of arteriovenous fistula in hemodialysis patients.

    Science.gov (United States)

    Malindretos, Pavlos; Liaskos, Christos; Bamidis, Panagiotis; Chryssogonidis, Ioannis; Lasaridis, Anastasios; Nikolaidis, Pavlos

    2014-02-01

    The purpose of this study was to reveal the unique sound characteristics of the bruit produced by arteriovenous fistulae (AVF), using a computerized method. An electronic stethoscope (20 Hz to 20 000 Hz sensitivity) was used, connected to a portable laptop computer. Forty prevalent hemodialysis patients participated in the study. All measurements were made with patients resting in the supine position, prior to the initiation of the mid-week dialysis session. The standard color Doppler technique was used to estimate blood flow. Clinical examination revealed the surface where the perceived bruit was most intense, and the recording took place there at a sample rate of 22 000 Hz in lossless WAV format. The Fast Fourier Transform (FFT) algorithm was used for the sound analysis. This algorithm is particularly useful in revealing the periodicity of sound data as well as in mapping its frequency behavior and its strength. The produced frequencies were divided into 40 frequency intervals, 250 Hz apart, so that the results would be easier to plot and comprehend. The mean age of the patients was 63.5 ± 14 years; the median time on dialysis was 39.6 months (min. 1 month, max. 200 months). The mean blood flow was 857.7 ± 448.3 ml/min. The mean sound frequency was approximately 5 500 Hz ± 4 000 Hz, and the median, which also expresses the major peak of the sound data, was 750 Hz, varying from 250 Hz to 10 000 Hz. A possible limitation of the study is the relatively small number of participants. PMID:24619890
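
    A minimal sketch of the described processing chain (read a WAV recording, take an FFT, locate the dominant frequency and bin the spectrum into 250 Hz intervals); the file name is hypothetical and the code is a generic reconstruction, not the authors' software.

```python
import numpy as np
from scipy.io import wavfile

# Hypothetical recording; the study sampled AVF bruits at 22 000 Hz in WAV format.
rate, samples = wavfile.read("avf_bruit.wav")
samples = samples.astype(float)
if samples.ndim > 1:
    samples = samples.mean(axis=1)        # mix stereo down to mono

spectrum = np.abs(np.fft.rfft(samples))   # magnitude spectrum
freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
print(f"dominant frequency: {freqs[np.argmax(spectrum)]:.0f} Hz")

# Group spectral energy into 250 Hz intervals, as in the paper's plots.
edges = np.arange(0.0, freqs.max() + 250.0, 250.0)
energy, _ = np.histogram(freqs, bins=edges, weights=spectrum ** 2)
```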

  19. Interface design of VSOP'94 computer code for safety analysis

    Science.gov (United States)

    Natsir, Khairina; Yazid, Putranto Ilham; Andiwijayakusuma, D.; Wahanani, Nursinta Adi

    2014-09-01

    Today, most software applications, including those in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system that simulates the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate the neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integral, estimation of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and the simulation of reactor safety. However, the existing VSOP is a conventional program, developed in Fortran 65, and presents several problems in use: it runs only on DEC Alpha mainframe platforms, provides text-based output, and is difficult to use, especially in data preparation and the interpretation of results. We developed GUI-VSOP, an interface program that facilitates the preparation of data, runs the VSOP code and reads the results in a more user-friendly way, and is usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based preprocessing interface aims to provide a convenient way of preparing data. The processing interface provides convenience in configuring input files and libraries and in compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in table and graphic forms. GUI-VSOP is expected to be useful in simplifying and speeding up the process and the analysis of safety aspects.

  20. Synthesis, spectral, computational and thermal analysis studies of metallocefotaxime antibiotics.

    Science.gov (United States)

    Masoud, Mamdouh S; Ali, Alaa E; Elasala, Gehan S

    2015-01-01

    Cefotaxime metal complexes of Cr(III), Mn(II), Fe(III), Co(II), Ni(II), Cu(II), Zn(II), Cd(II) and Hg(II), and two mixed-metal complexes of (Fe,Cu) and (Fe,Ni), were synthesized and characterized by elemental analysis, IR, electronic spectra, magnetic susceptibility and ESR spectra. The studies proved that cefotaxime may act as a mono-, bi-, tri- or tetra-dentate ligand through the oxygen atoms of the lactam carbonyl, carboxylic or amide carbonyl groups and the nitrogen atom of the thiazole ring. From the magnetic measurements and electronic spectral data, octahedral structures were proposed for all complexes. Quantum chemical methods have been applied to cefotaxime to calculate charges, bond lengths, bond angles, dihedral angles, electronegativity (χ), chemical potential (μ), global hardness (η), softness (σ) and the electrophilicity index (ω). The thermal decomposition of the prepared metal complexes was studied by TGA, DTA and DSC techniques. Thermogravimetric studies revealed the presence of lattice or coordinated water molecules in all the prepared complexes, and decomposition mechanisms were suggested. The thermal decomposition of the complexes ended with the formation of metal oxides and a carbon residue as the final product, except in the case of the Hg complex, where sublimation occurs in the temperature range 376.5-575.0 °C, so only a carbon residue was produced. The orders of the chemical reactions (n) were calculated via the peak-symmetry method, and the activation parameters were computed from the thermal decomposition data. The geometries of the complexes may convert from Oh to Td during the thermal decomposition steps.

  1. Computational identification and analysis of novel sugarcane microRNAs

    Directory of Open Access Journals (Sweden)

    Thiebaut Flávia

    2012-07-01

    Full Text Available Background: MicroRNA regulation of gene expression plays a key role in development and in responses to biotic and abiotic stresses. Deep sequencing analyses accelerate the process of small RNA discovery in many plants and expand our understanding of miRNA-regulated processes. We therefore undertook small RNA sequencing of sugarcane miRNAs in order to understand their complexity and to explore their role in sugarcane biology. Results: A bioinformatics search was carried out to discover novel miRNAs that can be regulated in sugarcane plants subjected to drought and salt stresses, and under pathogen infection. By means of the presence of miRNA precursors in the related sorghum genome, we identified 623 candidate new mature miRNAs in sugarcane. Of these, 44 were classified as high-confidence miRNAs. The biological function of the new miRNA candidates was assessed by analyzing their putative targets. The set of bona fide sugarcane miRNAs includes those likely targeting serine/threonine kinases, Myb and zinc finger proteins. Additionally, a MADS-box transcription factor and an RPP2B protein, which act in development and disease resistance processes, could be regulated by cleavage (21-nt species) and DNA methylation (24-nt species), respectively. Conclusions: A large-scale investigation of sRNA in sugarcane using a computational approach has identified a substantial number of new miRNAs and provides detailed genotype-tissue-culture miRNA expression profiles. Comparative analysis between monocots was valuable to clarify aspects of the conservation of miRNAs and their targets in a plant whose genome has not yet been sequenced. Our findings contribute to knowledge of miRNA roles in regulatory pathways in the complex, polyploid sugarcane genome.

  2. Computer assisted sound analysis of arteriovenous fistula in hemodialysis patients.

    Science.gov (United States)

    Malindretos, Pavlos; Liaskos, Christos; Bamidis, Panagiotis; Chryssogonidis, Ioannis; Lasaridis, Anastasios; Nikolaidis, Pavlos

    2014-02-01

    The purpose of this study was to reveal the unique sound characteristics of the bruit produced by arteriovenous fistulae (AVF), using a computerized method. An electronic stethoscope (20 Hz to 20 000 Hz sensitivity) was used, connected to a portable laptop computer. Forty prevalent hemodialysis patients participated in the study. All measurements were made with patients resting in the supine position, prior to the initiation of the mid-week dialysis session. The standard color Doppler technique was used to estimate blood flow. Clinical examination revealed the surface where the perceived bruit was most intense, and the recording took place there at a sample rate of 22 000 Hz in lossless WAV format. The Fast Fourier Transform (FFT) algorithm was used for the sound analysis. This algorithm is particularly useful in revealing the periodicity of sound data as well as in mapping its frequency behavior and its strength. The produced frequencies were divided into 40 frequency intervals, 250 Hz apart, so that the results would be easier to plot and comprehend. The mean age of the patients was 63.5 ± 14 years; the median time on dialysis was 39.6 months (min. 1 month, max. 200 months). The mean blood flow was 857.7 ± 448.3 ml/min. The mean sound frequency was approximately 5 500 Hz ± 4 000 Hz, and the median, which also expresses the major peak of the sound data, was 750 Hz, varying from 250 Hz to 10 000 Hz. A possible limitation of the study is the relatively small number of participants.

  3. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative manners, has gradually increased because of the effects of human errors on system safety. HRA needs a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analysts. This problem makes the results of the task analysis inconsistent and unreliable. To address it, KAERI developed the structural information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA with the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of CASIA. CASIA is expected to help HRA analysts perform the analysis more easily and consistently. As more analyses are performed and more data are accumulated in CASIA's database, HRA analysts will be able to freely share and disseminate their analysis experience, thereby improving the quality of HRA. 35 refs., 38 figs., 25 tabs. (Author)

  4. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  5. Analysis of Scheduling Algorithms in Grid Computing Environment

    Directory of Open Access Journals (Sweden)

    Farhad Soleimanian Gharehchopogh

    2013-11-01

    Full Text Available Grid computing is a distributed-computing technology that ties together computer networks with different, heterogeneous resources. Grid computing has no limitation in its geographical domain or in the type of underlying resources. Generally, a grid network can be considered as a collection of several big branches: different kinds of microprocessors and thousands of PCs and workstations all over the world. The goal of grid computing is to make available computing resources easily applicable to complicated calculations via geographically distributed sites; in other words, to support parallelism, minimize task completion times and keep costs low for many users in scientific, commercial and industrial contexts. To reach this goal, an efficient scheduling system is a vital part of a grid environment. Scheduling plays a very important role in grid networks, so the choice of scheduling algorithm is central to optimizing the response and waiting times, which are the two important factors. Providing scheduling algorithms that minimize task runtimes and increase throughput is therefore of remarkable importance. In this paper, we discuss scheduling algorithms for independent tasks, such as Minimum Execution Time, Minimum Completion Time, Min-min, Max-min and XSuffrage.
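
    Of the heuristics listed, Min-min is easy to state precisely: repeatedly assign the task whose best achievable completion time is smallest to the machine that achieves it. A minimal sketch of the textbook heuristic, with hypothetical task/machine timing data:

```python
def min_min_schedule(etc, n_machines):
    """Min-min heuristic for scheduling independent tasks.

    etc[t][m] is the estimated time to compute task t on machine m.
    Repeatedly assigns the task with the smallest achievable completion
    time to its best machine. A minimal sketch, not a production scheduler.
    """
    ready = [0.0] * n_machines                  # machine ready times
    unscheduled = set(range(len(etc)))
    assignment = {}
    while unscheduled:
        # For each task, find its best machine under current ready times.
        best = {
            t: min(range(n_machines), key=lambda m: ready[m] + etc[t][m])
            for t in unscheduled
        }
        # Pick the task whose best completion time is the overall minimum.
        t = min(unscheduled, key=lambda t: ready[best[t]] + etc[t][best[t]])
        m = best[t]
        ready[m] += etc[t][m]
        assignment[t] = m
        unscheduled.remove(t)
    return assignment, max(ready)               # schedule and makespan

etc = [[4.0, 6.0], [3.0, 5.0], [8.0, 2.0]]      # 3 tasks on 2 machines
print(min_min_schedule(etc, n_machines=2))      # ({2: 1, 1: 0, 0: 0}, 7.0)
```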

  6. Computer-Assisted Law Instruction: Clinical Education's Bionic Sibling

    Science.gov (United States)

    Henn, Harry G.; Platt, Robert C.

    1977-01-01

    Computer-assisted instruction (CAI), like clinical education, has considerable potential for legal training. As an initial Cornell Law School experiment, a lesson in applying different corporate statutory dividend formulations, with a cross-section of balance sheets and other financial data, was used to supplement regular class assignments.…

  7. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    Science.gov (United States)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie; Atlas Collaboration

    2014-06-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources become available to the HEP community. The new cloud technologies also come with new challenges, and one such challenge is the contextualization of computing resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), the upload of users' virtual machine images is not possible. This precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate the contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  8. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources become available to the HEP community. The new cloud technologies also come with new challenges, and one such challenge is the contextualization of computing resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), the upload of users' virtual machine images is not possible. This precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate the contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  9. A Cost-Benefit Analysis of a Campus Computing Grid

    OpenAIRE

    Smith, Preston M.

    2011-01-01

    Any major research institution has a substantial number of computer systems on its campus, often on the scale of tens of thousands. Given that a large amount of scientific computing is appropriate for execution in an opportunistic environment, a campus grid is an inexpensive way to build a powerful computational resource. What is missing, though, is a model for making an informed decision on the cost-effectiveness of a campus grid. In this thesis, the author describes a model for measuring the c...

  10. A computer program for planimetric analysis of digitized images

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O; Homøe, P

    1992-01-01

    Planimetrical measurements are made to calculate the area of an entity. By digitizing the entity, the planimetrical measurements may be done by computer. This computer program was developed in conjunction with a research project involving measurement of the pneumatized cell system of the temporal bones as seen on X-rays. By placing the X-rays on a digitizer tablet and tracing the outline of the cell system, the area was calculated by the program. The calculated data and traced images could be stored and printed. The program is written in BASIC; the necessary hardware is an IBM-compatible personal computer, a digitizer tablet and a printer.
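
    The planimetric computation such a program performs reduces to the shoelace formula for the area of a traced polygon. A sketch of that formula follows, in Python rather than the program's BASIC, with hypothetical digitizer points:

```python
def polygon_area(points):
    """Area of a closed traced outline from digitized (x, y) points, via the
    shoelace formula: A = 0.5 * |sum(x_i * y_{i+1} - x_{i+1} * y_i)|."""
    n = len(points)
    acc = 0.0
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]   # wrap around to close the outline
        acc += x0 * y1 - x1 * y0
    return abs(acc) / 2.0

# A traced 2 x 3 rectangle should give an area of 6 (digitizer units squared).
print(polygon_area([(0, 0), (2, 0), (2, 3), (0, 3)]))
```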

  11. Computer programs: Information retrieval and data analysis, a compilation

    Science.gov (United States)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  12. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis."

  13. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been lighter as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and on improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months. The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  15. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  17. mlegp: statistical analysis for computer models of biological systems using R

    OpenAIRE

    Dancik, Garrett M.; Dorman, Karin S

    2008-01-01

    Summary: Gaussian processes (GPs) are flexible statistical models commonly used for predicting output from complex computer codes. As such, GPs are well suited for the analysis of computer models of biological systems, which have been traditionally difficult to analyze due to their high-dimensional, non-linear and resource-intensive nature. We describe an R package, mlegp, that fits GPs to computer model outputs and performs sensitivity analysis to identify and characterize the effects of imp...

  18. Turing machines on represented sets, a model of computation for Analysis

    OpenAIRE

    Tavana, Nazanin; Weihrauch, Klaus

    2011-01-01

    We introduce a new type of generalized Turing machines (GTMs), which are intended as a tool for the mathematician who studies computability in Analysis. In a single tape cell a GTM can store a symbol, a real number, a continuous real function or a probability measure, for example. The model is based on TTE, the representation approach for computable analysis. As a main result we prove that the functions that are computable via given representations are closed under GTM programming. This gener...

  19. Computational analysis of irradiation facilities at the JSI TRIGA reactor

    Energy Technology Data Exchange (ETDEWEB)

    Snoj, Luka, E-mail: luka.snoj@ijs.si [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Zerovnik, Gasper, E-mail: gasper.zerovnik@ijs.si [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, Andrej, E-mail: andrej.trkov@ijs.si [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia)

    2012-03-15

    Characterization and optimization of irradiation facilities in a research reactor is important for optimal performance. Nowadays this is commonly done with advanced Monte Carlo neutron transport computer codes such as MCNP. However, the computational model in such calculations should be verified and validated with experiments. In the paper we describe the irradiation facilities at the JSI TRIGA reactor and demonstrate their computational characterization to support experimental campaigns by providing information on the characteristics of the irradiation facilities. - Highlights: ► TRIGA reactor at JSI suitable for irradiation under well defined conditions. ► It features irradiation channels of different fluxes, spectra, and dimensions. ► Computational model has been developed and experimentally verified. ► The model used for optimization of experiments and evaluation of uncertainties.

  20. Sensitivity Analysis and Error Control for Computational Aeroelasticity Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposal is the development of a next-generation computational aeroelasticity code, suitable for real-world complex geometries, and...

  1. Spreadsheet Analysis Of Queuing In A Computer Network

    Science.gov (United States)

    Galant, David C.

    1992-01-01

    Method of analyzing responses of computer network based on simple queuing-theory mathematical models via spreadsheet program. Effects of variations in traffic, capacities of channels, and message protocols assessed.
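
    The queuing-theory formulas such a spreadsheet encodes are closed-form; for a single channel modeled as an M/M/1 queue they are shown below. The traffic figures are illustrative assumptions:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Closed-form M/M/1 queue metrics of the kind a spreadsheet model encodes.

    rho = lambda/mu (utilization), L = rho/(1 - rho) (mean number in system),
    W = 1/(mu - lambda) (mean time in system). Parameters are illustrative.
    """
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = arrival_rate / service_rate
    return {
        "utilization": rho,
        "mean_in_system": rho / (1.0 - rho),
        "mean_time_in_system": 1.0 / (service_rate - arrival_rate),
    }

# A channel serving 120 msgs/s offered 90 msgs/s of traffic:
print(mm1_metrics(arrival_rate=90.0, service_rate=120.0))
```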

  2. An Analysis of Chinese Laws Against Computer Crimes

    OpenAIRE

    Hakman A. Wan; Ming-Te Lu

    1997-01-01

    An overview of the computer crime and related legislation in the People’s Republic of China is given. Relevant laws and their interpretation by Chinese legal scholars with respect to negligence, trade secrets, copyright and piracy, data protection and privacy are presented. The unique aspects of the Chinese legal system are accentuated. Due to the differences in the cultural, political, and legal environments, the PRC courts may treat some computer crimes more severely and may hand out pena...

  3. Multi-scale analysis of lung computed tomography images

    OpenAIRE

    Gori, I.; Bagagli, F.; Fantacci, M. E.; Martinez, A. Preite; Retico, A.; De Mitri, I.; Donadio, S.; Fulcheri, C.; Gargano, G; Magro, R.; Santoro, M; Stumbo, S

    2009-01-01

    A computer-aided detection (CAD) system for the identification of lung internal nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 project. The three modules of our lung CAD system, a segmentation algorithm for lung internal region identification, a multi-scale dot-enhancement filter for nodule candidate selection and a multi-scale neural technique for false positive finding reduction, are described. The results obtained on...

  4. Computational analysis of promoters and DNA-protein interactions

    OpenAIRE

    Tomovic, Andrija

    2009-01-01

    The investigation of promoter activity and DNA-protein interactions is very important for understanding many crucial cellular processes, including transcription, recombination and replication. Promoter activity and DNA-protein interactions can be studied in the lab (in vitro or in vivo) or using computational methods (in silico). Computational approaches for analysing promoters and DNA-protein interactions have become more powerful as more and more complete genome sequences, 3D...

  5. Analysis of user interfaces and interactions with computers

    OpenAIRE

    PERČIČ, JAN

    2016-01-01

    This diploma thesis studies the evolution of user interfaces and of human interaction with computers. It offers an overview of examples from history and mentions people who were important to user interface development. At the same time it examines current interaction principles and their potential evolution in the future. The goal was to define a potential ideal user interface, but because we use different types of computers in different situations, the conclusion was re...

  6. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

    I employed bibliometric and historical methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  7. CAPRI: A Geometric Foundation for Computational Analysis and Design

    Science.gov (United States)

    Haimes, Robert

    2006-01-01

    CAPRI is a software building tool-kit that refers to two ideas: (1) a simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. A complete definition of the geometry and application programming interface can be found in the document CAPRI: Computational Analysis PRogramming Interface appended to this report. In summary the interface is subdivided into the following functional components: 1. Utility routines -- These routines include the initialization of CAPRI, loading CAD parts and querying the operational status, as well as closing the system down. 2. Geometry data-base queries -- This group of functions allows all top-level applications to find and get detailed information on any geometric component in the Volume definition. 3. Point queries -- These calls allow grid generators, or solvers doing node adaptation, to snap points directly onto geometric entities. 4. Calculated or geometrically derived queries -- These entry points calculate data from the geometry to aid in grid generation. 5. Boundary data routines -- This part of CAPRI allows general data to be attached to Boundaries so that the boundary conditions can be specified and stored within CAPRI's data-base. 6. Tag based routines -- This part of the API allows the specification of properties associated with either the Volume (material properties) or Boundary (surface properties) entities. 7. Geometry based interpolation routines -- This part of the API facilitates multi-disciplinary coupling and allows zooming through Boundary Attachments. 8. Geometric creation and manipulation -- These calls facilitate constructing simple solid entities and performing the Boolean solid operations. Geometry constructed in this manner has the advantage that, if the data is kept consistent with the CAD package, a new design can be incorporated directly and is manufacturable. 9

  8. Cluster Computing For Real Time Seismic Array Analysis.

    Science.gov (United States)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years arrays have been widely used in different fields of seismological research. In particular they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying the volcanic microtremor and long-period events which are critical for getting information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the processing techniques, which are quite time consuming, have limited their potential for this application. In order to favor a direct application of array techniques to continuous volcano monitoring we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by a local seismic source. The cluster is composed of 8 Intel Pentium-III bi-processor PCs working at 550 MHz, and has 4 Gigabytes of RAM memory. It runs under the Linux operating system. The developed analysis software package is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data via the Internet and graphical applications for continuously displaying the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and the southeast flanks of this volcano. A real time continuous acquisition system has been simulated by
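
    The package itself is Fortran over MPI, but the core MUSIC step it parallelizes can be sketched in NumPy; the array geometry, wavelength and source count below are illustrative assumptions, not the Etna deployment:

        import numpy as np

        def music_spectrum(snapshots, positions, wavelength, n_sources, angles):
            """MUSIC pseudospectrum for a linear array; peaks mark source directions."""
            r = snapshots @ snapshots.conj().T / snapshots.shape[1]  # sample covariance
            _, vecs = np.linalg.eigh(r)                    # eigenvalues in ascending order
            noise = vecs[:, : len(positions) - n_sources]  # noise subspace
            spectrum = []
            for theta in angles:
                a = np.exp(2j * np.pi * positions * np.sin(theta) / wavelength)
                spectrum.append(1.0 / np.real(a.conj() @ noise @ noise.conj().T @ a))
            return np.array(spectrum)

        # Toy usage: 8 sensors spaced 25 m apart, 200 random complex snapshots.
        positions = np.arange(8) * 25.0
        snapshots = np.random.randn(8, 200) + 1j * np.random.randn(8, 200)
        angles = np.linspace(-1.2, 1.2, 121)
        print(music_spectrum(snapshots, positions, 500.0, 1, angles).argmax())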

  9. Nondestructive analysis of urinary calculi using micro computed tomography

    Directory of Open Access Journals (Sweden)

    Lingeman James E

    2004-12-01

    Abstract Background Micro computed tomography (micro CT) has been shown to provide exceptionally high quality imaging of the fine structural detail within urinary calculi. We tested the idea that micro CT might also be used to identify the mineral composition of urinary stones non-destructively. Methods Micro CT x-ray attenuation values were measured for mineral that was positively identified by infrared microspectroscopy (FT-IR). To do this, human urinary stones were sectioned with a diamond wire saw. The cut surface was explored by FT-IR and regions of pure mineral were evaluated by micro CT to correlate x-ray attenuation values with mineral content. Additionally, intact stones were imaged with micro CT to visualize internal morphology and map the distribution of specific mineral components in 3-D. Results Micro CT images taken just beneath the cut surface of urinary stones showed excellent resolution of structural detail that could be correlated with structure visible in the optical image mode of FT-IR. Regions of pure mineral were not difficult to find by FT-IR for most stones and such regions could be localized on micro CT images of the cut surface. This was not true, however, for two brushite stones tested; in these, brushite was closely intermixed with calcium oxalate. Micro CT x-ray attenuation values were collected for six minerals that could be found in regions that appeared to be pure, including uric acid (3515 – 4995 micro CT attenuation units, AU), struvite (7242 – 7969 AU), cystine (8619 – 9921 AU), calcium oxalate dihydrate (13815 – 15797 AU), calcium oxalate monohydrate (16297 – 18449 AU), and hydroxyapatite (21144 – 23121 AU). These AU values did not overlap. Analysis of intact stones showed excellent resolution of structural detail and could discriminate multiple mineral types within heterogeneous stones. Conclusions Micro CT gives excellent structural detail of urinary stones, and these results demonstrate the feasibility
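
    Because the reported attenuation ranges do not overlap, mineral identification reduces to a table lookup. A small sketch using the AU ranges quoted in the abstract (the lookup logic is mine, not the authors'):

        # Attenuation ranges (micro CT attenuation units, AU) from the abstract.
        AU_RANGES = {
            "uric acid": (3515, 4995),
            "struvite": (7242, 7969),
            "cystine": (8619, 9921),
            "calcium oxalate dihydrate": (13815, 15797),
            "calcium oxalate monohydrate": (16297, 18449),
            "hydroxyapatite": (21144, 23121),
        }

        def classify_mineral(au: float) -> str:
            """Map a measured AU value to a mineral; the reported ranges do not overlap."""
            for mineral, (lo, hi) in AU_RANGES.items():
                if lo <= au <= hi:
                    return mineral
            return "unclassified (outside the reported ranges)"

        print(classify_mineral(8700))  # -> cystine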

  10. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  11. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    Science.gov (United States)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  12. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science. Final semiannual report, 1 April-30 September 1986

    Energy Technology Data Exchange (ETDEWEB)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.

  13. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well established classical... linear methods. Different strategies for selecting projections (linear combinations) of multivariate images are presented. An exploratory, iterative method for finding interesting projections that originated in data analysis is compared to principal components. A method for introducing spatial context... structural images for heavy minerals based on irregularly sampled geochemical data. This methodology has proven useful in producing images that reflect real geological structures with potential application in mineral exploration. A method for removing laboratory-produced map-sheet patterns in spatial data...
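
    As a reference point for the projection-selection strategies the thesis compares, a minimal principal-component projection of a multivariate image, with synthetic data standing in for the geochemical images:

        import numpy as np

        def pca_project(image_bands: np.ndarray, n_components: int) -> np.ndarray:
            """Project an (n_bands x n_pixels) image onto its leading principal components."""
            centered = image_bands - image_bands.mean(axis=1, keepdims=True)
            cov = centered @ centered.T / (centered.shape[1] - 1)
            _, eigvecs = np.linalg.eigh(cov)            # eigenvalues ascending
            top = eigvecs[:, ::-1][:, :n_components]    # leading components first
            return top.T @ centered                     # component scores per pixel

        bands = np.random.rand(6, 128 * 128)             # synthetic 6-band image, flattened
        print(pca_project(bands, n_components=2).shape)  # (2, 16384)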

  14. Using Computer-Assisted Instruction to Enhance Achievement of English Language Learners

    Science.gov (United States)

    Keengwe, Jared; Hussein, Farhan

    2014-01-01

    Computer-assisted instruction (CAI) in English-language environments offers practice time, motivates students, enhances student learning, increases the authentic materials that students can study, and has the potential to encourage teamwork between students. The findings from this particular study suggested that students who used computer assisted…

  15. Effect of Computer-Based Video Games on Children: An Experimental Study

    Science.gov (United States)

    Chuang, Tsung-Yen; Chen, Wei-Fan

    2009-01-01

    This experimental study investigated whether computer-based video games facilitate children's cognitive learning. In comparison to traditional computer-assisted instruction (CAI), this study explored the impact of the varied types of instructional delivery strategies on children's learning achievement. One major research null hypothesis was…

  16. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
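
    A toy version of the cost-benefit experiment described above: calibrate on progressively larger random samples, timing the fit (cost) and measuring goodness-of-fit (benefit) to look for diminishing returns. A polynomial fit stands in for the report's probabilistic neural network, and the data are synthetic:

        import time
        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 1, 10_000)
        y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)  # synthetic pattern

        for n in (50, 200, 1000, 5000):
            idx = rng.choice(x.size, n, replace=False)      # random calibration sample
            t0 = time.perf_counter()
            coeffs = np.polyfit(x[idx], y[idx], deg=7)      # "calibration" cost
            cost = time.perf_counter() - t0
            sse = np.sum((np.polyval(coeffs, x) - y) ** 2)  # goodness-of-fit proxy
            print(f"n={n:5d}  cost={cost:.4f}s  SSE={sse:.1f}")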

  17. CAE - nuclear engineering analysis on work-station computers

    International Nuclear Information System (INIS)

    Emergence of the inexpensive and widely available 32-bit workstation computer is revolutionizing the scientific and engineering computing environment. These systems reach or exceed the threshold for many midscale nuclear applications and bridge the gap between the era of expensive computing and cheap people and the era of cheap computing and expensive people. Experience at the Idaho National Engineering Laboratory (INEL) has demonstrated the efficacy of this new computer technology. For the past 1 1/2 yr, a Hewlett-Packard 9000/540 32-bit multi-user microcomputer has been used to perform many calculations typical of a nuclear design effort. This system is similar, with respect to performance and memory, to such workstations as the SUN-3, HP-9000/32, or the Apollo DN-3000 that are available for under $20,000 for a fully configured single-user station. The system is being used for code development, model setup and checkout, and a full range of nuclear applications. Various one- and two-dimensional discrete ordinates transport codes are used on a routine basis. These include the well-known ANISN code as well as locally developed transport models. Typical one-dimensional multigroup calculations can be executed in clock times <10 min

  18. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    CERN Document Server

    Öhman, H; The ATLAS collaboration; Hendrix, V

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources become available to the HEP community. The new cloud technologies also bring new challenges, one of which is the contextualization of cloud resources with regard to the requirements of the user and their experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), uploading users' virtual machine images is not possible, which precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration, dynamic resource scaling, and a high degree of scalability.

  19. Computational Speed-Up of Complex Durability Analysis of Large-Scale Composite Structures

    Energy Technology Data Exchange (ETDEWEB)

    Storaasli, Olaf O [ORNL; Abdi, Frank [Alpha STAR Corporation; Dutton, Renly [Alpha Star Corporation, Long Beach CA; Cochran, Ernest J [ORNL

    2008-01-01

    The analysis of modern structures for aerospace, infrastructure, and automotive engineering applications necessitates the use of larger and larger computational models for accurate prediction of structural response. The ever-increasing size of computational structural mechanics simulations imposes a pressing need for commensurate increases in computational speed to keep costs and computation times in check. Innovative methods are needed to expedite the numerical analysis of complex structures while minimizing computational costs. The need for these methodologies is even more critical when performing durability and damage tolerance evaluation, as the computation is repeated a number of times for various loading conditions. This paper describes a breakthrough in efficient and accurate predictive methodologies that are amenable to the analysis of progressive failure, reliability, and optimization of large-scale composite structures or structural components.

  20. A qualitative model for computer-assisted instruction in cardiology.

    OpenAIRE

    Julen, N.; Siregar, P.; Sinteff, J. P.; Le Beux, P.

    1998-01-01

    CARDIOLAB is an interactive computational framework dedicated to teaching and computer-aided diagnosis in cardiology. The framework embodies models that simulate the heart's electrical activity. They constitute the core of a Computer-Assisted Instruction (CAI) program intended to teach, in a multimedia environment, the concepts underlying rhythmic disorders and cardiac diseases. The framework includes a qualitative model (QM) which is described in this paper. During simulation using QM, dynam...

  1. Low-frequency computational electromagnetics for antenna analysis

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K. (Los Alamos National Lab., NM (USA)); Burke, G.J. (Lawrence Livermore National Lab., CA (USA))

    1991-01-01

    An overview of low-frequency computational methods for modeling the electromagnetic characteristics of antennas is presented here. The article presents a brief analytical background and summarizes the essential ingredients of the method of moments for numerically solving low-frequency antenna problems. Some extensions to the basic models of perfectly conducting objects in free space are also summarized, followed by a consideration of some of the computational issues that affect model accuracy, efficiency and utility. A variety of representative computations are then presented to illustrate various modeling aspects and capabilities that are currently available. A fairly extensive bibliography is included to suggest further reference material to the reader. 90 refs., 27 figs.

  2. An Information Theoretic Analysis of Decision in Computer Chess

    CERN Document Server

    Godescu, Alexandru

    2011-01-01

    The basis of the method proposed in this article is the idea that information is one of the most important factors in strategic decisions, including decisions in computer chess and other strategy games. The model proposed in this article and the algorithm described are based on the idea of an information-theoretic basis of decision in strategy games. The model generalizes and provides a mathematical justification for one of the most popular search algorithms used in leading computer chess programs, the fractional ply scheme. However, despite its success in leading computer chess applications, until now little has been published about this method. The article creates a fundamental basis for this method in the axioms of information theory, then derives the principles used in programming the search and describes mathematically the form of the coefficients. One of the most important parameters of the fractional ply search is derived from fundamental principles. Until now this coefficient has usually been handcrafted...
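
    A hedged sketch of the fractional-ply idea the article formalizes: each move consumes a fraction of a ply, so lines through positions with few replies (more forcing, hence more informative) are searched deeper. The toy game tree and the logarithmic cost function are illustrative assumptions, not the article's derived coefficients:

        import math
        import random

        class Node:
            """Toy random game tree standing in for a chess position."""
            def __init__(self, seed, depth=0):
                self.rng = random.Random(seed)
                self.depth = depth
            def legal_moves(self):
                return range(self.rng.randint(2, 4)) if self.depth < 6 else range(0)
            def play(self, move):
                return Node(self.rng.randrange(10**9) + move, self.depth + 1)
            def evaluate(self):
                return self.rng.uniform(-1.0, 1.0)

        def ply_cost(n_moves, branching_norm=35.0):
            # Fewer legal replies -> cheaper extension, so forcing lines go deeper.
            return math.log(max(n_moves, 2)) / math.log(branching_norm)

        def search(node, depth_left):
            moves = list(node.legal_moves())
            if depth_left <= 0 or not moves:
                return node.evaluate()
            return max(-search(node.play(m), depth_left - ply_cost(len(moves)))
                       for m in moves)

        print(search(Node(seed=1), depth_left=2.0))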

  3. Verification of structural analysis computer codes in nuclear engineering

    International Nuclear Information System (INIS)

    Sources of potential errors that can arise during the use of computer programs based on the finite element method are described in the paper. The magnitude of errors was defined as acceptance criteria for those programs. Error sources are described as they are treated by the National Agency for Finite Element Methods and Standards (NAFEMS). Specific verification examples are taken from the literature of the Nuclear Regulatory Commission (NRC). An example verification is made on the PAFEC-FE computer code for seismic response analyses of piping systems by the response spectrum method. (author)

  4. Multi-scale analysis of lung computed tomography images

    CERN Document Server

    Gori, I; Fantacci, M E; Martinez, A Preite; Retico, A; De Mitri, I; Donadio, S; Fulcheri, C; Gargano, G; Magro, R; Santoro, M; Stumbo, S; 10.1088/1748-0221/2/09/P09007

    2009-01-01

    A computer-aided detection (CAD) system for the identification of lung internal nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 project. The three modules of our lung CAD system, a segmentation algorithm for lung internal region identification, a multi-scale dot-enhancement filter for nodule candidate selection and a multi-scale neural technique for false positive finding reduction, are described. The results obtained on a dataset of low-dose and thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.

  5. Multi-scale analysis of lung computed tomography images

    CERN Document Server

    Gori, I; Fantacci, M E; Preite Martinez, A; Retico, A; De Mitri, I; Donadio, S; Fulcheri, C

    2007-01-01

    A computer-aided detection (CAD) system for the identification of lung internal nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 project. The three modules of our lung CAD system, a segmentation algorithm for lung internal region identification, a multi-scale dot-enhancement filter for nodule candidate selection and a multi-scale neural technique for false positive finding reduction, are described. The results obtained on a dataset of low-dose and thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.

  6. Computational surprisal analysis speeds-up genomic characterization of cancer processes.

    Directory of Open Access Journals (Sweden)

    Nataly Kravchenko-Balasha

    Surprisal analysis is increasingly being applied for the examination of transcription levels in cellular processes, towards revealing inner network structures and predicting response. But to achieve its full potential, surprisal analysis should be integrated into a wider range of computational tools. The purposes of this paper are to combine surprisal analysis with other important computation procedures, such as easy manipulation of the analysis results (e.g. to choose desirable result sub-sets for further inspection), retrieval and comparison with relevant datasets from public databases, and flexible graphical displays for heuristic thinking. The whole set of computation procedures integrated into a single practical tool is what we call Computational Surprisal Analysis. This combined kind of analysis should facilitate significantly quantitative understanding of different cellular processes for researchers, including applications in proteomics and metabolomics. Beyond that, our vision is that Computational Surprisal Analysis has the potential to reach the status of a routine method of analysis for practitioners. The resolving power of Computational Surprisal Analysis is here demonstrated by its application to a variety of cellular cancer process transcription datasets, ours and from the literature. The results provide a compact biological picture of the thermodynamic significance of the leading gene expression phenotypes in every stage of the disease. For each transcript we characterize both its inherent steady state weight, its correlation with the other transcripts and its variation due to the disease. We present a dedicated website to facilitate the analysis for researchers and practitioners.
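
    For orientation only: the elementary quantity behind the method is the surprisal -ln p of observing a transcript with probability p; the full analysis then decomposes deviations from a steady state, which is beyond this sketch. The expression counts below are invented:

        import numpy as np

        counts = np.array([120.0, 30.0, 450.0, 5.0])  # invented transcript read counts
        p = counts / counts.sum()                     # observed probabilities
        surprisal = -np.log(p)                        # in nats; rare -> surprising
        for c, s in zip(counts, surprisal):
            print(f"count={c:6.0f}  surprisal={s:.2f} nats")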

  7. Computer-Aided Model Based Analysis for Design and Operation of a Copolymerization Process

    DEFF Research Database (Denmark)

    Lopez-Arenas, Maria Teresa; Sales-Cruz, Alfonso Mauricio; Gani, Rafiqul

    2006-01-01

    The advances in computer science and computational algorithms for process modelling, process simulation, numerical methods and design/synthesis algorithms make it advantageous and helpful to employ computer-aided modelling systems and tools for integrated process analysis. This is illustrated... This will allow analysis of the process behaviour, contribute to a better understanding of the polymerization process, help to avoid unsafe conditions of operation, and to develop operational and optimizing control strategies. In this work, through the computer-aided modelling system ICAS-MoT, two first..., the process design and conditions of operation on the polymer grade and the production rate.

  8. Sensitivity analysis of airport noise using computer simulation

    Directory of Open Access Journals (Sweden)

    Flavio Maldonado Bentes

    2011-09-01

    This paper presents a method for analyzing the sensitivity of airport noise using computer simulation with the aid of the Integrated Noise Model 7.0. The technique serves to support the selection of alternatives to better control aircraft noise, since it helps identify which areas of the noise curves experienced greater variation from changes in aircraft movements at a particular airport.

  9. An Introduction to Computer Assisted Analysis in the Biological Sciences.

    Science.gov (United States)

    Banaugh, R. P.

    This set of notes is designed to introduce the student to the development and use of computer-based models and to the analysis of quantitative phenomena in the life sciences. Only the BASIC programming language is used. The ten chapter titles are: The Growth of a Single Species; The Association of Two Species; Parameter Determination; Automated Parameter…

  10. Comparative Analysis of Palm and Wearable Computers for Participatory Simulations

    Science.gov (United States)

    Klopfer, Eric; Yoon, Susan; Rivas, Luz

    2004-01-01

    Recent educational computer-based technologies have offered promising lines of research that promote social constructivist learning goals, develop skills required to operate in a knowledge-based economy (Roschelle et al. 2000), and enable more authentic science-like problem-solving. In our research programme, we have been interested in combining…

  11. Introduction to Numerical Computation - analysis and Matlab illustrations

    DEFF Research Database (Denmark)

    Elden, Lars; Wittmeyer-Koch, Linde; Nielsen, Hans Bruun

    In a modern programming environment like e.g. MATLAB it is possible by simple commands to perform advanced calculations on a personal computer. In order to use such a powerful tool efficiently it is necessary to have an overview of available numerical methods and algorithms and to know about... are illustrated by examples in MATLAB.

  12. Equilibrium analysis of the efficiency of an autonomous molecular computer

    Science.gov (United States)

    Rose, John A.; Deaton, Russell J.; Hagiya, Masami; Suyama, Akira

    2002-02-01

    In the whiplash polymerase chain reaction (WPCR), autonomous molecular computation is implemented in vitro by the recursive, self-directed polymerase extension of a mixture of DNA hairpins. Although computational efficiency is known to be reduced by a tendency for DNAs to self-inhibit by backhybridization, both the magnitude of this effect and its dependence on the reaction conditions have remained open questions. In this paper, the impact of backhybridization on WPCR efficiency is addressed by modeling the recursive extension of each strand as a Markov chain. The extension efficiency per effective polymerase-DNA encounter is then estimated within the framework of a statistical thermodynamic model. Model predictions are shown to provide close agreement with the premature halting of computation reported in a recent in vitro WPCR implementation, a particularly significant result, given that backhybridization had been discounted as the dominant error process. The scaling behavior further indicates completion times to be sufficiently long to render WPCR-based massive parallelism infeasible. A modified architecture, PNA-mediated WPCR (PWPCR) is then proposed in which the occupancy of backhybridized hairpins is reduced by targeted PNA2/DNA triplex formation. The efficiency of PWPCR is discussed using a modified form of the model developed for WPCR. Predictions indicate the PWPCR efficiency is sufficient to allow the implementation of autonomous molecular computation on a massive scale.
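
    The modeling step can be pictured with a three-state Markov chain per strand (free to extend, trapped by backhybridization, extended this round); iterating the transition matrix gives the expected completion fraction. The probabilities below are invented placeholders, not the paper's fitted thermodynamic values:

        import numpy as np

        # States: 0 = free to extend, 1 = backhybridized, 2 = extended (absorbing).
        P = np.array([
            [0.10, 0.60, 0.30],   # free -> {free, trapped, extended}
            [0.05, 0.95, 0.00],   # trapped strands rarely recover
            [0.00, 0.00, 1.00],   # extended stays extended
        ])

        state = np.array([1.0, 0.0, 0.0])  # all strands start free
        for step in range(1, 6):
            state = state @ P
            print(f"after {step} polymerase encounters: extended fraction = {state[2]:.3f}")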

  13. Computational Analysis of Solvent Effects in NMR Spectroscopy.

    Science.gov (United States)

    Dračínský, Martin; Bouř, Petr

    2010-01-12

    Solvent modeling became a standard part of first principles computations of molecular properties. However, a universal solvent approach is particularly difficult for the nuclear magnetic resonance (NMR) shielding and spin-spin coupling constants that in part result from collective delocalized properties of the solute and the environment. In this work, bulk and specific solvent effects are discussed on experimental and theoretical model systems comprising solvated alanine zwitterion and chloroform molecules. Density functional theory computations performed on larger clusters indicate that standard dielectric continuum solvent models may not be sufficiently accurate. In some cases, more reasonable NMR parameters were obtained by approximation of the solvent with partial atomic charges. Combined cluster/continuum models yielded the most reasonable values of the spectroscopic parameters, provided that they are dynamically averaged. The roles of solvent polarizability, solvent shell structure, and bulk permeability were investigated. NMR shielding values caused by the macroscopic solvent magnetizability exhibited the slowest convergence with respect to the cluster size. For practical computations, however, inclusion of the first solvation sphere provided satisfactory corrections of the vacuum values. The simulations of chloroform chemical shifts and CH J-coupling constants were found to be very sensitive to the molecular dynamics model used to generate the cluster geometries. The results show that computationally efficient solvent modeling is possible and can reveal fine details of molecular structure, solvation, and dynamics. PMID:26614339

  14. A Placement Test for Computer Science: Design, Implementation, and Analysis

    Science.gov (United States)

    Nugent, Gwen; Soh, Leen-Kiat; Samal, Ashok; Lang, Jeff

    2006-01-01

    An introductory CS1 course presents problems for educators and students due to students' diverse background in programming knowledge and exposure. Students who enroll in CS1 also have different expectations and motivations. Prompted by the curricular guidelines for undergraduate programmes in computer science released in 2001 by the ACM/IEEE, and…

  15. An Analysis of Attitudes toward Computer Networks and Internet Addiction.

    Science.gov (United States)

    Tsai, Chin-Chung; Lin, Sunny S. J.

    The purpose of this study was to explore the interplay between young people's attitudes toward computer networks and Internet addiction. After analyzing questionnaire responses of an initial sample of 615 Taiwanese high school students, 78 subjects, viewed as possible Internet addicts, were selected for further explorations. It was found that…

  16. Computer program performs statistical analysis for random processes

    Science.gov (United States)

    Newberry, M. H.

    1966-01-01

    Random Vibration Analysis Program /RAVAN/ performs statistical analysis on a number of phenomena associated with flight and captive tests, but can also be used in analyzing data from many other random processes.

  17. Ultrastructural Analysis of Urinary Stones by Microfocus Computed Tomography and Comparison with Chemical Analysis

    Directory of Open Access Journals (Sweden)

    Tolga Karakan

    2016-06-01

    Objective: To investigate the ultrastructure of urinary system stones using micro-focus computed tomography (MCT), which performs non-destructive analysis, and to compare it with wet chemical analysis. Methods: This study was carried out at the Ankara Training and Research Hospital. Renal stones removed from 30 patients during percutaneous nephrolithotomy (PNL) surgery were included in the study. The stones were blindly evaluated by the specialists with MCT and chemical analysis. Results: The comparison of the stone components between chemical analysis and MCT showed that the rate of consistence was very low (p<0.05). It was also seen that there was no significant relation to its 3D structure being heterogeneous or homogeneous. Conclusion: Stone analysis with MCT is a time consuming and costly method. This method is useful for understanding the mechanisms of stone formation and is an important guide for developing future treatment modalities.

  18. Computational analysis of hypersonic shock wave/wall jet interaction

    Science.gov (United States)

    Mekkes, Gregory L.

    1993-01-01

    The purpose of this study is to investigate the CFD code General Aerodynamic Simulation Program (GASP) for application to a specific scramjet combustor phenomenon, that of an adverse pressure gradient caused by an oblique shock wave impinging upon a wall cooling film. The basis of this investigation is data available from an existing experimental study, which includes wall pressure, wall heat transfer, and schlieren photographs. This experimental study was conducted at a nominal Mach number of 6.0 in the Calspan 48-inch shock tunnel. The particular case of interest generates flow separation at the shock impingement point. Two algebraic turbulence models, the Baldwin-Lomax model and the Goldberg model, are considered for this computational study. Resultant computational wall pressure and heat transfer for both turbulence models are compared with experimental data. The Goldberg turbulence model provides a more accurate prediction of the recirculation region, and as a result, a better comparison with the experimental data.

  19. Analysis of Reversible Simulation of Irreversible Computation by Pebble Games

    CERN Document Server

    Li, Maozhen; Vitanyi, P; Li, Ming; Tromp, John; Vitanyi, Paul

    1998-01-01

    Reversible simulation of irreversible algorithms is analyzed in the stylized form of a 'reversible' pebble game. While such simulations incur little overhead in additional computation time, they use a large amount of additional memory space during the computation. The reachable reversible simulation instantaneous descriptions (pebble configurations) are characterized completely. As a corollary we obtain the reversible simulation by Bennett and show that, among all simulations that can be modelled by the pebble game, Bennett's simulation is optimal in that it uses the least auxiliary space for the greatest number of simulated steps. One can reduce the auxiliary storage overhead incurred by the reversible simulation at the cost of allowing limited erasing, leading to an irreversibility-space tradeoff. We show that in this resource-bounded setting the limited erasing needs to be performed at precise instants during the simulation. We show that the reversible simulation can be modified so that it is applicable also whe...
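
    Bennett's strategy has a compact recursive form: to advance 2^k segments, recurse on the first half, recurse on the second half, then replay the first half to erase its pebble. A counting sketch (not the paper's full characterization) exhibits the roughly 3^k time and k-pebble space behavior:

        def bennett(k, stats=None):
            """Count moves and peak pebbles for simulating 2**k computation segments."""
            if stats is None:
                stats = {"moves": 0, "pebbles": 0, "peak": 0}
            if k == 0:
                stats["moves"] += 1                  # advance one segment
                return stats
            bennett(k - 1, stats)                    # pebble the midpoint
            stats["pebbles"] += 1
            stats["peak"] = max(stats["peak"], stats["pebbles"])
            bennett(k - 1, stats)                    # pebble the endpoint
            bennett(k - 1, stats)                    # replay first half to unpebble
            stats["pebbles"] -= 1
            return stats

        for k in range(1, 11):
            s = bennett(k)
            print(f"2^{k} segments: {s['moves']} moves, peak pebbles {s['peak']}")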

  20. Computational design and analysis of flatback airfoil wind tunnel experiment.

    Energy Technology Data Exchange (ETDEWEB)

    Mayda, Edward A. (University of California, Davis, CA); van Dam, C.P. (University of California, Davis, CA); Chao, David D. (University of California, Davis, CA); Berg, Dale E.

    2008-03-01

    A computational fluid dynamics study of thick wind turbine section shapes in the test section of the UC Davis wind tunnel at a chord Reynolds number of one million is presented. The goals of this study are to validate standard wind tunnel wall corrections for high solid blockage conditions and to reaffirm the favorable effect of a blunt trailing edge or flatback on the performance characteristics of a representative thick airfoil shape prior to building the wind tunnel models and conducting the experiment. The numerical simulations prove the standard wind tunnel corrections to be largely valid for the proposed test of 40% maximum thickness to chord ratio airfoils at a solid blockage ratio of 10%. Comparison of the computed lift characteristics of a sharp trailing edge baseline airfoil and derived flatback airfoils reaffirms the earlier observed trend of reduced sensitivity to surface contamination with increasing trailing edge thickness.

  1. Computational solutions to large-scale data management and analysis.

    Science.gov (United States)

    Schadt, Eric E; Linderman, Michael D; Sorenson, Jon; Lee, Lawrence; Nolan, Garry P

    2010-09-01

    Today we can generate hundreds of gigabases of DNA and RNA sequencing data in a week for less than US$5,000. The astonishing rate of data generation by these low-cost, high-throughput technologies in genomics is being matched by that of other technologies, such as real-time imaging and mass spectrometry-based flow cytometry. Success in the life sciences will depend on our ability to properly interpret the large-scale, high-dimensional data sets that are generated by these technologies, which in turn requires us to adopt advances in informatics. Here we discuss how we can master the different types of computational environments that exist - such as cloud and heterogeneous computing - to successfully tackle our big data problems.

  2. Computer-aided analysis of CCD linear image sensors

    Science.gov (United States)

    Prince, S. S.

    1976-01-01

    Special test equipment and techniques to collect and process image information from charge coupled devices (CCDs) by digital computer were reviewed. The video channel was traced from the CCD to the direct memory access bus of the Interdata Computer. Software was developed to evaluate and characterize a CCD for (1) dark signal versus temperature relationship, (2) calculation of temporal noise magnitude and noise shape for each pixel, (3) spatial noise into the video chain due to dark signal, (4) response versus illumination relationship (gamma), (5) response versus wavelength of illumination (spectral), (6) optimization of forcing functions, and (7) evaluation of an image viewed by a CCD. The basic software differences and specific examples of each program operating on real data are presented.

  3. COMCAN: a computer program for common cause analysis

    Energy Technology Data Exchange (ETDEWEB)

    Burdick, G.R.; Marshall, N.H.; Wilson, J.R.

    1976-05-01

    The computer program, COMCAN, searches the fault tree minimal cut sets for shared susceptibility to various secondary events (common causes) and common links between components. In the case of common causes, a location check may also be performed by COMCAN to determine whether barriers to the common cause exist between components. The program can locate common manufacturers of components having events in the same minimal cut set. A relative ranking scheme for secondary event susceptibility is included in the program.
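
    The kind of search COMCAN automates can be pictured with simple set intersections over minimal cut sets; a sketch with invented component data (the actual program also checks barriers and manufacturer links):

        # Invented illustration data: susceptibilities and locations per component.
        susceptibility = {
            "pump_a": {"fire", "vibration"},
            "pump_b": {"fire", "flood"},
            "valve_c": {"fire"},
        }
        location = {"pump_a": "room1", "pump_b": "room1", "valve_c": "room2"}

        minimal_cut_sets = [{"pump_a", "pump_b"}, {"pump_a", "valve_c"}]

        for cut_set in minimal_cut_sets:
            shared = set.intersection(*(susceptibility[c] for c in cut_set))
            co_located = len({location[c] for c in cut_set}) == 1
            if shared:
                print(f"{sorted(cut_set)}: common cause(s) {sorted(shared)}, "
                      f"co-located={co_located}")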

  4. Marketing analysis of computer game World of Warcraft

    OpenAIRE

    Lopour, Miroslav

    2009-01-01

    This thesis describes the development of marketing, analyses the sources of change in marketing practices and tools, and illustrates new theories with examples from real life. To understand these methods and issues it is necessary to outline brand building and the current profile of a customer. In the second part, all these analyses, practices and methods are demonstrated on the computer game World of Warcraft, which represents a wholly digital and virtual product with more than 11 million subscribers. Th...

  5. The role of computer networks in remote sensing data analysis

    Science.gov (United States)

    Swain, P. H.; Phillips, T. L.; Lindenlaub, J. C.

    1973-01-01

    It has been hypothesized that computer networks can be used to make data processing facilities available to the remote sensing community both quickly and effectively. An experiment to test this hypothesis is being conducted by the Laboratory for Applications of Remote Sensing at Purdue University, with the participation of potential users at several remote sites. Initial indications have been highly favorable, although final evaluation awaits further experience and the accumulation of usage data.

  6. Computational models of electromagnetic resonators: Analysis of edge element approximation

    OpenAIRE

    Boffi, Daniele; Fernandes, Paolo; Gastaldi, Lucia; Perugia, Ilaria

    1997-01-01

    The purpose of this paper is to address some difficulties which arise in computing the eigenvalues of Maxwell's system by a finite element method. Depending on the method used, the spectrum may be polluted by spurious modes which are difficult to pick out among the approximations of the physically correct eigenvalues. Here we prove, under very general assumptions, that using edge elements the discrete spectrum approximates the correct one well, and we give some justificat...

  7. Sequence motif discovery with computational genome-wide analysis

    OpenAIRE

    Akashi, Hirofumi; Aoki, Fumio; Toyota, Minoru; Maruyama, Reo; Sasaki, Yasushi; Mita, Hiroaki; Tokura, Hajime; Imai, Kohzoh; Tatsumi, Haruyuki

    2006-01-01

    As a result of the human genome project and advancements in DNA sequencing technology, we can utilize a huge amount of nucleotide sequence data and can search for DNA sequence motifs in the whole human genome. However, searching for motifs with the naked eye is an enormous task and searching throughout the whole genome is absolutely impossible. Therefore, we have developed a computational genome-wide analysis system for detecting DNA sequence motifs with biological significance. We used a multi-parallel...

  8. Computational techniques in gamma-ray skyshine analysis

    International Nuclear Information System (INIS)

    Two computer codes were developed to analyze gamma-ray skyshine, the scattering of gamma photons by air molecules. A review of previous gamma-ray skyshine studies discusses several Monte Carlo codes, programs using a single-scatter model, and the MicroSkyshine program for microcomputers. A benchmark gamma-ray skyshine experiment performed at Kansas State University is also described. A single-scatter numerical model was presented which traces photons from the source to their first scatter, then applies a buildup factor along a direct path from the scattering point to a detector. The FORTRAN code SKY, developed with this model before the present study, was modified to use Gauss quadrature, recent photon attenuation data and a more accurate buildup approximation. The resulting code, SILOGP, computes response from a point photon source on the axis of a silo, with and without concrete shielding over the opening. Another program, WALLGP, was developed using the same model to compute response from a point gamma source behind a perfectly absorbing wall, with and without shielding overhead. 29 refs., 48 figs., 13 tabs
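
    A schematic numerical version of the single-scatter model: integrate over scatter points along the source beam with Gauss quadrature, attenuating the source-to-scatter and scatter-to-detector legs and applying a buildup factor on the second leg. The coefficients and the linear buildup form are placeholders, not the evaluated data used by SILOGP or WALLGP:

        import numpy as np
        from numpy.polynomial.legendre import leggauss

        MU_AIR = 0.0062  # 1/m, assumed total attenuation coefficient of air
        MU_S = 0.0055    # 1/m, assumed scattering coefficient

        def skyshine_response(detector_dist, n_quad=32):
            nodes, weights = leggauss(n_quad)        # Gauss-Legendre on [-1, 1]
            h = 0.5 * (nodes + 1.0) * 1000.0         # scatter heights, 0..1000 m
            jacobian = 0.5 * 1000.0
            r = np.hypot(h, detector_dist)           # scatter -> detector distance
            buildup = 1.0 + MU_AIR * r               # crude linear buildup factor
            integrand = (np.exp(-MU_AIR * h) * MU_S          # reach and scatter at h
                         * np.exp(-MU_AIR * r) * buildup     # travel to detector
                         / (4.0 * np.pi * r ** 2))           # geometric spreading
            return jacobian * np.sum(weights * integrand)

        print(skyshine_response(detector_dist=200.0))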

  9. Analysis of control room computers at nuclear power plants

    International Nuclear Information System (INIS)

    The following problems are analyzed: the development of a system (hardware and software), data, the acquisition of the system, and operation and service. The findings are: most reliability problems can be solved by doubling critical units; reliability in software is a quality that can only be created through development; reliability of computer systems in extremely unusual situations cannot be quantified or verified, except possibly for very small and functionally simple systems; to attain the highest possible reliability, such simple systems have to contain one or very few functions, be functionally simple, and be application-transparent, viz. the internal function of the system should be independent of the status of the process; a computer system will compete successfully with other possible systems regarding reliability for the following reasons: if the function is simple enough for other systems, the computer system would be small; if the functions cannot be realized by other systems, the computer system would complement the human effort, and the man-machine system would be a better solution than no system, possibly better than human function only. (Aa)

  10. Computer-simulated experiments and computer games: a method of design analysis

    Directory of Open Access Journals (Sweden)

    Jerome J. Leary

    1995-12-01

    Through the new modularization of the undergraduate science degree at the University of Brighton, larger numbers of students are choosing to take some science modules which include an amount of laboratory practical work. Indeed, within energy studies, the fuels and combustion module, for which the computer simulations were written, has seen a fourfold increase in student numbers from twelve to around fifty. Fitting out additional laboratories with new equipment to accommodate this increase presented problems: the laboratory space did not exist; fitting out the laboratories with new equipment would involve a relatively large capital spend per student for equipment that would be used infrequently; and, because some of the experiments use inflammable liquids and gases, additional staff would be needed for laboratory supervision.

  11. Computer Simulation of Technetium Scrubbing Section of Purex Ⅰ: Computer Simulation and Technical Parameter Analysis

    Institute of Scientific and Technical Information of China (English)

    CHEN Yan-xin; HE Hui; ZHANG Chun-long; CHANG Li; LI Rui-xue; TANG Hong-bin; YU Ting

    2012-01-01

    A computer program was developed to simulate the technetium scrubbing section (TcS) in Purex based on the theory of cascade extraction. The program can simulate the steady-state behavior of HNO3, U, Pu and Tc in TcS. The reliability of the program was verified by a cascade extraction experiment; the relative error between calculated and experimental values is about 10%, except at a few points. The comparison between experimental and calculated results is illustrated in Fig. 1. The technical parameters of TcS were analyzed with this program; it was found that the decontamination factor (DFTc/U) in TcS is remarkably affected by the overall consumption (molarity multiplied by volume flux) of HNO3, DFTc/U is

  12. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.
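
    The server/client pattern the abstract describes (many independent model evaluations farmed out in parallel) can be sketched with Python's standard library; the simulation function and parameter distributions below are stand-ins, not PAPIRUS itself:

        import concurrent.futures as cf
        import random

        def simulate(params):
            """Stand-in engineering simulation: any function of sampled parameters."""
            k, q = params
            return k * q + random.gauss(0.0, 0.01)

        def sample_parameters(n):
            return [(random.uniform(0.9, 1.1), random.uniform(10, 20)) for _ in range(n)]

        if __name__ == "__main__":
            samples = sample_parameters(1000)
            # Propagate input uncertainty by evaluating the model on every sample
            # in parallel, as the server/client framework does at larger scale.
            with cf.ProcessPoolExecutor() as pool:
                outputs = list(pool.map(simulate, samples))
            mean = sum(outputs) / len(outputs)
            var = sum((o - mean) ** 2 for o in outputs) / (len(outputs) - 1)
            print(f"output mean = {mean:.3f}, variance = {var:.4f}")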

  13. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  14. Injury severity measured by the AIS/90 manual and CAIS/85 charts in head-injured patients

    Directory of Open Access Journals (Sweden)

    Regina Marcia Cardoso de Sousa

    1998-01-01

    This study was developed in order to compare the use of the ABBREVIATED INJURY SCALE (AIS) manual and the CONDENSED ABBREVIATED INJURY SCALE (CAIS) charts as bases for calculating the INJURY SEVERITY SCORE (ISS) in head-injured patients. The results showed that the ISS value was equivalent in the majority (58.51%) of the patients who could be codified by both instruments. Likewise, no statistically significant differences between the two instruments were found when comparing the indicated trauma severity levels (severe, moderate and minor). Regarding the coverage capacity of the CAIS/85 for identifying injury severity, the CAIS/85 allowed the scoring of 61.38% of the lesions scored with the AIS/90.

  15. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  16. Structural Analysis: Shape Information via Points-To Computation

    CERN Document Server

    Marron, Mark

    2012-01-01

    This paper introduces a new hybrid memory analysis, Structural Analysis, which combines an expressive shape analysis style abstract domain with efficient and simple points-to style transfer functions. Using data from empirical studies on the runtime heap structures and the programmatic idioms used in modern object-oriented languages we construct a heap analysis with the following characteristics: (1) it can express a rich set of structural, shape, and sharing properties which are not provided by a classic points-to analysis and that are useful for optimization and error detection applications (2) it uses efficient, weakly-updating, set-based transfer functions which enable the analysis to be more robust and scalable than a shape analysis and (3) it can be used as the basis for a scalable interprocedural analysis that produces precise results in practice. The analysis has been implemented for .Net bytecode and using this implementation we evaluate both the runtime cost and the precision of the results on a num...

  17. Critical Data Analysis Precedes Soft Computing Of Medical Data

    DEFF Research Database (Denmark)

    Keyserlingk, Diedrich Graf von; Jantzen, Jan; Berks, G.;

    2000-01-01

    Medical databases appear in general as collections of scarcely defined, uncomfortable feelings, disturbances and disabilities of patients, encoded in medical terms and symptoms and often only sparsely enriched with some ordinal and metric data. But, astonishingly enough, in many cases this is sufficient..., analysis of discriminance, multiple regression and factor analysis offer the possibility to control the interdependence of data. Factor analysis has the advantage that it is based on a mathematical model and does not require normally distributed data. Factor analysis describes the correlation of many...

  18. The magazine Cais between protagonism and assistencialism: a critical discourse analysis

    Directory of Open Access Journals (Sweden)

    Viviane de Melo Resende

    2012-10-01

    As part of the results of an integrated project whose scope is to investigate, through discourse analysis, the practices involved in the production and distribution of five Portuguese-language publications addressing homelessness, this article focuses, on the basis of Critical Discourse Analysis, on the magazine Cais, published in Lisbon. Configured as a street paper, the magazine is sold on the street by people who are homeless or at risk, to whom 70% of the sale price of each copy reverts. More than a means of communicating and publicizing social problems, this kind of press is believed to enable the configuration of different positions and relations, and may thereby change the experience of exclusion. In this article, taking excerpts from an interview with its editor as data, I explore to what extent people in homeless situations participate in the production of Cais and in the representation of that very situation.

  19. The Effect of Prior Experience with Computers, Statistical Self-Efficacy, and Computer Anxiety on Students' Achievement in an Introductory Statistics Course: A Partial Least Squares Path Analysis

    Science.gov (United States)

    Abd-El-Fattah, Sabry M.

    2005-01-01

    A Partial Least Squares Path Analysis technique was used to test the effect of students' prior experience with computers, statistical self-efficacy, and computer anxiety on their achievement in an introductory statistics course. Computer Anxiety Rating Scale and Current Statistics Self-Efficacy Scale were administered to a sample of 64 first-year…
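
    A full PLS path analysis links latent constructs through structural paths; as a rough, simplified stand-in, the sketch below uses PLS regression to relate the three observed predictors to achievement. The synthetic data, coefficients, and scikit-learn usage are illustrative assumptions, not the study's actual model:

    # Simplified stand-in: PLS regression (not full PLS path modelling).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    n = 64  # matches the sample size reported in the abstract
    # columns: prior computer experience, statistical self-efficacy, computer anxiety
    X = rng.normal(size=(n, 3))
    # hypothetical achievement score loosely driven by the three predictors
    y = 0.5 * X[:, 0] + 0.4 * X[:, 1] - 0.6 * X[:, 2] + rng.normal(scale=0.5, size=n)

    pls = PLSRegression(n_components=2)
    pls.fit(X, y)
    print("explained R^2:", pls.score(X, y))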

  20. Proceedings: Workshop on Advanced Mathematics and Computer Science for Power Systems Analysis

    Energy Technology Data Exchange (ETDEWEB)

    None

    1991-08-01

    EPRI's Office of Exploratory Research sponsors a series of workshops that explore how to apply recent advances in mathematics and computer science to the problems of the electric utility industry. In this workshop, participants identified research objectives that may significantly improve the mathematical methods and computer architecture currently used for power system analysis.

  1. Using Computation Curriculum-Based Measurement Probes for Error Pattern Analysis

    Science.gov (United States)

    Dennis, Minyi Shih; Calhoon, Mary Beth; Olson, Christopher L.; Williams, Cara

    2014-01-01

    This article describes how "curriculum-based measurement--computation" (CBM-C) mathematics probes can be used in combination with "error pattern analysis" (EPA) to pinpoint difficulties in basic computation skills for students who struggle with learning mathematics. Both assessment procedures provide ongoing assessment data…
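
    To make the error-pattern idea concrete, here is a minimal sketch that checks a student's subtraction answers for the classic "smaller-from-larger" bug; the probe items and the single bug rule are hypothetical examples, not the article's instrument:

    # Hedged sketch: flagging a "smaller-from-larger" subtraction error pattern.
    def smaller_from_larger(a, b):
        """Buggy column subtraction: always subtract the smaller digit
        from the larger one, ignoring borrowing."""
        result, place = 0, 1
        while a > 0 or b > 0:
            da, db = a % 10, b % 10
            result += abs(da - db) * place
            a, b, place = a // 10, b // 10, place * 10
        return result

    # hypothetical CBM-C probe responses: (minuend, subtrahend, student_answer)
    responses = [(52, 37, 25), (41, 19, 38), (63, 28, 45)]

    for a, b, ans in responses:
        if ans == smaller_from_larger(a, b) and ans != a - b:
            print(f"{a} - {b} = {ans}: matches smaller-from-larger pattern")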

  2. A Meta-Analysis of Effectiveness Studies on Computer Technology-Supported Language Learning

    Science.gov (United States)

    Grgurovic, Maja; Chapelle, Carol A.; Shelley, Mack C.

    2013-01-01

    With the aim of summarizing years of research comparing pedagogies for second/foreign language teaching supported with computer technology and pedagogy not-supported by computer technology, a meta-analysis was conducted of empirical research investigating language outcomes. Thirty-seven studies yielding 52 effect sizes were included, following a…
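
    The core computation in such a meta-analysis is pooling standardized effect sizes across studies; below is a minimal fixed-effect sketch over made-up study data (the numbers are illustrative, not drawn from the 37 studies):

    # Hedged sketch: fixed-effect pooling of standardized mean differences.
    import numpy as np

    # hypothetical per-study effect sizes (Cohen's d) and sample sizes
    d  = np.array([0.30, 0.55, 0.12, 0.41])
    n1 = np.array([25, 40, 30, 22])   # technology-supported groups
    n2 = np.array([24, 38, 31, 20])   # comparison groups

    # approximate variance of d, then inverse-variance weights
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    w = 1.0 / var_d

    pooled = np.sum(w * d) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    print(f"pooled d = {pooled:.3f}  (95% CI {pooled-1.96*se:.3f} .. {pooled+1.96*se:.3f})")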

  3. Computer Aided Mass Balance Analysis for AC Electric Arc Furnace Steelmaking

    Institute of Scientific and Technical Information of China (English)

    Ünal Camdali; Murat Tunc

    2005-01-01

    A mass balance analysis was undertaken for liquid steel production using a computer program specially developed for the AC electric arc furnace at an important alloy steel producer in Turkey. The data obtained by using the computer program were found to be very close to the actual production data.
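
    The essence of such a mass balance is that the total mass entering the furnace must equal the total mass leaving it; the sketch below checks this closure for hypothetical charge data (all stream names and tonnages are invented for illustration):

    # Hedged sketch: mass balance closure check for an EAF heat (tonnes).
    inputs = {"scrap": 95.0, "DRI": 10.0, "lime": 4.5, "carbon": 1.2, "oxygen": 6.0}
    outputs = {"liquid steel": 100.0, "slag": 9.8, "dust": 1.5, "off-gas": 5.2}

    total_in = sum(inputs.values())
    total_out = sum(outputs.values())
    imbalance = total_in - total_out

    print(f"in = {total_in:.1f} t, out = {total_out:.1f} t, imbalance = {imbalance:+.2f} t")
    # a small residual is typically attributed to measurement error
    print(f"closure error = {100 * abs(imbalance) / total_in:.2f} %")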

  4. Fluid Centrality: A Social Network Analysis of Social-Technical Relations in Computer-Mediated Communication

    Science.gov (United States)

    Enriquez, Judith Guevarra

    2010-01-01

    In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…
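
    For readers unfamiliar with centrality measures, the following sketch computes degree and betweenness centrality on a toy CMC interaction graph; the participant names and edges are invented, and NetworkX is one common choice of tool, not necessarily the article's:

    # Hedged sketch: centrality measures on a toy discussion-forum graph.
    import networkx as nx

    # nodes are participants; an edge means "replied to" at least once
    G = nx.Graph()
    G.add_edges_from([
        ("ana", "ben"), ("ana", "carl"), ("ana", "dee"),
        ("ben", "carl"), ("dee", "eli"), ("eli", "fay"),
    ])

    degree = nx.degree_centrality(G)
    between = nx.betweenness_centrality(G)

    for node in G:
        print(f"{node:5s} degree={degree[node]:.2f} betweenness={between[node]:.2f}")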

  5. CAR : A MATLAB Package to Compute Correspondence Analysis with Rotations

    NARCIS (Netherlands)

    Lorenzo-Seva, Urbano; van de Velden, Michel; Kiers, Henk A.L.

    2009-01-01

    Correspondence analysis (CA) is a popular method that can be used to analyse relationships between categorical variables. Like principal component analysis, CA solutions can be rotated both orthogonally and obliquely to simple structure without affecting the total amount of explained inertia. We des
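
    The core of correspondence analysis is a singular value decomposition of the standardized residuals of a contingency table; the sketch below shows that step in plain NumPy (the rotation step the CAR package adds is omitted, and the table is invented):

    # Hedged sketch: the SVD step at the heart of correspondence analysis.
    import numpy as np

    N = np.array([[20,  5, 10],    # invented contingency table
                  [ 4, 30,  6],
                  [ 8,  7, 25]], dtype=float)

    P = N / N.sum()                       # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)   # row and column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals

    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    inertia = sv**2                       # principal inertias per dimension
    row_coords = (U * sv) / np.sqrt(r)[:, None]  # principal row coordinates

    print("explained inertia:", inertia / inertia.sum())
    print("first row coordinates:", np.round(row_coords[0], 3))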

  6. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  7. Algorithm-based analysis of collective decoherence in quantum computation

    Science.gov (United States)

    Utsunomiya, Shoko; Master, Cyrus P.; Yamamoto, Yoshihisa

    2007-02-01

    In a quantum computer, qubits are often stored in identical two-level systems separated by a distance shorter than the characteristic wavelength of the reservoirs that are responsible for decoherence. In this case the collective qubit-reservoir interaction, rather than the individual qubit-reservoir interaction, may determine the decoherence properties. We study the collective decoherence behavior in between each step in certain quantum algorithms and propose a simple alternative of implementing quantum algorithms using a quantum trajectory that is close to a decoherence-free subspace that avoids unstable Dicke's superradiant states and Schrödinger's cat state.
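
    A standard concrete example of the decoherence-free-subspace idea mentioned above, for collective dephasing (the simplest collective noise model), can be written out directly; this is textbook material rather than the paper's specific construction:

    % Two qubits under collective dephasing pick up identical phases:
    % |0> -> |0>,  |1> -> e^{i phi}|1>  on BOTH qubits simultaneously, so
    \[
    |01\rangle \;\to\; e^{i\phi}|01\rangle, \qquad
    |10\rangle \;\to\; e^{i\phi}|10\rangle,
    \]
    % and any logical qubit encoded in their span acquires only a global phase:
    \[
    \alpha|01\rangle + \beta|10\rangle \;\to\;
    e^{i\phi}\bigl(\alpha|01\rangle + \beta|10\rangle\bigr).
    \]
    % Hence span{|01>, |10>} is a decoherence-free subspace for collective dephasing.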

  8. Computational Proteomics: High-throughput Analysis for Systems Biology

    Energy Technology Data Exchange (ETDEWEB)

    Cannon, William R.; Webb-Robertson, Bobbie-Jo M.

    2007-01-03

    High-throughput (HTP) proteomics is a rapidly developing field that offers global profiling of the proteins in a biological system. HTP technological advances are fueling a revolution in biology, enabling analyses at the scale of entire systems (e.g., whole cells, tumors, or environmental communities). However, simply identifying the proteins in a cell is insufficient for understanding the underlying complexity and operating mechanisms of the overall system. Systems-level investigations rely more and more on computational analyses, especially in fields such as proteomics that generate large-scale global data.

  9. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  10. Analysis of Computer Science Communities Based on DBLP

    CERN Document Server

    Biryukov, Maria; 10.1007/978-3-642-15464-5_24

    2010-01-01

    It is popular nowadays to bring techniques from bibliometrics and scientometrics into the world of digital libraries to analyze collaboration patterns and explore the mechanisms which underlie community development. In this paper we use the DBLP data to investigate authors' scientific careers and provide an in-depth exploration of some of the computer science communities. We compare them in terms of productivity, population stability and collaboration trends. Besides, we use these features to compare the sets of top-ranked conferences with their lower-ranked counterparts.
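
    As an illustration of the kind of processing involved, the sketch below streams a DBLP-style XML dump and counts publications per author. The record and author tags follow the public dblp.xml dump, but the productivity metric is a deliberate simplification of the paper's analysis, and entity resolution against dblp.dtd is glossed over:

    # Hedged sketch: per-author publication counts from a DBLP-style XML dump.
    import xml.etree.ElementTree as ET
    from collections import Counter

    RECORD_TAGS = {"article", "inproceedings"}  # journal and conference papers
    counts = Counter()

    # NB: the real dump (https://dblp.org/xml/) uses entities from dblp.dtd;
    # a production parser must resolve them before this loop will run cleanly.
    for _, elem in ET.iterparse("dblp.xml", events=("end",)):
        if elem.tag in RECORD_TAGS:
            for author in elem.findall("author"):
                if author.text:
                    counts[author.text] += 1
            elem.clear()  # keep memory bounded while streaming

    for name, n in counts.most_common(10):
        print(f"{n:5d}  {name}")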

  11. Automatic behaviour analysis system for honeybees using computer vision

    DEFF Research Database (Denmark)

    Tu, Gang Jun; Hansen, Mikkel Kragh; Kryger, Per;

    2016-01-01

    We present a fully automatic online video system, which is able to detect the behaviour of honeybees at the beehive entrance. Our monitoring system focuses on observing the honeybees as naturally as possible (i.e. without disturbing the honeybees). It is based on the Raspberry Pi that is a low...... demonstrate that this system can be used as a tool to detect the behaviour of honeybees and assess their state at the beehive entrance. Besides, the results on computation time show that the Raspberry Pi is a viable solution for such a real-time video-processing system....
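
    A common starting point for this kind of entrance monitoring is background subtraction on the video stream; the sketch below shows that baseline with OpenCV. The input file name, subtractor parameters, and minimum blob area are invented, and this is a generic baseline rather than the paper's detector:

    # Hedged sketch: background-subtraction baseline for entrance monitoring.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("hive_entrance.mp4")   # hypothetical input video
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)
    kernel = np.ones((3, 3), np.uint8)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove speckle
        # OpenCV 4 returns (contours, hierarchy)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        bees = [c for c in contours if cv2.contourArea(c) > 50]  # invented area gate
        print(f"moving objects this frame: {len(bees)}")

    cap.release()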

  12. May Day: A computer code to perform uncertainty and sensitivity analysis. Manuals

    International Nuclear Information System (INIS)

    The computer program May Day was developed by the Polytechnic University of Madrid to carry out uncertainty and sensitivity analysis in the evaluation of radioactive waste storage. (Author)
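
    Uncertainty and sensitivity analyses of this kind typically sample uncertain inputs, propagate them through the model, and rank the inputs by their influence on the output; the sketch below illustrates that loop with a toy surrogate model and Spearman rank correlations (everything here, including the model itself, is invented for illustration and is not May Day's implementation):

    # Hedged sketch: Monte Carlo uncertainty and sensitivity analysis loop.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(42)
    n = 5000

    # invented uncertain inputs (e.g. leach rate, retardation, dose factor)
    leach = rng.lognormal(mean=-2.0, sigma=0.5, size=n)
    retard = rng.uniform(1.0, 10.0, size=n)
    dosef = rng.normal(1.0, 0.1, size=n)

    # toy surrogate for the waste-storage model's dose output
    dose = leach / retard * dosef

    print(f"dose: mean={dose.mean():.4f}, 95th pct={np.percentile(dose, 95):.4f}")
    for name, x in [("leach", leach), ("retard", retard), ("dosef", dosef)]:
        rho, _ = spearmanr(x, dose)
        print(f"sensitivity of dose to {name}: rho = {rho:+.2f}")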

  13. Analysis of irradiated biogenic amines by computational chemistry and spectroscopy

    International Nuclear Information System (INIS)

    Biogenic amines (BA) are nitrogenous compounds able to cause food poisoning. In this work we studied tyramine, one of the most common BAs present in foods, by combining experimentally measured IR (infrared) and GC/MS (gas chromatography / mass spectrometry) spectra with computational quantum chemistry. Density Functional Theory (DFT) and the Deformed Atoms in Molecules (DAM) method were used to partition the electronic densities in a chemically intuitive way and to compute the electrostatic potentials of the molecule, in order to identify its acid and basic sites. The commercial standard sample was irradiated using a Cs-137 irradiator, and each sample was characterized by IR and GC/MS. Calculated and experimental IR spectra were compared. We observed that ionizing gamma irradiation was very effective in decreasing the population of the standard amine, producing fragments that could be rationalized through the quantum chemistry calculations. In particular, we could locate the acid and basic sites of the molecules and identify possible sites of structural weakness, which allowed us to propose mechanistic schemes for the breaking of chemical bonds by the irradiation. Moreover, from this work we hope it will also be possible to choose the appropriate dose of gamma irradiation to eliminate each type of contamination. (author)

  14. A handheld computer-aided diagnosis system and simulated analysis

    Science.gov (United States)

    Su, Mingjian; Zhang, Xuejun; Liu, Brent; Su, Kening; Louie, Ryan

    2016-03-01

    This paper describes a computer-aided diagnosis (CAD) system based on a cellphone and a distributed cluster. One of the bottlenecks in building a CAD system for clinical practice is the storage and processing of mass pathology samples freely among different devices, and conventional pattern-matching algorithms on large-scale image sets are very time consuming. Distributed computation on a cluster has demonstrated the ability to relieve this bottleneck. We developed a system enabling the user to compare the mass image to a dataset with a feature table by sending datasets to a Generic Data Handler Module in Hadoop, where pattern recognition is undertaken for the detection of skin diseases. Single and combined retrieval algorithms in a data pipeline based on the MapReduce framework are used in our system in order to make an optimal trade-off between recognition accuracy and system cost. The profile of the lesion area is drawn manually by doctors on the screen and then uploaded to the server. In our evaluation experiment, a diagnosis hit rate of 75% was obtained by testing 100 patients with skin illness. Our system has the potential to help build a novel medical image dataset by collecting large amounts of gold-standard annotations during medical diagnosis. Once the project is online, participants are free to join, and an abundant sample dataset will soon be gathered, sufficient for learning. These results demonstrate that our technology is very promising and is expected to be used in clinical practice.
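
    The MapReduce pattern the abstract refers to can be illustrated in miniature: map each reference image to a similarity score against the query's feature vector, then reduce to the best matches. The feature table, cosine similarity, and top-k rule below are invented placeholders, not the system's actual algorithms:

    # Hedged sketch: MapReduce-style retrieval over a feature table.
    import numpy as np
    from functools import reduce

    rng = np.random.default_rng(7)
    # hypothetical feature table: (case_id, feature_vector) pairs
    dataset = [(f"case{i:03d}", rng.normal(size=16)) for i in range(1000)]
    query = rng.normal(size=16)

    def mapper(record):
        case_id, feats = record
        # cosine similarity as an invented stand-in for the matching step
        sim = feats @ query / (np.linalg.norm(feats) * np.linalg.norm(query))
        return (case_id, sim)

    def reducer(best, scored, k=5):
        # keep the k highest-scoring cases seen so far
        return sorted(best + [scored], key=lambda t: -t[1])[:k]

    top5 = reduce(reducer, map(mapper, dataset), [])
    print(top5)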

  15. Automated iterative neutrosophic lung segmentation for image analysis in thoracic computed tomography

    OpenAIRE

    Guo, Yanhui; Zhou, Chuan; Chan, Heang-Ping; Chughtai, Aamer; Wei, Jun; Hadjiiski, Lubomir M.; Kazerooni, Ella A.

    2013-01-01

    Purpose: Lung segmentation is a fundamental step in many image analysis applications for lung diseases and abnormalities in thoracic computed tomography (CT). The authors have previously developed a lung segmentation method based on expectation-maximization (EM) analysis and morphological operations (EMM) for our computer-aided detection (CAD) system for pulmonary embolism (PE) in CT pulmonary angiography (CTPA). However, due to the large variations in pathology that may be present in thoraci...
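
    In outline, EM-based lung segmentation fits a two-class intensity model (air-filled lung vs. soft tissue) and cleans the resulting mask with morphological operations; the sketch below shows that outline on a synthetic slice, with scikit-learn's Gaussian mixture standing in for the EM step. All parameters and the synthetic image are invented, and this is not the authors' EMM pipeline:

    # Hedged sketch: EM intensity model + morphological clean-up on a CT slice.
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from scipy import ndimage

    rng = np.random.default_rng(3)
    # synthetic "CT slice": dark lung-like blob inside brighter tissue
    slice_hu = rng.normal(50, 20, size=(128, 128))          # soft tissue
    yy, xx = np.mgrid[:128, :128]
    lung = (yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2
    slice_hu[lung] = rng.normal(-800, 40, size=lung.sum())  # air-filled lung

    # EM fit of a two-component Gaussian mixture to the intensities
    gmm = GaussianMixture(n_components=2, random_state=0)
    labels = gmm.fit_predict(slice_hu.reshape(-1, 1)).reshape(slice_hu.shape)
    lung_label = np.argmin(gmm.means_.ravel())   # darker component = lung
    mask = labels == lung_label

    # morphological operations: remove speckle, fill holes
    mask = ndimage.binary_opening(mask, iterations=2)
    mask = ndimage.binary_fill_holes(mask)
    print("segmented lung pixels:", int(mask.sum()), "of", int(lung.sum()))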

  16. Inferring Group Processes from Computer-Mediated Affective Text Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, Jack C [ORNL; Begoli, Edmon [ORNL; Jose, Ajith [Missouri University of Science and Technology; Griffin, Christopher [Pennsylvania State University

    2011-02-01

    Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.
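
    The seed-list idea can be sketched simply: start from terms with known affect-category scores and propagate those scores to co-occurring entities with a decay. The categories, scores, links, and decay rule below are invented stand-ins, not the TEAMSTER algorithm:

    # Hedged sketch: propagating seed affect scores to entities along links.
    from collections import defaultdict

    # invented seed terms scored in two hypothetical affect categories
    seeds = {"betray": {"hostility": 0.9}, "honor": {"affection": 0.7}}

    # (term, entity, weight) co-occurrence links extracted from text
    links = [("betray", "party_A", 1.0), ("betray", "party_B", 0.5),
             ("honor", "party_B", 1.0)]

    DECAY = 0.8  # invented attenuation per propagation hop
    entity_affect = defaultdict(lambda: defaultdict(float))

    for term, entity, w in links:
        for category, score in seeds.get(term, {}).items():
            entity_affect[entity][category] += DECAY * w * score

    for entity, cats in entity_affect.items():
        print(entity, dict(cats))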

  17. Propulsion Test Support Analysis with GPU Computing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The design, development and staging of tests to certify liquid rocket engines usually require high-fidelity structural, fluid and thermal support analysis. These...

  18. Computational Intelligence Techniques for Electro-Physiological Data Analysis

    OpenAIRE

    Riera Sardà, Alexandre

    2012-01-01

    This work contains the efforts I have made in recent years in the field of electrophysiological data analysis. Most of the work was done at Starlab Barcelona S.L. and part of it at the Neurodynamics Laboratory of the Department of Psychiatry and Clinical Psychobiology of the University of Barcelona. The main work deals with the analysis of electroencephalography (EEG) signals, although other signals, such as electrocardiography (ECG), electrooculography (EOG) and electromyography (EMG) ...
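
    A staple computation in EEG analysis of this kind is band power derived from the power spectral density; the sketch below estimates alpha-band power with Welch's method on a synthetic signal (the sampling rate, band edges, and the signal itself are illustrative assumptions):

    # Hedged sketch: alpha-band power of a synthetic EEG channel via Welch PSD.
    import numpy as np
    from scipy.signal import welch

    fs = 256                         # assumed sampling rate in Hz
    t = np.arange(0, 10, 1 / fs)     # 10 s of signal
    rng = np.random.default_rng(0)
    # synthetic channel: 10 Hz "alpha" rhythm buried in noise
    eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 10e-6 * rng.standard_normal(t.size)

    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    alpha = (freqs >= 8) & (freqs <= 12)
    band_power = np.trapz(psd[alpha], freqs[alpha])  # integrate PSD over band
    print(f"alpha-band power: {band_power:.3e} V^2")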

  19. Risk Analysis of Business Intelligence in Cloud Computing

    OpenAIRE

    Alsufyani, Raed; Chang, Victor

    2015-01-01

    The paper discusses the issues of risk analysis of Business Intelligence on the basis of Cloud platforms. The study gives an account of various aspects of the issue, such as benefits and risks, financial application, and the actual process of data analysis. The paper attempts to address the issue in terms of empirical knowledge, since numerous organizations face difficulties in the appropriate application of Business Intelligence in the Cloud environment for purposes of risk forecasting a...

  20. Performance analysis of computer installations: Virtual Machine/370 (VM/370).

    OpenAIRE

    Lazo, Waldo Marmanillo

    1981-01-01

    Approved for public release; distribution is unlimited. Highlights of the IBM 4341 and IBM 3033 AP systems are presented, with emphasis on performance aspects. An analysis of the performance of the Virtual Machine Facility/370 (VM/370) is performed. The main efforts are (1) to present a methodology based on performance measurement and analysis techniques, trying to relate trends in the data to the characteristics of the system, and thus gain insight into what might ca...