WorldWideScience

Sample records for analysis cai computer

  1. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    Energy Technology Data Exchange (ETDEWEB)

    Handler, B.H. (Oak Ridge K-25 Site, TN (USA)); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. (Oak Ridge Associated Universities, Inc., TN (USA)); Hunnum, W.H. (North Carolina Univ., Chapel Hill, NC (USA)); Smith, D.L. (Memphis State Univ., TN (USA))

    1990-07-01

    Data Systems Engineering Organization (DSEO) personnel developed a prototype computer-aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model comprises related papers encompassing research on computer-aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevance to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  2. Discourse Synthesis, Analysis and Their Application to CAI (Computer Assisted Instruction).

    Science.gov (United States)

    Su, Stanley Y. W.; Moore, Robert L.

    This paper deals with the computer's production and recognition of sentences in a connected discourse and its application to computer assisted instruction. Studies of textual properties in real discourses have been carried out at the paragraph level. The theoretical concepts of representing paragraph content in terms of (1) the factual data…

  3. Computer Assisted Instruction (CAI) in Language Teaching

    Institute of Scientific and Technical Information of China (English)

    Xin, Jing

    2015-01-01

    There are many ways to use computers for English language teaching. First of all, teachers can use them to prepare for classes. They can use a word processing program to write teaching materials and tests. They can use dictionaries, encyclopedias, etc., available on the computer as resources to help them prepare…

  4. A Study of Effectiveness of Computer Assisted Instruction (CAI) over Classroom Lecture (CRL) at ICS Level

    Science.gov (United States)

    Kaousar, Tayyeba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups studying the same curriculum through classroom lecture and computer-assisted instruction, and the effects of CAI and CRL in terms of cognitive development. Hypotheses of…

  5. Effects of Computer-Assisted Instruction (CAI) on 11th Graders' Attitudes to Biology and CAI and Understanding of Reproduction in Plants and Animals.

    Science.gov (United States)

    Soyibo, Kola; Hudson, Ann

    2000-01-01

    Investigates whether the use of the combination of lecture, discussion, and computer-assisted instruction (CAI) significantly improved students' attitudes toward biology and their understanding of reproduction in plants and animals. Studies grade 11 Jamaican female students (n=77) from two traditional high schools in Kingston. (Contains 19…

  6. Effectiveness of Computer Assisted Instructions (CAI) in Teaching of Mathematics at Secondary Level

    Science.gov (United States)

    Dhevakrishnan, R.; Devi, S.; Chinnaiyan, K.

    2012-09-01

    The present study examined the effectiveness of computer-assisted instruction (CAI) in the teaching of mathematics at the secondary level, adopting an experimental method to observe the difference between CAI and the traditional method. A sample of sixty (60) students of class IX in VVB Matriculation Higher Secondary School at Elayampalayam, Namakkal district was selected and divided into two groups, namely an experimental and a control group. The experimental group consisted of 30 students who were taught 'Mensuration' by computer-assisted instruction, and the control group comprised 30 students taught by the conventional method of teaching. Data were analyzed using mean, S.D. and t-test. The findings of the study clearly point out a significant increase in the mean gain scores in the post-test scores of the experimental group. Significant differences were found between the control group and the experimental group on post-test gain scores. The experimental group, which was taught by CAI, showed better learning. The conclusion is evident that CAI is an effective medium of instruction for teaching mathematics to secondary students.
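
    A minimal sketch of the gain-score comparison described above, assuming hypothetical score arrays (the study's raw data are not reproduced in this record); the analysis is the standard mean/S.D./independent-samples t-test the abstract names:

        # Hypothetical gain scores for 30 CAI and 30 conventionally taught
        # students; placeholders, not the study's data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        cai_gains = rng.normal(loc=18.0, scale=4.0, size=30)
        conventional_gains = rng.normal(loc=13.0, scale=4.0, size=30)

        # Descriptive statistics (mean and S.D.), as reported in the study.
        for name, g in [("CAI", cai_gains), ("CRL", conventional_gains)]:
            print(f"{name}: mean={g.mean():.2f}, SD={g.std(ddof=1):.2f}")

        # Independent-samples t-test on the post-test gain scores.
        t, p = stats.ttest_ind(cai_gains, conventional_gains)
        print(f"t={t:.3f}, p={p:.4f}")  # p < 0.05 -> significant difference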

  7. The Effect of Computer Assisted Instruction (CAI) on Student Attitude in Mathematics Teaching of Primary School 8th Class and Views of Students towards CAI

    Directory of Open Access Journals (Sweden)

    Tuğba Hangül

    2010-12-01

    The aim of this study is to investigate the effect of computer-assisted instruction (CAI) on student attitude in the "Geometric Objects" unit of the eighth-grade mathematics curriculum, and to find out grade 8 primary school students' views about computer-assisted instruction. A pre-/post-attitude design with experimental and control groups was used. The research was conducted with control and experimental groups consisting of fifty-three eighth-grade students who were randomly identified in the 2009-2010 school year. The attitude scale was administered to both groups before and at the end of teaching. The constructivist method was applied to the control group while CAI was applied to the experimental group. After teaching, fourteen students randomly selected from the experimental group were interviewed. Quantitative data were analyzed using an independent-samples t-test and qualitative data were analyzed by descriptive analysis. The data put forward that teaching through CAI improves students' attitudes more positively than the constructivist method, and that students have positive opinions on CAI.

  8. Computer Assisted Instruction (CAI): A Partner for PI?

    Science.gov (United States)

    Edwards, John S.; Tillman, Murray

    1982-01-01

    Discusses differences between computer-delivered instruction and print-delivered instruction and the importance of the role of the instructional design process when adapting traditional teaching materials to newer media. The use of authoring systems for preparing materials and computer-managed instruction as a support for programed instruction are…

  9. A Comparative Study to Evaluate the Effectiveness of Computer Assisted Instruction (CAI) versus Class Room Lecture (RL) for Computer Science at ICS Level

    Science.gov (United States)

    Kausar, Tayyaba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups studying the same curriculum through classroom lecture and computer-assisted instruction, and the effects of CAI and CRL in terms of cognitive development. Hypothesis of…

  10. OE CAI: COMPUTER-ASSISTED INSTRUCTION OF OLD ENGLISH

    Directory of Open Access Journals (Sweden)

    Alejandro Alcaraz Sintes

    2002-06-01

    This article offers a general but thorough survey of Computer Assisted Instruction as applied to the Old English language, from the work of the late 80's pioneers to December 2001. It embraces all the different facets of the question: stand-alone and web-based applications, Internet sites, CD-ROMs, grammars, dictionaries, general courses, reading software, extralinguistic material, exercises, handouts, audio files... Each instruction item, whether it be a website, a Java exercise, an online course or an electronic book, is reviewed, and URLs are provided in footnotes. These reviews are accompanied throughout by the pertinent theoretical background and practical advice.

  11. Competition in Individualized CAI.

    Science.gov (United States)

    Hativa, Nira; And Others

    1993-01-01

    Examines the effects of competition and cooperation on learning through computer-assisted instruction (CAI). A questionnaire was administered to 457 Israeli fourth graders who used two CAI arithmetic systems. The characteristics of the systems are discussed, and the results of the survey are correlated to students' gender and achievement levels.…

  12. Teaching Critical Thinking Skills with CAI: A Design by Two Researchers Shows Computers Can Make a Difference.

    Science.gov (United States)

    Bass, George M., Jr.; Perkins, Harvey W.

    1984-01-01

    Describes a project which involved designing a nine-week course utilizing computer assisted instruction (CAI) to teach seventh graders critical thinking skills. Results indicate measurable gains were made in the critical thinking skills of verbal analogy and inductive/deductive reasoning, although no consistent gains were made in logical reasoning…

  13. Personality preference influences medical student use of specific computer-aided instruction (CAI

    Directory of Open Access Journals (Sweden)

    Halsey, Martha

    2006-02-01

    Background: The objective of this study was to test the hypothesis that personality preference, which can be related to learning style, influences individual utilization of CAI applications developed specifically for the undergraduate medical curriculum. Methods: Personality preferences of students were obtained using the Myers-Briggs Type Indicator (MBTI) test. CAI utilization for individual students was collected from entry logs for two different web-based applications (a discussion forum and a tutorial) used in the basic science course on human anatomy. Individual login data were sorted by personality preference and statistically analyzed by 2-way mixed ANOVA and correlation. Results: There was a wide discrepancy in the level and pattern of student use of both CAI applications. Although individual use of both applications was positively correlated irrespective of MBTI preference, students with a "Sensing" preference tended to use both CAI applications more than the "iNtuitives". Differences in the level of use of these CAI applications (i.e., higher use of the discussion forum vs. the tutorial) were also found for the "Perceiving/Judging" dimension. Conclusion: We conclude that the personality/learning preferences of individual students influence their use of CAI in the medical curriculum.
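
    A simplified sketch of the usage analysis above, assuming a hypothetical per-student login table; a plain Pearson correlation and group means stand in for the paper's full 2-way mixed ANOVA:

        # Hypothetical per-student login counts; the study's entry logs are
        # not public. "S" = Sensing, "N" = iNtuitive (MBTI S/N dimension).
        import pandas as pd
        from scipy import stats

        df = pd.DataFrame({
            "mbti_sn":  ["S", "S", "N", "S", "N", "N", "S", "N"],
            "forum":    [42, 35, 18, 50, 12, 20, 39, 15],
            "tutorial": [30, 28, 14, 41, 10, 16, 33, 9],
        })

        # Use of the two CAI applications is positively correlated.
        r, p = stats.pearsonr(df["forum"], df["tutorial"])
        print(f"forum vs tutorial: r={r:.2f}, p={p:.4f}")

        # Mean use of each application by Sensing vs iNtuitive students.
        print(df.groupby("mbti_sn")[["forum", "tutorial"]].mean())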

  14. Performance and analysis of a 4-stroke multi-cylinder gasoline engine with CAI combustion

    OpenAIRE

    Zhao, H.; Li, J; Ma, T.; Ladommatos, N

    2002-01-01

    Copyright © 2002 SAE International. This paper is posted on this site with permission from SAE International. Further use of this paper is not permitted without permission from SAE. Controlled Auto-Ignition (CAI) combustion was realised in a production type 4-stroke 4-cylinder gasoline engine without intake charge heating or increasing compression ratio. The CAI engine operation was achieved using substantially standard components modified only in camshafts to restrict the gas exchange proc...

  15. Computer Aided Instruction (CAI) for the Shipboard Nontactical ADP Program (SNAP). Interim report

    Energy Technology Data Exchange (ETDEWEB)

    Duncan, L.D.; Hammons, C.E.; Hume, R.; Christian, J.; Handler, B.H.; Phillips, J.

    1986-01-01

    Oak Ridge National Laboratory is developing a prototype computer aided instruction package for the Navy Management Systems Support Office. This report discusses the background of the project and the progress to date including a description of the software design, problems encountered, solutions found, and recommendations. The objective of this project is to provide a prototype that will enhance training and can be used as a shipboard refresher and retraining tool. The prototype system will be installed onboard ships where Navy personnel will have ready access to the training. The subsequent testing and evaluation of the prototype could provide the basis for a Navy-wide effort to implement computer aided instruction. The work to date has followed a rigorous structured analysis methodology based on the Yourdon/DeMarco techniques. A set of data flow diagrams and a data dictionary are included in the appendices. The problems encountered revolve around requirements to use existing hardware, software, and programmer capabilities for development, implementation, and maintenance of the instructional software. Solutions have been developed which will allow the software to exist in the given environment and still provide advanced features not available in commercial courses.

  16. The Vibrio cholerae quorum-sensing autoinducer CAI-1: analysis of the biosynthetic enzyme CqsA

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, R.; Bolitho, M; Higgins, D; Lu, W; Ng, W; Jeffrey, P; Rabinowitz, J; Semmelhack, M; Hughson, F; Bassler, B

    2009-01-01

    Vibrio cholerae, the bacterium that causes the disease cholera, controls virulence factor production and biofilm development in response to two extracellular quorum-sensing molecules, called autoinducers. The strongest autoinducer, called CAI-1 (for cholera autoinducer-1), was previously identified as (S)-3-hydroxytridecan-4-one. Biosynthesis of CAI-1 requires the enzyme CqsA. Here, we determine the CqsA reaction mechanism, identify the CqsA substrates as (S)-2-aminobutyrate and decanoyl coenzyme A, and demonstrate that the product of the reaction is 3-aminotridecan-4-one, dubbed amino-CAI-1. CqsA produces amino-CAI-1 by a pyridoxal phosphate-dependent acyl-CoA transferase reaction. Amino-CAI-1 is converted to CAI-1 in a subsequent step via a CqsA-independent mechanism. Consistent with this, we find cells release ≥100 times more CAI-1 than amino-CAI-1. Nonetheless, V. cholerae responds to amino-CAI-1 as well as CAI-1, whereas other CAI-1 variants do not elicit a quorum-sensing response. Thus, both CAI-1 and amino-CAI-1 have potential as lead molecules in the development of an anticholera treatment.

  17. Timing Students' Answers in CAI.

    Science.gov (United States)

    Hativa, Nira; And Others

    1991-01-01

    Discussion of limiting response time for students' answers focuses on a study of Israeli elementary students that investigated the effects on their performance of increasing the response time in computer-assisted instruction (CAI) for arithmetic drill and practice. Effects on high- versus low-aptitude students, and younger versus older, are…

  18. CAI多媒體教學軟體之開發模式 Using an Instructional Design Model for Developing a Multimedia CAI Courseware

    Directory of Open Access Journals (Sweden)

    Hsin-Yih Shyu

    1995-09-01

    This article outlines a systematic instructional design model for developing multimedia computer-aided instruction (CAI) courseware. The model illustrates roles and tasks as two dimensions necessary in CAI production teamwork. Four major components (Analysis, Design, Development, and Revision/Evaluation) comprising a total of 25 steps are provided. Eight roles, each with its own required competencies, are identified. The model will be useful as a framework for developing multimedia CAI courseware for educators, instructional designers and CAI industry developers.

  19. Numerical simulation and validation of SI-CAI hybrid combustion in a CAI/HCCI gasoline engine

    Science.gov (United States)

    Wang, Xinyan; Xie, Hui; Xie, Liyan; Zhang, Lianfang; Li, Le; Chen, Tao; Zhao, Hua

    2013-02-01

    SI-CAI hybrid combustion, also known as spark-assisted compression ignition (SACI), is a promising concept to extend the operating range of CAI (Controlled Auto-Ignition) and achieve a smooth transition between spark ignition (SI) and CAI in the gasoline engine. In this study, a SI-CAI hybrid combustion model (HCM) has been constructed on the basis of the 3-Zones Extended Coherent Flame Model (ECFM3Z). An ignition model is included to initiate the ECFM3Z calculation and induce the flame propagation. In order to precisely depict the subsequent auto-ignition process of the unburned fuel and air mixture independently after the initiation of flame propagation, the tabulated chemistry concept is adopted to describe the auto-ignition chemistry. The methodology for extracting tabulated parameters from the chemical kinetics calculations is developed so that both cool flame reactions and main auto-ignition combustion can be well captured under a wide range of thermodynamic conditions. The SI-CAI hybrid combustion model (HCM) is then applied in three-dimensional computational fluid dynamics (3-D CFD) engine simulation. The simulation results are compared with the experimental data obtained from a single-cylinder VVA engine. The detailed analysis of the simulations demonstrates that the SI-CAI hybrid combustion process is characterised by early flame propagation and subsequent multi-site auto-ignition around the main flame front, which is consistent with the optical results reported by other researchers. In addition, the systematic study of the in-cylinder conditions reveals the influence mechanism of the early flame propagation on the subsequent auto-ignition.
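
    The tabulated-chemistry step described above amounts to looking up precomputed auto-ignition quantities by interpolation over the local thermodynamic state. A minimal sketch with an invented ignition-delay table (the paper's actual tabulated parameters and axes are not reproduced here):

        # Illustrative ignition-delay table over temperature (K) and pressure
        # (bar); the values are placeholders, not the paper's kinetics results.
        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        T_axis = np.array([700.0, 800.0, 900.0, 1000.0])
        p_axis = np.array([10.0, 20.0, 40.0])
        tau_ms = np.array([[9.0, 6.5, 4.8],    # ignition delay in ms
                           [4.1, 2.9, 2.0],
                           [1.7, 1.2, 0.8],
                           [0.7, 0.5, 0.3]])

        tau = RegularGridInterpolator((T_axis, p_axis), tau_ms)

        # In the CFD run, each unburned-zone state (T, p) queries the table
        # instead of integrating detailed chemistry on the fly.
        print(tau([[860.0, 25.0]]))  # interpolated delay for one cell state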

  20. 電腦輔助教學與個別教學結合: 電腦輔助教學課堂應用初探 Computer-Assisted Instruction Under the Management of Individualized Instruction: A Classroom Management Approach of CAI

    Directory of Open Access Journals (Sweden)

    Sunny S. J. Lin

    1988-03-01

    First reviews the development of Computer-Assisted Instruction (CAI) in Taiwan. This study describes the training of teachers from different levels of schools to design CAI coursewares, and the planning of a CAI courseware bank of 2,000 supplemental coursewares. A CAI classroom application system should be carefully established to prevent the easy abuse of a CAI courseware as an instructional plan. The study also claims that steering CAI in our elementary and secondary education could rely on mastery learning as the instructional plan; in this case, CAI must limit its role to formative testing and remedial material only. In higher education, Keller's Personalized System of Instruction could be an effective classroom management system, with CAI offering study guides and formative tests only. Using these two instructional systems may enhance student achievement and speed up the learning rate at the same time. Combining individualized instruction with CAI will be one of the most workable approaches in the current classroom. The author sets up an experiment to verify their effectiveness and efficiency in the near future.

  1. Effect of CAI on Achievement of LD Students in English

    Science.gov (United States)

    Sivaram, R. T.; Ramar, R.

    2014-01-01

    The present experimental study was undertaken with three objectives in view, (i) to identify students with language learning disabilities (ii) to develop CAI software to teach LD students through computer-assisted instruction and (iii) to measure the effectiveness of CAI with special reference to LD students. Two matched groups of LD students were…

  2. A Pilot CAI Scheme for the Malaysian Secondary Education System.

    Science.gov (United States)

    Rao, A. Kanakaratnam; Rao, G. S.

    1982-01-01

    A multi-phase computer aided instruction (CAI) scheme for Malaysian Secondary Schools and Matriculation Centres attached to local universities is presented as an aid for improving instruction and for solving some problems presently faced by the Malaysian Secondary Education System. Some approaches for successful implementation of a CAI scheme are…

  3. Marshall McLuhan and the Case Against CAI.

    Science.gov (United States)

    Hirvela, Alan

    1988-01-01

    Presents some of the conventional arguments against computer assisted instruction (CAI) in language education and explores humanistic concerns raised in the works of Marshall McLuhan. It is concluded that CAI is introduced into the instructional process before proper research has demonstrated that this method of teaching is not harmful for…

  4. The Relevance of AI Research to CAI.

    Science.gov (United States)

    Kearsley, Greg P.

    This article provides a tutorial introduction to Artificial Intelligence (AI) research for those involved in Computer Assisted Instruction (CAI). The general theme is that much of the current work in AI, particularly in the areas of natural language understanding systems, rule induction, programming languages, and socratic systems, has important…

  5. Natural gas diffusion model and diffusion computation in well Cai25 Bashan Group oil and gas reservoir

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Natural gas diffusion through the cap rock is mainly by means of dissolving in water, so its concentration can be replaced by solubility, which varies with temperature, pressure and salinity in strata. Under certain geological conditions the maximal solubility is definite, so the diffusion computation can be handled approximately by the stable-state equation. Furthermore, on the basis of the restoration of the paleo-burial history, the diffusion is calculated with the dynamic method, and the result is very close to the real diffusion value in the geological history.
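
    A one-line formalization of the "stable state" approximation described above, using the standard steady-state form of Fick's law (notation assumed here, not taken from the paper): with D the effective diffusion coefficient, h the cap-rock thickness, C_max(T, p, S) the solubility-capped concentration at the base of the cap rock and C_top ≈ 0 at its top,

        \[
          J \;\approx\; D \, \frac{C_{\max}(T,p,S) - C_{\mathrm{top}}}{h},
          \qquad
          Q \;=\; J \, A \, t ,
        \]

    so the cumulative diffusive loss Q through area A over time t follows directly once the solubility bound is fixed; the dynamic method presumably lets these conditions vary along the reconstructed burial history.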

  6. Natural gas diffusion model and diffusion computation in well Cai25 Bashan Group oil and gas reservoir

    Institute of Scientific and Technical Information of China (English)

    FANG, Dequan

    2001-01-01


  7. Study on Teaching Strategies in Mathematics Education based on CAI

    Directory of Open Access Journals (Sweden)

    Wei Yan Feng

    2016-01-01

    With the development of information technology and the popularization of the internet, new media, represented by the mobile phone, are gradually influencing and changing people's study and life and becoming a centre of cultural information and social consensus. According to the China Internet Network Information Centre, young people are the main users of CAI (Computer Assisted Instruction) and its most active customer group. To fully understand the impact of the new media environment on students, this paper proposes CAI for the higher mathematics education of college students.

  8. Maxi CAI with a Micro.

    Science.gov (United States)

    Gerhold, George; And Others

    This paper describes an effective microprocessor-based CAI system which has been repeatedly tested by a large number of students and edited accordingly. Tasks not suitable for microprocessor-based systems (authoring, testing, and debugging) were handled on larger multi-terminal systems. This approach requires that the CAI language used on the…

  9. USING COMPUTERS IN EDUCATION--SOME PROBLEMS AND SOLUTIONS.

    Science.gov (United States)

    SILBERMAN, HARRY F.

    Possible solutions to the problem of the design of computer-assisted instruction (CAI) programs are to copy existing methods, to use scientific methods, or to design programs fitted to local needs. The best answer to the problem of instructional management systems needed for CAI programs is computer analysis of student performance data. Training…

  10. The Graphics Terminal Display System; a Powerful General-Purpose CAI Package.

    Science.gov (United States)

    Hornbeck, Frederick W.; Brock, Lynn

    The Graphic Terminal Display System (GTDS) was created to support research and development in computer-assisted instruction (CAI). The system uses an IBM 360/50 computer and interfaces with a large-screen graphics display terminal, a random-access slide projector, and a speech synthesizer. An authoring language, GRAIL, was developed for CAI, and…

  11. Alternative communication network designs for an operational PLATO IV CAI system

    Science.gov (United States)

    Mobley, R. E., Jr.; Eastwood, L. F., Jr.

    1975-01-01

    The cost of alternative communications networks for the dissemination of PLATO IV computer-aided instruction (CAI) was studied. Four communication techniques are compared: leased telephone lines, satellite communication, UHF TV, and low-power microwave radio. For each network design, costs per student contact hour are computed. These costs are derived as functions of student population density, a parameter which can be calculated from census data for one potential market for CAI, the public primary and secondary schools. Calculating costs in this way allows one to determine which of the four communications alternatives can serve this market least expensively for any given area in the U.S. The analysis indicates that radio distribution techniques are cost optimum over a wide range of conditions.
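
    A toy version of the cost comparison described above; every dollar figure and parameter below is a hypothetical placeholder (the 1975 study's actual cost data are not in this record), but it shows how cost per student contact hour falls out of annualized network cost and student population density:

        # Every figure below is a hypothetical placeholder, not a number from
        # the 1975 study; the structure of the comparison is the point.
        def cost_per_contact_hour(annual_fixed, cost_per_km, km,
                                  students_per_km2, area_km2, hours_each):
            """Annualized network cost divided by total student contact hours."""
            total_cost = annual_fixed + cost_per_km * km
            contact_hours = students_per_km2 * area_km2 * hours_each
            return total_cost / contact_hours

        alternatives = [("leased lines", 50_000, 120.0),
                        ("satellite",   200_000,   0.0),
                        ("UHF TV",       90_000,   0.0),
                        ("microwave",    70_000,  40.0)]
        for name, fixed, per_km in alternatives:
            c = cost_per_contact_hour(fixed, per_km, km=100,
                                      students_per_km2=25, area_km2=1000,
                                      hours_each=30)
            print(f"{name:12s}: ${c:.4f} per student contact hour")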

  12. Student Conceptions of, and Attitudes toward, Specific Features of a CAI System.

    Science.gov (United States)

    Hativa, Nira

    1989-01-01

    Describes study of Israeli elementary school students that examined student attitudes toward computer-assisted instruction (CAI) designed to provide drill and practice in arithmetic. Attitudes are compared in relation to students' aptitude, gender, grade level, and socioeconomic status, and implications for the design of CAI systems are discussed.…

  13. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  14. Significant efficiency findings while controlling for the frequent confounders of CAI research in the PlanAlyzer project's computer-based, self-paced, case-based programs in anemia and chest pain diagnosis.

    Science.gov (United States)

    Lyon, H C; Healy, J C; Bell, J R; O'Donnell, J F; Shultz, E K; Wigton, R S; Hirai, F; Beck, J R

    1991-04-01

    Richard E. Clark, in his widely published comprehensive studies and meta-analyses of the literature on computer-assisted instruction (CAI), has decried the lack of carefully controlled research, challenging almost every study which shows the computer-based intervention to result in significant post-test proficiency gains over a non-computer-based intervention. We report on a randomized study in a medical school setting where the usual confounders found by Clark to plague most research were carefully controlled. PlanAlyzer is a microcomputer-based, self-paced, case-based, event-driven system for medical education which was developed and used in carefully controlled trials in a second-year medical school curriculum to test the hypothesis that students with access to the interactive programs could integrate their didactic knowledge more effectively and/or efficiently than with access only to traditional textual "nonintelligent" materials. PlanAlyzer presents cases, then elicits and critiques a student's approach to the diagnosis of two common medical disorders: anemias and chest pain. PlanAlyzer uses text, hypertext, images and critiquing theory. Students were randomized, one half becoming the experimental group who received the interactive PlanAlyzer cases in anemia, the other half becoming the controls who received the exact same content material in a text format. Later in each year there was a crossover, the controls becoming the experimentals for a similar intervention with the cardiology PlanAlyzer cases. Preliminary results at the end of the first two full trials show that the programs have achieved most of the proposed instructional objectives, plus some significant efficiency and economy gains: 96 faculty hours of classroom time were saved by using PlanAlyzer in their place, while maintaining high student achievement. In terms of student proficiency and efficiency, the 328 students in the trials over two years were able to accomplish the project's instructional

  15. Text analysis and computers

    OpenAIRE

    1995-01-01

    Content: Erhard Mergenthaler: Computer-assisted content analysis (3-32); Udo Kelle: Computer-aided qualitative data analysis: an overview (33-63); Christian Mair: Machine-readable text corpora and the linguistic description of languages (64-75); Jürgen Krause: Principles of content analysis for information retrieval systems (76-99); Conference Abstracts (100-131).

  16. The Interplay between Different Forms of CAI and Students' Preferences of Learning Environment in the Secondary Science Class

    Science.gov (United States)

    Chang, Chun-Yen; Tsai, Chin-Chung

    2005-01-01

    This evaluation study investigated the effects of a teacher-centered versus student-centered computer-assisted instruction (CAI) on 10th graders' earth science student learning outcomes. This study also explored whether the effects of different forms of computer-assisted instruction (CAI) on student learning outcomes were influenced by student…

  17. Computational Music Analysis

    DEFF Research Database (Denmark)

    methodological issues, harmonic and pitch-class set analysis, form and voice-separation, grammars and hierarchical reduction, motivic analysis and pattern discovery and, finally, classification and the discovery of distinctive patterns. As a detailed and up-to-date picture of current research in computational...

  18. The Matriculation Science Curriculum of the USM in the Context of the PPI and CAI Modes of Instruction.

    Science.gov (United States)

    Cheng, Chuah Chong; Seng, Chin Pin

    1985-01-01

    Discusses philosophy, aims and objectives, and structure of the Matriculation Science Curriculum of the University Sains Malaysia. Includes comments on instructional strategies, individualized learning, programmed instruction, systems approach to computer-assisted instruction (CAI) implementation, CAI authoring system, and various program…

  19. CAI and the Development of Automaticity in Mathematics Skills in Students with and without Mild Mental Handicaps.

    Science.gov (United States)

    Lin, Agnes; And Others

    1994-01-01

    Describes a study that compared computer-assisted instruction (CAI) and a more traditional paper-and-pencil method in teaching mathematics skills to elementary school students with and without mild mental handicaps. Pretests and posttests were analyzed, and it was found that CAI, and extended practice, enhanced automatization of mathematics…

  20. A unified framework for producing CAI melting, Wark-Lovering rims and bowl-shaped CAIs

    Science.gov (United States)

    Liffman, Kurt; Cuello, Nicolas; Paterson, David A.

    2016-10-01

    Calcium-Aluminium inclusions (CAIs) formed in the Solar system, some 4567 million years ago. CAIs are almost always surrounded by Wark-Lovering rims (WLRs), which are a sequence of thin, mono/bi-mineralic layers of refractory minerals, with a total thickness in the range of 1-100 microns. Recently, some CAIs have been found that have tektite-like bowl-shapes. To form such shapes, the CAI must have travelled through a rarefied gas at hypersonic speeds. We show how CAIs may have been ejected from the inner solar accretion disc via the centrifugal interaction between the solar magnetosphere and the inner disc rim. They subsequently punched through the hot, inner disc rim wall at hypersonic speeds. This re-entry heating partially or completely evaporated the CAIs. Such evaporation could have significantly increased the metal abundances of the inner disc rim. High speed movement through the inner disc produced WLRs. To match the observed thickness of WLRs required metal abundances at the inner disc wall that are of order 10 times that of standard solar abundances. The CAIs cooled as they moved away from the protosun; the deduced CAI cooling rates are consistent with the CAI cooling rates obtained from experiment and observation. The speeds and gas densities required to form bowl-shaped CAIs are also consistent with the expected speeds and gas densities for larger, ˜1 cm, CAIs punching through an inner accretion disc wall.

  1. Study on Teaching Strategies in Mathematics Education based on CAI

    OpenAIRE

    Wei Yan Feng

    2016-01-01

    With the development of information technology and the popularization of the internet, new media, represented by the mobile phone, are gradually influencing and changing people's study and life and becoming a centre of cultural information and social consensus. According to the China Internet Network Information Centre, young people are the main users of CAI (Computer Assisted Instruction) and its most active customer group. To fully understand the impact of the new media environment on students, higher mat...

  2. Analysis of Visual Basic and sports multimedia CAI courseware design 解析Visual Basic与体育多媒体CAI课件设计

    Institute of Scientific and Technical Information of China (English)

    张雷

    2014-01-01

    Visual Basic (VB) is a professional software development tool, and applying VB to develop sports multimedia CAI software can effectively raise the standard of the courseware. Sports multimedia CAI courseware is now inseparable from VB, and using VB development tools to raise the level of multimedia courseware has become an inevitable trend of future development. This paper focuses on how VB is applied in courseware.

  3. Computational Analysis of Behavior.

    Science.gov (United States)

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis: the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.

  4. 试析Action Script下的电子技术实验CAI制作 Analysis of electronic technology experiment CAI under Action Script

    Institute of Scientific and Technical Information of China (English)

    赵莎莎

    2012-01-01

    Action Script is an action-script programming language. Programs written in this scripting language are highly interactive and have strong data-processing capabilities, which makes content presented with Action Script convenient and easy for users to understand. Given these advantages of the scripting language, Action Script is often used in computer-assisted instruction (CAI), especially in the production of CAI for electronic technology experiments. This paper offers some suggestions on how to apply Action Script in producing computer-assisted instruction for electronic technology experiments.

  5. 蔡邕“从卓致死”本事刍议 Analysis on Cai Yong's Death as a Follower of Dong Zhuo

    Institute of Scientific and Technical Information of China (English)

    陈海燕

    2011-01-01

    Dong Zhuo led his forces into the capital, opening a period of warlord chaos and political instability in the late Han Dynasty. Undoubtedly, as a cruel and brutal warlord, Dong Zhuo's death deserved no pity, but his favor toward and heavy employment of the talented Cai Yong sealed Cai Yong's tragic fate. For the sigh he let out at Dong Zhuo's death, Cai Yong was denounced by Wang Yun as a collaborator and imprisoned, and he ultimately died an unnatural death. Commentators have long differed over the causes of Cai Yong's death. This paper combines the political environment of the late Han with the influence of Cai Yong's life experience to analyze, from several angles, why Cai Yong was "killed for following Dong Zhuo".

  6. 蔡英文两岸论述解析 The Analysis of the Discourse of Cai Ying-wen on Cross-strait Relations

    Institute of Scientific and Technical Information of China (English)

    倪永杰

    2011-01-01

    The policy advocated by Cai Ying-wen of the Democratic Progressive Party has attracted wide attention. Her discourse on cross-strait relations is both an election strategy and a reflection of the DPP's current cross-strait thinking and policy logic, and it will have a complex influence on the development of cross-strait relations in the next stage. This article analyzes the statements on cross-strait relations that Cai Ying-wen made publicly between February and June 2011. Although her discourse adheres to a "Taiwan independence" core while continually adjusting its tactics and putting forward imaginative slogans, it cannot escape its own dilemma: it remains trapped in the old framework and lacks new thinking. The cross-strait discourse remains a major weak spot for the Democratic Progressive Party.

  7. New breast cancer prognostic factors identified by computer-aided image analysis of HE stained histopathology images.

    Science.gov (United States)

    Chen, Jia-Mei; Qu, Ai-Ping; Wang, Lin-Wei; Yuan, Jing-Ping; Yang, Fang; Xiang, Qing-Ming; Maskey, Ninu; Yang, Gui-Fang; Liu, Juan; Li, Yan

    2015-05-29

    Computer-aided image analysis (CAI) can help objectively quantify morphologic features of hematoxylin-eosin (HE) histopathology images and provide potentially useful prognostic information on breast cancer. We performed a CAI workflow on 1,150 HE images from 230 patients with invasive ductal carcinoma (IDC) of the breast. We used a pixel-wise support vector machine classifier for tumor nests (TNs)-stroma segmentation, and a marker-controlled watershed algorithm for nuclei segmentation. 730 morphologic parameters were extracted after segmentation, and 12 parameters identified by Kaplan-Meier analysis were significantly associated with 8-year disease-free survival (P < 0.05 for all). Moreover, four image features including the TNs feature (HR 1.327, 95%CI [1.001-1.759], P = 0.049), TNs cell nuclei feature (HR 0.729, 95%CI [0.537-0.989], P = 0.042), TNs cell density (HR 1.625, 95%CI [1.177-2.244], P = 0.003), and stromal cell structure feature (HR 1.596, 95%CI [1.142-2.229], P = 0.006) were identified by a multivariate Cox proportional hazards model to be new independent prognostic factors. The results indicated that CAI can assist the pathologist in extracting prognostic information from HE histopathology images for IDC. The TNs feature, TNs cell nuclei feature, TNs cell density, and stromal cell structure feature could be new prognostic factors.
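
    A sketch of the final multivariate step, assuming the lifelines library and an invented per-patient feature table (the paper's 230-patient dataset is not public); hazard ratios and confidence intervals like those quoted above come out of the fitted model summary:

        # Hypothetical per-patient table: follow-up time, relapse flag, and two
        # of the image features named above (all values invented).
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "years":       [8.0, 2.5, 8.0, 4.1, 8.0, 1.9, 6.3, 8.0],
            "relapse":     [0,   1,   0,   1,   0,   1,   1,   0],
            "tn_density":  [1.2, 1.9, 0.7, 1.4, 1.5, 2.1, 1.0, 1.8],
            "stroma_feat": [1.1, 1.6, 0.9, 1.2, 1.4, 2.0, 1.3, 0.8],
        })

        # Multivariate Cox proportional hazards fit; the summary reports a
        # hazard ratio and 95% CI per feature, as quoted in the abstract.
        cph = CoxPHFitter()
        cph.fit(df, duration_col="years", event_col="relapse")
        cph.print_summary()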

  8. A Unified Framework for Producing CAI Melting, Wark-Lovering Rims and Bowl-Shaped CAIs

    CERN Document Server

    Liffman, Kurt; Paterson, David A

    2016-01-01

    Calcium Aluminium Inclusions (CAIs) formed in the Solar System, some 4,567 million years ago. CAIs are almost always surrounded by Wark-Lovering Rims (WLRs), which are a sequence of thin, mono/bi-mineralic layers of refractory minerals, with a total thickness in the range of 1 to 100 microns. Recently, some CAIs have been found that have tektite-like bowl-shapes. To form such shapes, the CAI must have travelled through a rarefied gas at hypersonic speeds. We show how CAIs may have been ejected from the inner solar accretion disc via the centrifugal interaction between the solar magnetosphere and the inner disc rim. They subsequently punched through the hot, inner disc rim wall at hypersonic speeds. This re-entry heating partially or completely evaporated the CAIs. Such evaporation could have significantly increased the metal abundances of the inner disc rim. High speed movement through the inner disc produced WLRs. To match the observed thickness of WLRs required metal abundances at the inner disc wall that a...

  9. Socioeconomic Status, Aptitude, and Gender Differences in CAI Gains of Arithmetic.

    Science.gov (United States)

    Hativa, Nira; Shorer, Dvora

    1989-01-01

    A report is given of a study which examined the effects of computer-assisted instruction (CAI) in mathematics on 99 disadvantaged and 112 advantaged Israeli students. Higher performance levels and larger gains were found for advantaged over disadvantaged students, for high achievers over low achievers, and for boys over girls. (IAH)

  10. Using CAI To Enhance the Peer Acceptance of Mainstreamed Students with Mild Disabilities.

    Science.gov (United States)

    Culliver, Concetta; Obi, Sunday

    This study applied computer-assisted instruction (CAI) techniques to improve peer acceptance among 92 mainstreamed students with mild disabilities from 10 to 13 years of age. Participants in the treatment group received their generalized curriculum program (including mathematics, language arts, reading, health, social studies, and science)…

  11. Computer analysis of railcar vibrations

    Science.gov (United States)

    Vlaminck, R. R.

    1975-01-01

    Computer models and techniques for calculating railcar vibrations are discussed along with criteria for vehicle ride optimization. The effect on vibration of car body structural dynamics, suspension system parameters, vehicle geometry, and wheel and rail excitation are presented. Ride quality vibration data collected on the state-of-the-art car and standard light rail vehicle is compared to computer predictions. The results show that computer analysis of the vehicle can be performed for relatively low cost in short periods of time. The analysis permits optimization of the design as it progresses and minimizes the possibility of excessive vibration on production vehicles.
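
    The report's vehicle models are not reproduced here, but a minimal one-degree-of-freedom base-excitation model illustrates how suspension parameters shape ride vibration; the masses and stiffnesses below are illustrative assumptions only:

        # Minimal one-DOF ride model: car body mass on its suspension, excited
        # through the rails at frequency f. Parameter values are illustrative.
        import numpy as np

        m, k, c = 20_000.0, 1.2e6, 4.0e4   # kg, N/m, N*s/m

        def transmissibility(f_hz):
            """|X_body / X_rail| for base excitation of a damped 1-DOF system."""
            w = 2 * np.pi * f_hz
            return abs((k + 1j * c * w) / (k - m * w**2 + 1j * c * w))

        # Response peaks near the suspension natural frequency (~1.24 Hz here)
        # and rolls off above it; stiffer springs move the peak upward.
        for f in [0.5, 1.0, 1.24, 2.0, 5.0, 10.0]:
            print(f"{f:5.2f} Hz -> T = {transmissibility(f):.2f}")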

  12. Effectiveness of Cognitive Skills-Based Computer-Assisted Instruction for Students with Disabilities: A Synthesis

    Science.gov (United States)

    Weng, Pei-Lin; Maeda, Yukiko; Bouck, Emily C.

    2014-01-01

    Computer-assisted instruction (CAI) for students with disabilities can be categorized into the following categories: visual, auditory, mobile, and cognitive skills-based CAI. Cognitive-skills based CAI differs from other types of CAI largely in terms of an emphasis on instructional design features. We conducted both systematic review of…

  13. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies. The book provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for ana…
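
    As a worked instance of the queuing analysis the book covers, the closed-form M/M/1 results can be computed directly; the arrival and service rates below are illustrative, not taken from the text:

        # Closed-form M/M/1 results: arrivals at rate lam, service at rate mu
        # (both per second).
        lam, mu = 800.0, 1000.0
        rho = lam / mu                 # link utilization, must be < 1
        L = rho / (1 - rho)            # mean number of packets in the system
        W = 1.0 / (mu - lam)           # mean time in system, seconds

        print(f"utilization rho = {rho:.2f}")
        print(f"mean packets in system L = {L:.2f}")
        print(f"mean delay W = {W * 1000:.2f} ms")  # check: L == lam * W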

  14. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect, including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news

  15. Computer vision in microstructural analysis

    Science.gov (United States)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced high-school and beginning college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by the description of the system developed at Texas A&M University. This in turn will be followed by the description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.

  16. Computational Aeroacoustic Analysis System Development

    Science.gov (United States)

    Hadid, A.; Lin, W.; Ascoli, E.; Barson, S.; Sindir, M.

    2001-01-01

    Many industrial and commercial products operate in a dynamic flow environment and the aerodynamically generated noise has become a very important factor in the design of these products. In light of the importance in characterizing this dynamic environment, Rocketdyne has initiated a multiyear effort to develop an advanced general-purpose Computational Aeroacoustic Analysis System (CAAS) to address these issues. This system will provide a high fidelity predictive capability for aeroacoustic design and analysis. The numerical platform is able to provide high temporal and spatial accuracy that is required for aeroacoustic calculations through the development of a high order spectral element numerical algorithm. The analysis system is integrated with well-established CAE tools, such as a graphical user interface (GUI) through PATRAN, to provide cost-effective access to all of the necessary tools. These include preprocessing (geometry import, grid generation and boundary condition specification), code set up (problem specification, user parameter definition, etc.), and postprocessing. The purpose of the present paper is to assess the feasibility of such a system and to demonstrate the efficiency and accuracy of the numerical algorithm through numerical examples. Computations of vortex shedding noise were carried out in the context of a two-dimensional low Mach number turbulent flow past a square cylinder. The computational aeroacoustic approach that is used in CAAS relies on coupling a base flow solver to the acoustic solver throughout a computational cycle. The unsteady fluid motion, which is responsible for both the generation and propagation of acoustic waves, is calculated using a high order flow solver. The results of the flow field are then passed to the acoustic solver through an interpolator to map the field values into the acoustic grid. The acoustic field, which is governed by the linearized Euler equations, is then calculated using the flow results computed
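
    The coupling step described above (mapping flow-solver field values onto the acoustic grid each computational cycle) is, at its core, scattered-data interpolation. A minimal sketch under that reading, with invented node positions and a synthetic pressure field; CAAS's actual interpolator is not public:

        # Stand-ins for one coupling cycle: a pressure field sampled at
        # scattered flow-solver nodes, mapped onto a regular acoustic grid.
        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(1)
        flow_xy = rng.uniform(0.0, 1.0, size=(500, 2))      # flow-solver nodes
        flow_p = np.sin(2 * np.pi * flow_xy[:, 0]) * np.cos(2 * np.pi * flow_xy[:, 1])

        gx, gy = np.mgrid[0:1:50j, 0:1:50j]                 # acoustic grid
        acoustic_p = griddata(flow_xy, flow_p, (gx, gy),
                              method="linear", fill_value=0.0)

        print(acoustic_p.shape)  # (50, 50) source field for the acoustic step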

  17. Computational analysis of cerebral cortex

    Energy Technology Data Exchange (ETDEWEB)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)

    2010-08-15

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  18. 张采的叙事类散文探析--从“传状”文角度考察 Analysis of Zhang Cai's Narrative Prose From the Perspective of Biography

    Institute of Scientific and Technical Information of China (English)

    曾硕先

    2016-01-01

    Zhang Cai was one of the two leaders ("the two Zhangs") of the Fushe Association, the most influential organization in literary circles in the late Ming Dynasty. His narrative prose mainly took the form of traditional biographies in the style of the histories and ancient prose. In content, it swept away the various literary views current since Wang Yangming, upheld classical learning, and returned to Cheng-Zhu Neo-Confucianism. In narrative technique, he was good at revealing character by placing figures in particular scenes and at creating a solemn, serene and simple style. As Zhang Cai's representative prose genre, the biographical "zhuan-zhuang" essays are a shortcut to understanding the Fushe Association's literary retro movement.

  19. A Mathematical Model for Project Planning and Cost Analysis in Computer Assisted Instruction.

    Science.gov (United States)

    Fitzgerald, William F.

    Computer-assisted instruction (CAI) has become sufficiently widespread to require attention to the relationships between its costs, administration and benefits. Despite difficulties in instituting them, quantifiable cost-effectiveness analyses offer several advantages. They allow educators to specify with precision anticipated instructional loads,…

  20. Cai-Li Communication Protocol in Noisy Quantum Channel

    Institute of Scientific and Technical Information of China (English)

    LÜ Hua; YAN Xu-Dong; ZHANG Xian-Zhou

    2004-01-01

    Since the original Cai-Li protocol [Chin. Phys. Lett. 21 (2004) 601] can be used only in an ideal quantum communication, we present a modified Cai-Li protocol that can be used in a noisy quantum channel by using Calderbank-Shor-Steane (CSS) codes to correct errors. We also give a tight bound on the connection between the information Eve eavesdrops with a measurement attack on line B → A and the detection probability, which shows that the Cai-Li protocol can be used as a quasi-secure direct quantum communication protocol.

  1. COMPUTER ASSISTED INSTRUCTION AND ITS APPLICATION IN ENGLISH LEARNING

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    This paper briefly describes the development of computer assisted instruction (CAI) abroad and in China, lists the advantages of CAI, and deals with its application in English learning. Some suggestions about how to make better use of CAI in ELT are also given.

  2. User Interface Improvements in Computer-Assisted Instruction, the Challenge.

    Science.gov (United States)

    Chalmers, P. A.

    2000-01-01

    Identifies user interface problems as they relate to computer-assisted instruction (CAI); reviews the learning theories and instructional theories related to CAI user interface; and presents potential CAI user interface improvements for research and development based on learning and instructional theory. Focuses on screen design improvements.…

  3. A CAI in the Ivuna CI1 Chondrite

    Science.gov (United States)

    Frank, David R.; Zolensky, M.; Martinez, J.; Mikouchi, T.; Ohsumi, K.; Hagiya, K.; Satake, W.; Le, L.; Ross, D.; Peslier, A.

    2011-01-01

    We have recently discovered the first well-preserved calcium aluminum-rich inclusion (CAI) in a CI1 chondrite (Ivuna). Previously, all CI1 chondrites were thought to be devoid of preserved CAI and chondrules due to the near total aqueous alteration to which their parent body (bodies) have been subjected. The CAI is roughly spherical, but with a slight teardrop geometry and a maximum diameter of 170 microns (fig. 1). It lacks any Wark-Lovering Rim. Incipient aqueous alteration, and probably shock, have rendered large portions of the CAI poorly crystalline. It is extremely fine-grained, with only a few grains exceeding 10 microns. We have performed electron microprobe analyses (EPMA), FEG-SEM imaging and element mapping, as well as electron back-scattered diffraction (EBSD) and synchrotron X-ray diffraction (SXRD) in order to determine the fundamental characteristics of this apparently unique object.

  4. A Petaflops Era Computing Analysis

    Science.gov (United States)

    Preston, Frank S.

    1998-01-01

    This report covers a study of the potential for petaflops (10^15 floating point operations per second) computing. This study was performed within the year 1996 and should be considered as the first step in an ongoing effort. The analysis concludes that a petaflop system is technically feasible but not feasible with today's state-of-the-art. Since the computer arena is now a commodity business, most experts expect that a petaflops system will evolve from current technology in an evolutionary fashion. To meet the price expectations of users waiting for petaflop performance, great improvements in lowering component costs will be required. Lower power consumption is also a must. The present rate of progress in improved performance places the date of introduction of petaflop systems at about 2010. Several years before that date, it is projected that chip feature sizes will reach the currently known resolution limit. Aside from the economic problems and constraints, software is identified as the major problem. The tone of this initial study is more pessimistic than most of the published material available on petaflop systems. Workers in the field are expected to generate more data which could serve to provide a basis for a more informed projection. This report includes an annotated bibliography.

  5. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components, and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
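
    A minimal sketch of that solution-vector scheme, with invented component names and a toy outlet function (the real PCTAP is C++ and models the heat transfer in detail):

      # Sketch of a PCTAP-style transient loop (illustrative only).
      class Component:
          def __init__(self, name, upstream=None):
              self.name = name
              self.upstream = upstream      # component feeding this one's inlet
              self.outlet_temp = 293.0      # K, initial condition

          def update(self, dt):
              # Toy outlet function: relax toward the upstream outlet state.
              if self.upstream is not None:
                  self.outlet_temp += 0.1 * dt * (self.upstream.outlet_temp - self.outlet_temp)

      def build_solution_vector(components):
          """Order components so each appears after the one feeding its inlet."""
          ordered, placed = [], set()
          while len(ordered) < len(components):
              for c in components:
                  if c.name not in placed and (c.upstream is None or c.upstream.name in placed):
                      ordered.append(c)
                      placed.add(c.name)
          return ordered

      tank = Component("tank")
      tank.outlet_temp = 310.0
      tube = Component("tube", upstream=tank)
      coldplate = Component("coldplate", upstream=tube)
      solution_vector = build_solution_vector([coldplate, tube, tank])
      for step in range(100):               # time-step loop
          for comp in solution_vector:      # update in inlet-dependency order
              comp.update(dt=1.0)
      print(coldplate.outlet_temp)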

  6. Purification and Activity of Antibacterial Substances Derived from Soil Streptomyces sp.CaiF1

    Institute of Scientific and Technical Information of China (English)

    Hui YANG; Guixiang PENG; Jianmin ZENG; Zhiyuan TAN

    2012-01-01

    [Objective] This study aimed to separate and purify antibacterial substances from soil Streptomyces sp. CaiF1 and to explore the activities of these substances. [Method] The antibacterial substances were separated and purified by ethyl acetate extraction, macroporous adsorptive resin, silica gel chromatography and preparative high performance liquid chromatography (HPLC), and powdery mildew was used as the indicator organism to study their activities. [Result] The antibacterial substances were purified, and stability analysis of the extracts from the Streptomyces CaiF1 fermentation broth showed that they were very stable at pH 2.0-10.0 and 100 °C, and changed very little under UV treatment for 24 h. The inhibition rate against powdery mildew was 69.7%. [Conclusion] The purified antibacterial substances showed good stability, which provides a theoretical foundation for their structural identification and future applications.

  7. Relationship between Pre-Service Music Teachers' Personality and Motivation for Computer-Assisted Instruction

    Science.gov (United States)

    Perkmen, Serkan; Cevik, Beste

    2010-01-01

    The main purpose of this study was to examine the relationship between pre-service music teachers' personalities and their motivation for computer-assisted music instruction (CAI). The "Big Five" Model of Personality served as the framework. Participants were 83 pre-service music teachers in Turkey. Correlation analysis revealed that three…

  8. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…
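
    A hedged sketch of the random-lexical-substitution idea: generate sentences from phrase-structure rules by substituting random lexical items, so a grammar writer can inspect the output for errors. The rules and lexicon here are invented for illustration.

      import random

      rules = {                      # toy phrase-structure rules
          "S":  [["NP", "VP"]],
          "NP": [["Det", "N"]],
          "VP": [["V", "NP"], ["V"]],
      }
      lexicon = {                    # toy lexicon for random substitution
          "Det": ["the", "a"], "N": ["linguist", "grammar"], "V": ["tests", "parses"],
      }

      def generate(symbol="S"):
          if symbol in lexicon:
              return [random.choice(lexicon[symbol])]
          expansion = random.choice(rules[symbol])
          return [word for part in expansion for word in generate(part)]

      for _ in range(5):
          print(" ".join(generate()))   # inspect output to detect faulty rules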

  9. Computer Aided Data Analysis in Sociometry

    Science.gov (United States)

    Langeheine, Rolf

    1978-01-01

    A computer program which analyzes sociometric data is presented. The SDAS program provides classical sociometric analysis. Multi-dimensional scaling and cluster analysis techniques may be combined with the MSP program. (JKS)

  10. Compupoem: CAI for Writing and Studying Poetry.

    Science.gov (United States)

    Marcus, Stephen

    1982-01-01

    Describes a computer program that prompts the user for different parts of speech and formats the words in a haiku-like poetic structure. (Available from "The Computing Teacher," Department of Computer and Information Science, University of Oregon, Eugene, OR 97403.) (AEA)
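
    A minimal sketch of the Compupoem idea under stated assumptions (the original program's exact prompts and layout are not documented here): prompt the user for parts of speech, then format the words in a haiku-like shape.

      def compupoem():
          noun = input("Give me a noun: ").strip()
          adjectives = [a.strip() for a in input("Two adjectives (comma-separated): ").split(",")]
          verbs = [v.strip() for v in input("Three -ing verbs (comma-separated): ").split(",")]
          place = input("A place: ").strip()
          # Haiku-like layout: line breaks carry the poem's rhythm.
          print(noun)
          print(f"{adjectives[0]}, {adjectives[1]}")
          print(f"{verbs[0]}, {verbs[1]}, {verbs[2]}")
          print(f"in {place}")

      compupoem()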

  11. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  12. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  13. Adjustment computations spatial data analysis

    CERN Document Server

    Ghilani, Charles D

    2011-01-01

    The complete guide to adjusting for measurement error, expanded and updated. No measurement is ever exact. Adjustment Computations updates a classic, definitive text on surveying with the latest methodologies and tools for analyzing and adjusting errors, with a focus on least squares adjustment, the most rigorous methodology available and the one on which accuracy standards for surveys are based. This extensively updated Fifth Edition shares new information on advances in modern software and GNSS-acquired data. Expanded sections offer a greater number of computable problems and their worked solutions.
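
    A small sketch of the weighted least-squares adjustment at the heart of such texts (a generic illustration with invented observations, not the book's own code): solve the normal equations A^T W A x = A^T W b for the parameters, then compute residuals.

      import numpy as np

      # Toy network: observations b = A x + v with weight matrix W.
      A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0]])  # design matrix
      b = np.array([10.02, 4.99, 5.05])                    # observations
      W = np.diag([1.0, 1.0, 0.5])                         # weights

      N = A.T @ W @ A                     # normal matrix
      x = np.linalg.solve(N, A.T @ W @ b) # adjusted parameters
      v = b - A @ x                       # residuals
      print("parameters:", x, "residuals:", v)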

  14. Phenotypic diversity and correlation between white-opaque switching and the CAI microsatellite locus in Candida albicans.

    Science.gov (United States)

    Hu, Jian; Guan, Guobo; Dai, Yu; Tao, Li; Zhang, Jianzhong; Li, Houmin; Huang, Guanghua

    2016-08-01

    Candida albicans is a commensal fungal pathogen that is often found as part of the human microbial flora. The aim of the present study was to establish a relationship between diverse genotypes and phenotypes of clinical isolates of C. albicans. In total, 231 clinical isolates were collected and used for genotyping and phenotypic switching analysis. Based on the microsatellite locus (CAI) genotyping assay, 65 different genotypes were identified, and some dominant types were found in certain human niches. For example, the genotypes 30-44 and 30-45 were enriched in vaginal infection samples. C. albicans has a number of morphological forms, including single-celled yeasts, multicellular filaments, and white and opaque cell types. The relationship between the CAI genotype and the ability to undergo phenotypic switching was examined in the clinical isolates. We found that strains with longer CAA/G repeats in both alleles of the CAI locus were more opaque competent. We also discovered that some MTL heterozygous (a/alpha) isolates could undergo white-opaque switching when grown on regular culture medium (containing glucose as the sole carbon source). Our study establishes a link between phenotypic switching and genotypes of the CAI microsatellite locus in clinical isolates of C. albicans.

  15. Cluster analysis for computer workload evaluation

    CERN Document Server

    Landau, K

    1976-01-01

    An introduction to computer workload analysis is given, showing its range of application in computer centre management, system and application programming. Cluster methods which can be used in conjunction with workload data are discussed, and cluster algorithms are adapted to the specific problem set. Several samples of CDC 7600 accounting data collected at CERN, the European Organization for Nuclear Research, underwent a cluster analysis to determine job groups. The conclusions from the resource usage of typical job groups in relation to computer workload analysis are discussed. (17 refs).
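
    A hedged sketch of workload clustering of the kind described (invented feature values and scikit-learn's k-means as assumptions, not CERN's CDC 7600 data or algorithm): cluster jobs by resource-usage features to find job groups.

      import numpy as np
      from sklearn.cluster import KMeans

      # Each row is one job: [CPU seconds, memory (kwords), I/O operations]
      jobs = np.array([
          [1200, 40, 5000], [1100, 38, 5200], [15, 8, 300],
          [20, 10, 250],    [600, 120, 9000], [580, 110, 8700],
      ])
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(jobs)
      print(labels)   # job-group assignment for each job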

  16. Prof. Cai Shuming Receives 2005 Wetland Conservation Award

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    @@ Prof. Cai Shuming, an expert in wetland studies from the CAS Institute of Geodesy & Geophysics, has been honored with a Ramsar Wetland Conservation Award in 2005. The announcement was made by the Standing Committee of the Ramsar Convention on June 10 in Gland,Switzerland.

  17. Error Analysis In Computational Elastodynamics

    Science.gov (United States)

    Mukherjee, Somenath; Jafarali, P.; Prathap, Gangan

    The Finite Element Method (FEM) is the mathematical tool of the engineers and scientists to determine approximate solutions, in a discretised sense, of the concerned differential equations, which are not always amenable to closed form solutions. In this presentation, the mathematical aspects of this powerful computational tool as applied to the field of elastodynamics have been highlighted, using the first principles of virtual work and energy conservation.

  18. Distribution and Origin of 36Cl In Allende CAIs

    Energy Technology Data Exchange (ETDEWEB)

    Matzel, J P; Jacobsen, B; Hutcheon, I D; Krot, A N; Nagashima, K; Yin, Q; Ramon, E C; Weber, P; Wasserburg, G J

    2009-12-11

    The abundance of short-lived radionuclides (SLRs) in early solar system materials provides key information about their nucleosynthetic origin and can constrain the timing of early solar system events. Excesses of 36S (36S*) correlated with 35Cl/34S ratios provide direct evidence for in situ decay of 36Cl (t1/2 ≈ 0.3 Ma) and have been reported in sodalite (Na8Al6Si6O24Cl2) and wadalite (Ca6Al5Si2O16Cl3) in CAIs and chondrules from the Allende and Ningqiang CV carbonaceous chondrites. While previous studies demonstrate unequivocally that 36Cl was extant in the early solar system, no consensus on the origin or initial abundance of 36Cl has emerged. Understanding the origin of 36Cl, as well as the reported variation in the initial 36Cl/35Cl ratio, requires addressing when, where and how chlorine was incorporated into CAIs and chondrules. These factors are key to distinguishing between stellar nucleosynthesis and energetic particle irradiation as the origin of 36Cl. Wadalite is a chlorine-rich secondary mineral with structural and chemical affinities to grossular. The high chlorine (≈12 wt%) and very low sulfur content (<<0.01 wt%) make wadalite ideal for studies of the 36Cl-36S system. Wadalite is present in Allende CAIs exclusively in the interior regions, either in veins crosscutting melilite or in zones between melilite and anorthite associated with intergrowths of grossular, monticellite, and wollastonite. Wadalite and sodalite most likely resulted from open-system alteration of primary minerals by a chlorine-rich fluid phase. We recently reported large 36S* correlated with 35Cl/34S in wadalite in Allende Type B CAI AJEF, yielding a (36Cl/35Cl)0 ratio of (1.7 ± 0.3) × 10-5. This value is the highest reported 36Cl/35Cl ratio and is ≈5 times
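
    The in situ decay argument rests on the standard fossil-isochron relation, written here in textbook form (not quoted from the paper):

        \[ \left(\frac{^{36}\mathrm{S}}{^{34}\mathrm{S}}\right)_{\mathrm{measured}} = \left(\frac{^{36}\mathrm{S}}{^{34}\mathrm{S}}\right)_{0} + \left(\frac{^{36}\mathrm{Cl}}{^{35}\mathrm{Cl}}\right)_{0} \cdot \frac{^{35}\mathrm{Cl}}{^{34}\mathrm{S}}, \]

    so a linear correlation of 36S/34S with 35Cl/34S identifies in situ 36Cl decay, and its slope gives the initial (36Cl/35Cl)0 ratio.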

  19. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  20. Characterization of Meteorites by Focused Ion Beam Sectioning: Recent Applications to CAIs and Primitive Meteorite Matrices

    Science.gov (United States)

    Christoffersen, Roy; Keller, Lindsay P.; Han, Jangmi; Rahman, Zia; Berger, Eve L.

    2015-01-01

    Focused ion beam (FIB) sectioning has revolutionized preparation of meteorite samples for characterization by analytical transmission electron microscopy (TEM) and other techniques. Although FIB is not "non-destructive" in the purest sense, each extracted section amounts to no more than nanograms (approximately 500 cubic microns) removed intact from locations precisely controlled by SEM imaging and analysis. Physical alteration of surrounding material by ion damage, fracture or sputter contamination effects is localized to within a few micrometers around the lift-out point. This leaves adjacent material intact for coordinate geochemical analysis by SIMS, microdrill extraction/TIMS and other techniques. After lift out, FIB sections can be quantitatively analyzed by electron microprobe prior to final thinning, synchrotron x-ray techniques, and by the full range of state-of-the-art analytical field-emission scanning transmission electron microscope (FE-STEM) techniques once thinning is complete. Multiple meteorite studies supported by FIB/FE-STEM are currently underway at NASA-JSC, including coordinated analysis of refractory phase assemblages in CAIs and fine-grained matrices in carbonaceous chondrites. FIB sectioning of CAIs has uncovered epitaxial and other overgrowth relations between corundum-hibonite-spinel consistent with hibonite preceding corundum and/or spinel in non-equilibrium condensation sequences at combinations of higher gas pressures, dust-gas enrichments or significant nebular transport. For all of these cases, the ability of FIB to allow for coordination with spatially-associated isotopic data by SIMS provides immense value for constraining the formation scenarios of the particular CAI assemblage. For carbonaceous chondrites matrix material, FIB has allowed us to obtain intact continuous sections of the immediate outer surface of Murchison (CM2) after it has been experimentally ion processed to simulate solar wind space weathering. The surface

  1. A Project to Develop and Evaluate a Computerized System for Instructional Response Analysis; Project SIRA. Final Report.

    Science.gov (United States)

    Easley, J.A., Jr.

    Project SIRA (System for Instructional Response Analysis) used a systems approach to develop a complete range of programs and techniques both for evaluation of student performance and for evaluation and revision of computer-assisted instruction (CAI) lesson material. By use of the PLATO computer-based instructional hardware system at the…

  2. New Study Says CAI May Favor Introverts.

    Science.gov (United States)

    Hopmeier, George

    1981-01-01

    A personality research study using the Myers-Briggs Type Indicator indicates that computer-assisted instruction programs favor introverts, i.e., those learners who can concentrate on details, memorize facts, and stay with a task until it is completed. (JJD)

  3. Design Guidelines for CAI Authoring Systems.

    Science.gov (United States)

    Hunka, S.

    1989-01-01

    Discussion of the use of authoring systems for courseware development focuses on guidelines to be considered when designing authoring systems. Topics discussed include allowing a variety of instructional strategies; interaction with peripheral processes such as student records; the editing process; and human factors in computer interface design,…

  4. Computer Programs for Settlement Analysis.

    Science.gov (United States)

    1980-10-01

    This report was written by Mr. Mosher and Dr. Radhakrishnan. Mr. D. L. Neumann was Chief of the ADP Center. COL J. L. Cannon, CE, and COL N. P... ITERMX=100: if the number of iterations in the Gauss-Seidel subroutine reaches 100, the calculations will be aborted and the analysis stops.
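
    A hedged sketch of a Gauss-Seidel iteration with an ITERMX-style abort, as referenced above (illustrative Python with an invented test system, not the report's FORTRAN):

      import numpy as np

      def gauss_seidel(A, b, tol=1e-8, itermx=100):
          """Solve A x = b; abort if the iteration count reaches itermx."""
          n = len(b)
          x = np.zeros(n)
          for iteration in range(1, itermx + 1):
              x_old = x.copy()
              for i in range(n):
                  s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
                  x[i] = (b[i] - s) / A[i, i]
              if np.max(np.abs(x - x_old)) < tol:
                  return x, iteration
          raise RuntimeError("ITERMX reached: analysis stops")  # mirrors the report's abort

      A = np.array([[4.0, 1.0], [2.0, 5.0]])   # diagonally dominant, so convergent
      b = np.array([1.0, 2.0])
      print(gauss_seidel(A, b))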

  5. A Comparative Evaluation of Computer Based and Non-Computer Based Instructional Strategies.

    Science.gov (United States)

    Emerson, Ian

    1988-01-01

    Compares the computer assisted instruction (CAI) tutorial with its non-computerized pedagogical roots: the Socratic Dialog with Skinner's Programmed Instruction. Tests the effectiveness of a CAI tutorial on diffusion and osmosis against four other interactive and non-interactive instructional strategies. Notes computer based strategies were…

  6. Effectiveness of Computer-Based Education in Elementary Schools.

    Science.gov (United States)

    Kulik, James A.; And Others

    1985-01-01

    This meta-analysis of 32 comparative studies shows that computer-based education has generally had positive effects on the achievement of elementary school pupils. However, these effects are different for off-line computer managed instruction and interactive computer assisted instruction (CAI); interactive CAI produces greater increases in student…

  7. The Effects of Computer-Assisted Instruction Based on Top-Level Structure Method in English Reading and Writing Abilities of Thai EFL Students

    Science.gov (United States)

    Jinajai, Nattapong; Rattanavich, Saowalak

    2015-01-01

    This research aims to study the development of ninth grade students' reading and writing abilities and interests in learning English taught through computer-assisted instruction (CAI) based on the top-level structure (TLS) method. An experimental group time series design was used, and the data was analyzed by multivariate analysis of variance…

  8. DFT computational analysis of piracetam

    Science.gov (United States)

    Rajesh, P.; Gunasekaran, S.; Seshadri, S.; Gnanasambandan, T.

    2014-11-01

    Density functional theory calculations with B3LYP using the 6-31G(d,p) and 6-31++G(d,p) basis sets have been used to determine ground state molecular geometries. The first order hyperpolarizability (β0) and related properties (β, α0 and Δα) of piracetam are calculated using the B3LYP/6-31G(d,p) method in the finite-field approach. The stability of the molecule has been analyzed using NBO/NLMO analysis. The calculation of the first hyperpolarizability shows that the molecule is an attractive candidate for future applications in non-linear optics. The molecular electrostatic potential (MEP) at a point in the space around a molecule gives an indication of the net electrostatic effect produced at that point by the total charge distribution of the molecule. The calculated HOMO and LUMO energies show that charge transfer occurs within these molecules. A Mulliken population analysis of atomic charges is also presented. On the basis of the vibrational analysis, the thermodynamic properties of the title compound at different temperatures have been calculated. Finally, the UV-Vis spectra and electronic absorption properties are explained and illustrated from the frontier molecular orbitals.

  9. Computer Graphics for System Effectiveness Analysis.

    Science.gov (United States)

    1986-05-01

    Scanned-report residue; the recoverable content includes a partial reference to Chapra, Steven C., and Raymond P. Canale (1985), Numerical Methods for Engineers with Personal Computer Applications, New York, table-of-contents fragments for Chapter II, Method of Analysis (2.1 Introduction, systems effectiveness), and a note that Chapter VII summarizes the results and gives recommendations for future research.

  10. Automated COBOL code generation for SNAP-I CAI development and maintenance procedures

    Energy Technology Data Exchange (ETDEWEB)

    Buhrmaster, M.A.; Duncan, L.D.; Hume, R.; Huntley, A.F.

    1988-07-01

    In designing and implementing a computer aided instruction (CAI) prototype for the Navy Management System Support Office (NAVMASSO) as part of the Shipboard Nontactical ADP Program (SNAP), Data Systems Engineering Organization (DSEO) personnel developed techniques for automating the production of COBOL source code for CAI applications. This report discusses the techniques applied, which incorporate the use of a database management system (DBMS) to store, access, and manipulate the data necessary for producing COBOL source code automatically. The objective for developing the code generation techniques is to allow for the production of future applications in an efficient and reliable manner. This report covers the standards and conventions defined, database tables created, and the host language interface program used for generating COBOL source files. The approach is responsible for producing 85 percent of an 830,000 line COBOL application, in approximately one year's time. This code generation program generated transaction processing routines to be executed under the DM6TP NAVMASSO distributed processing environment on the Honeywell DPS-6 minicomputers, representing the standard SNAP-I environment.

  11. Computer Auxiliary Analysis for Stochasticity of Chaos

    Institute of Scientific and Technical Information of China (English)

    ZHAO Geng; FANG Jin-qing

    2003-01-01

    In this work, we propose a mathematical-physical statistical analysis method for the stochastic process of chaos, based on a stochasticity test combining Monobit and Runs measurements. Computer-aided analysis shows that the numbers produced by the chaotic circuit are stochastic. Our software is written in VB and C++; the latter can be tested by the former, and at the same time it is verified against random numbers produced by the computer. The data treatment results are therefore reliable.
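
    A hedged sketch of a combined Monobit/Runs stochasticity check in the spirit of the abstract, following the standard NIST SP 800-22 formulas rather than the authors' VB/C++ code:

      import math

      def monobit_p(bits):
          s = abs(sum(2*b - 1 for b in bits)) / math.sqrt(len(bits))
          return math.erfc(s / math.sqrt(2))

      def runs_p(bits):
          n, pi = len(bits), sum(bits) / len(bits)
          if abs(pi - 0.5) >= 2 / math.sqrt(n):   # frequency prerequisite must hold first
              return 0.0
          v = 1 + sum(bits[i] != bits[i+1] for i in range(n - 1))  # number of runs
          return math.erfc(abs(v - 2*n*pi*(1 - pi)) / (2*math.sqrt(2*n)*pi*(1 - pi)))

      bits = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0] * 8
      print(monobit_p(bits), runs_p(bits))   # p-values >= 0.01 are consistent with randomness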

  12. From the Bollettino del CAI to the journal Le Alpi

    Directory of Open Access Journals (Sweden)

    Jean-Paul Zuanon

    2001-05-01

    Full Text Available Like the other alpine clubs founded in the same period, the CAI quickly equipped itself with communication tools to establish a link between its members and to make known its philosophy of mountain practice. Founded in 1865, the quarterly bulletin was complemented by the Rivista mensile in 1881. The CAI's house organ is thus nearly 140 years old. It is a faithful reflection of the evolution of the club and of the great debates that have animated it. Without claiming to retrace this long history,...

  13. Computer Language Efficiency via Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Andrea Ellero

    2011-01-01

    Full Text Available The selection of the computer language to adopt is usually driven by intuition and expertise, since it is very difficult to compare languages taking into account all their characteristics. In this paper, we analyze the efficiency of programming languages through Data Envelopment Analysis. We collected the input data from The Computer Language Benchmarks Game: we consider a large set of languages in terms of computational time, memory usage, and source code size. Various benchmark problems are tackled. We analyze the results first of all considering programming languages individually. Then, we evaluate families of them sharing some characteristics, for example, being compiled or interpreted.
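
    A hedged sketch of the input-oriented CCR multiplier model that DEA uses (a textbook formulation with invented benchmark numbers, not the paper's dataset): for each language, maximize its weighted output subject to unit weighted input and no language scoring above 1.

      import numpy as np
      from scipy.optimize import linprog

      # Inputs per language: [time (s), memory (MB), code size (lines)]; output: 1 task solved.
      X = np.array([[2.0, 50.0, 120.0], [10.0, 30.0, 40.0], [5.0, 40.0, 60.0]])
      Y = np.ones((3, 1))

      def ccr_efficiency(k):
          n_in, n_out = X.shape[1], Y.shape[1]
          c = np.concatenate([-Y[k], np.zeros(n_in)])               # maximize u.y_k
          A_ub = np.hstack([Y, -X])                                 # u.y_j - v.x_j <= 0
          b_ub = np.zeros(X.shape[0])
          A_eq = np.concatenate([np.zeros(n_out), X[k]])[None, :]   # v.x_k = 1
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
          return -res.fun

      for k in range(3):
          print(f"language {k}: efficiency = {ccr_efficiency(k):.3f}")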

  14. Evidence for an early nitrogen isotopic evolution in the solar nebula from volatile analyses of a CAI from the CV3 chondrite NWA 8616

    Science.gov (United States)

    Füri, Evelyn; Chaussidon, Marc; Marty, Bernard

    2015-03-01

    Nitrogen and noble gas (Ne-Ar) abundances and isotope ratios, determined by CO2 laser extraction static mass spectrometry analysis, as well as Al-Mg and O isotope data from secondary ion mass spectrometry (SIMS) analyses, are reported for a type B calcium-aluminum-rich inclusion (CAI) from the CV3 chondrite NWA 8616. The high (26Al/27Al)i ratio of (5.06 ± 0.50) × 10-5 dates the last melting event of the CAI at 39 (+109/-99) ka after "time zero", limiting the period during which high-temperature exchanges between the CAI and the nebular gas could have occurred to a very short time interval. Partial isotopic exchange with a 16O-poor reservoir resulted in Δ17O > -5‰ for melilite and anorthite, whereas spinel and Al-Ti-pyroxene retain the inferred original 16O-rich signature of the solar nebula (Δ17O ⩽ -20‰). The low 20Ne/22Ne (⩽0.83) and 36Ar/38Ar (⩽0.75) ratios of the CAI rule out the presence of any trapped planetary or solar noble gases. Cosmogenic 21Ne and 38Ar abundances are consistent with a cosmic ray exposure (CRE) age of ∼14 to 20 Ma, assuming CR fluxes similar to modern ones, without any evidence for pre-irradiation of the CAI before incorporation into the meteorite parent body. Strikingly, the CAI contains 1.4-3.4 ppm N with a δ15N value of +8‰ to +30‰. Even after correcting the measured δ15N values for cosmogenic 15N produced in situ, the CAI is highly enriched in 15N compared to the protosolar nebula (δ15NPSN = -383 ± 8‰; Marty et al., 2011), implying that the CAI-forming region was contaminated by 15N-rich material within the first 0.15 Ma of Solar System history, or, alternatively, that the CAI was ejected into the outer Solar System where it interacted with a 15N-rich reservoir.
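
    The quoted formation interval follows from the standard 26Al-26Mg chronometer; in textbook form (with the canonical solar ratio and the 26Al half-life as assumed inputs):

        \[ \Delta t = \frac{1}{\lambda_{26}} \ln\frac{(^{26}\mathrm{Al}/^{27}\mathrm{Al})_{\mathrm{canonical}}}{(^{26}\mathrm{Al}/^{27}\mathrm{Al})_{i}}, \qquad \lambda_{26} = \frac{\ln 2}{t_{1/2}}, \quad t_{1/2} \approx 0.72\ \mathrm{Myr}. \]

    Taking a canonical ratio of ≈5.25 × 10-5 (an assumed input) and the measured (5.06 ± 0.50) × 10-5 gives Δt ≈ 38 ka, consistent with the 39 ka figure above.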

  15. ASTEC: Controls analysis for personal computers

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  16. The computer in shell stability analysis

    Science.gov (United States)

    Almroth, B. O.; Starnes, J. H., Jr.

    1975-01-01

    Some examples in which the high-speed computer has been used to improve the static stability analysis capability for general shells are examined. The fundamental concepts of static stability are reviewed with emphasis on the differences between linear bifurcation buckling and nonlinear collapse. The analysis is limited to the stability of conservative systems. Three examples are considered. The problem of cylinders subjected to bending loads is used as an example to illustrate that a simple structure can have a sufficiently complicated nonlinear behavior to require a computer analysis for accurate results. An analysis of the problems involved in the modeling of stiffening elements in plate and shell structures illustrates the necessity that the analyst recognizes all important deformation modes. The stability analysis of the Skylab structure indicates the size of problems that can be solved with current state-of-the-art capability.

  17. Silicon Isotopic Fractionation of CAI-like Vacuum Evaporation Residues

    Energy Technology Data Exchange (ETDEWEB)

    Knight, K; Kita, N; Mendybaev, R; Richter, F; Davis, A; Valley, J

    2009-06-18

    Calcium-aluminum-rich inclusions (CAIs) are often enriched in the heavy isotopes of magnesium and silicon relative to bulk solar system materials. It is likely that these isotopic enrichments resulted from evaporative mass loss of magnesium and silicon from early solar system condensates while they were molten during one or more high-temperature reheating events. Quantitative interpretation of these enrichments requires laboratory determinations of the evaporation kinetics and associated isotopic fractionation effects for these elements. The experimental data for the kinetics of evaporation of magnesium and silicon and the evaporative isotopic fractionation of magnesium are reasonably complete for Type B CAI liquids (Richter et al., 2002, 2007a). However, the isotopic fractionation factor for silicon evaporating from such liquids has not been as extensively studied. Here we report new ion microprobe silicon isotopic measurements of residual glass from partial evaporation of Type B CAI liquids into vacuum. The silicon isotopic fractionation is reported as a kinetic fractionation factor, αSi, corresponding to the ratio of the silicon isotopic composition of the evaporation flux to that of the residual silicate liquid. For CAI-like melts, we find that αSi = 0.98985 ± 0.00044 (2σ) for 29Si/28Si, with no resolvable variation with temperature over the temperature range of the experiments, 1600-1900 °C. This value is different from what has been reported for evaporation of liquid Mg2SiO4 (Davis et al., 1990) and of a melt with CI chondritic proportions of the major elements (Wang et al., 2001). There appears to be some compositional control on αSi, whereas no compositional effects have been reported for αMg. We use the values of αSi and αMg to calculate the chemical compositions of the unevaporated precursors of a number of isotopically fractionated CAIs from CV chondrites whose
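
    The fractionation factor enters through the standard Rayleigh relation (a textbook form, not taken from the paper): for a fraction f of the silicon remaining in the residue,

        \[ \frac{R}{R_{0}} = f^{\,\alpha_{\mathrm{Si}} - 1}, \qquad \delta^{29}\mathrm{Si} \approx 1000\,(\alpha_{\mathrm{Si}} - 1)\ln f \ \ (\text{per mil}), \]

    so with αSi = 0.98985, evaporating half the silicon (f = 0.5) enriches the residue by about +7 per mil in 29Si/28Si.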

  18. An investigation of CAI teaching methods in an electronics course

    Science.gov (United States)

    Wood, Kenneth W.

    1982-08-01

    Computers are increasingly being used in the classroom. An investigation of several educational techniques in a computer-based version of an electronics course is reported. We found that, with a lesson for teaching virtual equality, students learned faster when using a general to specific approach. Students using a simulation of a Schmitt trigger before a qualitative analysis of the circuit performed the analysis faster and with less difficulty than the group performing the analysis first and then exploring the circuit with the simulation. Given a sizable amount of optional material in a computer lesson, most of the electronics students used all of the optional material.

  19. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  20. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  1. The Intelligent CAI System for Chemistry Based on Automated Reasoning

    Institute of Scientific and Technical Information of China (English)

    王晓京; 张景中

    1999-01-01

    A new type of intelligent CAI system for chemistry, based on automated reasoning with chemistry knowledge, is developed in this paper. The system has shown its ability to solve chemistry problems and to assist students and teachers in study and instruction through its automated reasoning functions. Its open knowledge base and its unique style of human-system interface provide more opportunities for users to acquire living knowledge through active participation. The automated reasoning based on basic chemistry knowledge also opens a new approach to information storage and management in ICAI systems for the sciences.

  2. Numerical Investigation Into Effect of Fuel Injection Timing on CAI/HCCI Combustion in a Four-Stroke GDI Engine

    Science.gov (United States)

    Cao, Li; Zhao, Hua; Jiang, Xi; Kalian, Navin

    2006-02-01

    The Controlled Auto-Ignition (CAI) combustion, also known as Homogeneous Charge Compression Ignition (HCCI), was achieved by trapping residuals with early exhaust valve closure in conjunction with direct injection. Multi-cycle 3D engine simulations have been carried out for parametric study on four different injection timings in order to better understand the effects of injection timings on in-cylinder mixing and CAI combustion. The full engine cycle simulation including complete gas exchange and combustion processes was carried out over several cycles in order to obtain the stable cycle for analysis. The combustion models used in the present study are the Shell auto-ignition model and the characteristic-time combustion model, which were modified to take the high level of EGR into consideration. A liquid sheet breakup spray model was used for the droplet breakup processes. The analyses show that the injection timing plays an important role in affecting the in-cylinder air/fuel mixing and mixture temperature, which in turn affects the CAI combustion and engine performance.

  3. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... COMMISSION Metal Fatigue Analysis Performed by Computer Software AGENCY: Nuclear Regulatory Commission... applicants' analyses and methodologies using the computer software package WESTEMS(TM) to demonstrate... by Computer Software. Addressees: All holders of, and applicants for, a power reactor operating...

  4. Computational analysis of aircraft pressure relief doors

    Science.gov (United States)

    Schott, Tyler

    Modern trends in commercial aircraft design have sought to improve fuel efficiency while reducing emissions by operating at higher pressures and temperatures than ever before. Consequently, greater demands are placed on the auxiliary bleed air systems used for a multitude of aircraft operations. The increased role of bleed air systems poses significant challenges for the pressure relief system to ensure the safe and reliable operation of the aircraft. The core compartment pressure relief door (PRD) is an essential component of the pressure relief system which functions to relieve internal pressure in the core casing of a high-bypass turbofan engine during a burst duct over-pressurization event. The successful modeling and analysis of a burst duct event are imperative to the design and development of PRD's to ensure that they will meet the increased demands placed on the pressure relief system. Leveraging high-performance computing coupled with advances in computational analysis, this thesis focuses on a comprehensive computational fluid dynamics (CFD) study to characterize turbulent flow dynamics and quantify the performance of a core compartment PRD across a range of operating conditions and geometric configurations. The CFD analysis was based on a compressible, steady-state, three-dimensional, Reynolds-averaged Navier-Stokes approach. Simulations were analyzed, and results show that variations in freestream conditions, plenum environment, and geometric configurations have a non-linear impact on the discharge, moment, thrust, and surface temperature characteristics. The CFD study revealed that the underlying physics for this behavior is explained by the interaction of vortices, jets, and shockwaves. This thesis research is innovative and provides a comprehensive and detailed analysis of existing and novel PRD geometries over a range of realistic operating conditions representative of a burst duct over-pressurization event. Further, the study provides aircraft

  5. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works, as well as how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author’s website and SpringerLink.

  6. Structural basis of Na+-independent and cooperative substrate/product antiport in CaiT

    NARCIS (Netherlands)

    Schulze, Sabrina; Köster, Stefan; Geldmacher, Ulrike; Terwisscha van Scheltinga, Anke C.; Kühlbrandt, Werner

    2010-01-01

    Transport of solutes across biological membranes is performed by specialized secondary transport proteins in the lipid bilayer, and is essential for life. Here we report the structures of the sodium-independent carnitine/butyrobetaine antiporter CaiT from Proteus mirabilis (PmCaiT) at 2.3-Å and from

  7. Brief Introduction to the Foundation of CAI Shidong Award for Plasma Physics

    Institute of Scientific and Technical Information of China (English)

    SHENG Zhengming

    2010-01-01

    @@ The late Academician Professor CAI Shidong was an outstanding plasma physicist who made seminal contributions to both fundamental plasma theories and controlled thermonuclear fusion energy research. Professor CAI was also one of the pioneers of China's plasma physics research. In 1973, Professor CAI decided to leave the U.S. and return to China in order to help push forward plasma physics research in China. Professor CAI formed a research group consisting of young scientists and carried out high-level work in this important physics discipline. He worked tirelessly, set examples by his own deeds, and made outstanding contributions to plasma physics research, to educating younger generations of plasma physicists, and to establishing collaborations with plasma scientists in other Asian-African developing nations. In short, Professor CAI devoted the best years of his life to China's plasma physics research.

  8. FORTRAN computer program for seismic risk analysis

    Science.gov (United States)

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.

  9. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classifications in the contents of social sciences. It also covers various real-life examples such as t

  10. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. A computer virus model is established. Through analysis of the model, the disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
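
    A hedged sketch of a virus-propagation model of the kind analyzed: an SIR-type system with recruitment of new computers and removal of old ones. The parameter values are invented, and the paper's exact equations may differ.

      import numpy as np
      from scipy.integrate import odeint

      b, mu = 2.0, 0.01           # recruitment of new computers, removal of old ones
      beta, gamma = 0.001, 0.1    # infection rate, cure rate (antivirus software)

      def model(y, t):
          S, I, R = y
          dS = b - beta*S*I - mu*S
          dI = beta*S*I - gamma*I - mu*I
          dR = gamma*I - mu*R
          return [dS, dI, dR]

      t = np.linspace(0, 400, 2001)
      S, I, R = odeint(model, [200.0, 1.0, 0.0], t).T
      # Endemic vs. disease-free outcome depends on R0 = beta*b/(mu*(gamma+mu)).
      print("infected peak:", I.max())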

  11. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any such system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is a probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interfacing of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
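
    For the stochastic perturbation technique mentioned in (iii), the symbolic derivation typically starts from a second-order Taylor expansion of the response u about the mean b0 of the random parameter b (a generic form of the method, easily generated in a computer algebra system):

        \[ u(b) \approx u(b^{0}) + \left.\frac{\partial u}{\partial b}\right|_{b^{0}} \Delta b + \frac{1}{2}\left.\frac{\partial^{2} u}{\partial b^{2}}\right|_{b^{0}} \Delta b^{2}, \qquad E[u] \approx u(b^{0}) + \frac{1}{2}\left.\frac{\partial^{2} u}{\partial b^{2}}\right|_{b^{0}} \mathrm{Var}(b). \]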

  12. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  13. Pooled shRNA screenings: computational analysis.

    Science.gov (United States)

    Yu, Jiyang; Putcha, Preeti; Califano, Andrea; Silva, Jose M

    2013-01-01

    Genome-wide RNA interference screening has emerged as a powerful tool for functional genomic studies of disease-related phenotypes and the discovery of molecular therapeutic targets for human diseases. Commercial short hairpin RNA (shRNA) libraries are commonly used in this area, and state-of-the-art technologies including microarray and next-generation sequencing have emerged as powerful methods to analyze shRNA-triggered phenotypes. However, computational analysis of this complex data remains challenging due to noise and small sample size from such large-scaled experiments. In this chapter we discuss the pipelines and statistical methods of processing, quality assessment, and post-analysis for both microarray- and sequencing-based screening data.

  14. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed, assuming the blood flow as laminar with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software, coupled with Solidworks, a modeling software, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branched and angle-shaped, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.

  15. How CAI Can Correctly Play a Role in Teaching

    Institute of Scientific and Technical Information of China (English)

    苏醒

    2011-01-01

    CAI (Computer-Assisted Instruction) refers to transmitting information in teaching with computers and computer technology, in order to complete teaching tasks and achieve educational purposes. CAI can integrate animation, sound, text, etc., which not only helps teaching but also helps students form new ideas, new concepts and new methods in the learning process; it is a powerful tool and format for developing students' potential, intelligence and ability. However, many problems have appeared in actual use. Based on my personal teaching experience, the following is my view on how large a role CAI actually plays in classroom teaching.

  16. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  17. Sigal's Ineffective Computer-Based Practice of Arithmetic: A Case Study.

    Science.gov (United States)

    Hativa, Nira

    1988-01-01

    A student was observed practicing arithmetic with a computer-assisted instruction (CAI) system. She enjoyed practice and believed that it helped. However, she consistently failed to solve problems on the computer that she could do with pencil and paper. This paper suggests reasons for her problems and draws implications for CAI. (Author/PK)

  18. Computational analysis of unmanned aerial vehicle (UAV)

    Science.gov (United States)

    Abudarag, Sakhr; Yagoub, Rashid; Elfatih, Hassan; Filipovic, Zoran

    2017-01-01

    A computational analysis has been performed to verify the aerodynamic properties of an Unmanned Aerial Vehicle (UAV). The UAV-SUST has been designed and fabricated at the Department of Aeronautical Engineering at Sudan University of Science and Technology in order to meet the specifications required for surveillance and reconnaissance missions. It is classified as a medium range and medium endurance UAV. A commercial CFD solver is used to simulate steady and unsteady aerodynamic characteristics of the entire UAV. In addition to the Lift Coefficient (CL), Drag Coefficient (CD), Pitching Moment Coefficient (CM) and Yawing Moment Coefficient (CN), the pressure and velocity contours are illustrated. The aerodynamic parameters showed very good agreement with the design considerations at angles of attack ranging from zero to 26 degrees. Moreover, visualization of the velocity field and static pressure contours indicated satisfactory agreement with the proposed design. Turbulence is predicted by the K-ω SST turbulence model within the computational fluid dynamics code.

  19. Analysis of airways in computed tomography

    DEFF Research Database (Denmark)

    Petersen, Jens

    Chronic Obstructive Pulmonary Disease (COPD) is a major cause of death and disability world-wide. It affects lung function through destruction of lung tissue, known as emphysema, and inflammation of airways, leading to thickened airway walls and narrowed airway lumen. Computed Tomography (CT) imaging has become the standard with which to assess emphysema extent, but airway abnormalities have so far been more challenging to quantify. Automated methods for analysis are indispensable, as the visible airway tree in a CT scan can include several hundred individual branches. This work examined, among other things, the effect of the inspiration level at the time of scan on airway dimensions in subjects with and without COPD. The results show measured airway dimensions to be affected by differences in the level of inspiration, and this dependency is again influenced by COPD. Inspiration level should therefore be accounted for when measuring airways, and airway

  20. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
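
    A minimal sketch of the discrete event simulation idea (a single-server queue of service requests with exponential interarrival and service times; the real model's request types and resource constraints are richer):

      import heapq, random

      random.seed(0)
      ARRIVAL_RATE, SERVICE_RATE = 0.9, 1.0     # requests/s; single-server capacity
      events = [(random.expovariate(ARRIVAL_RATE), "arrival")]
      clock, busy_until, waits = 0.0, 0.0, []

      while events and clock < 10_000:
          clock, kind = heapq.heappop(events)   # advance to the next event
          if kind == "arrival":
              start = max(clock, busy_until)    # wait if the server is busy
              busy_until = start + random.expovariate(SERVICE_RATE)
              waits.append(start - clock)
              heapq.heappush(events, (clock + random.expovariate(ARRIVAL_RATE), "arrival"))

      print("mean wait:", sum(waits) / len(waits))   # compare provisioning scenarios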

  1. Can cloud computing benefit health services? - a SWOT analysis.

    Science.gov (United States)

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare.

  2. A computational design system for rapid CFD analysis

    Science.gov (United States)

    Ascoli, E. P.; Barson, S. L.; Decroix, M. E.; Sindir, Munir M.

    1992-01-01

    A computational design system (CDS) is described in which computational analysis tools are integrated in a modular fashion. This CDS ties together four key areas of computational analysis: description of geometry, grid generation, computational codes, and postprocessing. Integration of improved computational fluid dynamics (CFD) analysis tools through the CDS has made a significant positive impact on the use of CFD for engineering design problems. Complex geometries are now analyzed on a frequent basis and with greater ease.

  3. Schlieren sequence analysis using computer vision

    Science.gov (United States)

    Smith, Nathanial Timothy

    Computer vision-based methods are proposed for extraction and measurement of flow structures of interest in schlieren video. As schlieren data has increased with faster frame rates, we are faced with thousands of images to analyze. This presents an opportunity to study global flow structures over time that may not be evident from surface measurements. A degree of automation is desirable to extract flow structures and features to give information on their behavior through the sequence. Using an interdisciplinary approach, the analysis of large schlieren data is recast as a computer vision problem. The double-cone schlieren sequence is used as a testbed for the methodology; it is unique in that it contains 5,000 images, complex phenomena, and is feature rich. Oblique structures such as shock waves and shear layers are common in schlieren images. A vision-based methodology is used to provide an estimate of oblique structure angles through the unsteady sequence. The methodology has been applied to a complex flowfield with multiple shocks. A converged detection success rate between 94% and 97% for these structures is obtained. The modified curvature scale space is used to define features at salient points on shock contours. A challenge in developing methods for feature extraction in schlieren images is the reconciliation of existing techniques with features of interest to an aerodynamicist. Domain-specific knowledge of physics must therefore be incorporated into the definition and detection phases. Known location and physically possible structure representations form a knowledge base that provides a unique feature definition and extraction. Model tip location and the motion of a shock intersection across several thousand frames are identified, localized, and tracked. Images are parsed into physically meaningful labels using segmentation. Using this representation, it is shown that in the double-cone flowfield, the dominant unsteady motion is associated with large scale
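
    A hedged sketch of the oblique-structure angle estimate: a generic edge-detection plus Hough-transform pipeline with OpenCV, not the thesis's actual method. The synthetic frame stands in for real schlieren data.

      import cv2
      import numpy as np

      # Synthetic stand-in for a schlieren frame: a dark image with one oblique trace.
      frame = np.zeros((400, 400), dtype=np.uint8)
      cv2.line(frame, (50, 350), (350, 100), 255, 2)   # shock-like straight edge

      edges = cv2.Canny(frame, 50, 150)
      lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)
      if lines is not None:
          for rho, theta in lines[:, 0]:
              inclination = 90.0 - np.degrees(theta)   # relative to horizontal; sign depends on convention
              print(f"oblique structure at {inclination:.1f} deg")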

  4. Design of CAI Courseware Based on a Virtual Reality Mechanism

    Institute of Scientific and Technical Information of China (English)

    管群

    2001-01-01

    In this paper, the application features and significance of VR technology in the educational field are summarized. In particular, the design mechanism of CAI courseware for individualized instruction is studied, and a virtual reality mechanism is used to realize a learning-while-doing environment for the user in CAI courseware for computer application majors. The design theory, the technical approach, some of the algorithm flowcharts and the operating-exercise interface are given.

  5. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  6. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements in big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and applications to handling real-life problems. The applications are mostly drawn from real-life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques, along with some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the areas of big data analysis and cloud computing.

  7. 77 FR 9625 - Presentation of Final Conventional Conformance Test Criteria and Common Air Interface (CAI...

    Science.gov (United States)

    2012-02-17

    From the Federal Register Online via the Government Publishing Office. Department of Commerce, National Institute of Standards and Technology: Presentation of Final Conventional Conformance Test Criteria and Common Air Interface (CAI) Features/Functionalities Under Test in the Project 25...

  8. CAD/CAM/CAI Application for High-Precision Machining of Internal Combustion Engine Pistons

    Directory of Open Access Journals (Sweden)

    V. V. Postnov

    2014-07-01

    CAD/CAM/CAI application solutions for the machining of internal combustion engine pistons were analyzed. A low-volume production technology for internal combustion engine pistons was proposed. A fixture for a CNC turning center was designed.

  9. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  10. Computational Analysis of Pharmacokinetic Behavior of Ampicillin

    Directory of Open Access Journals (Sweden)

    Mária Ďurišová

    2016-07-01

    The objective of this study was to perform a computational analysis of the pharmacokinetic behavior of ampicillin, using data from the literature. A method based on the theory of dynamic systems was used for modeling purposes. The method was introduced to pharmacokinetics with the aim of contributing to the knowledge base in pharmacokinetics by providing a modeling method that enables researchers to develop mathematical models of various pharmacokinetic processes in an identical way, using identical model structures. A few examples of successful use of this modeling method in pharmacokinetics can be found in full-text articles available free of charge at the author's website, and in the example given in this study. The modeling method employed in this study can be used to develop a mathematical model of the pharmacokinetic behavior of any drug, provided that the pharmacokinetic behavior of the drug under study can be at least partially approximated using linear models.

  11. Computer assistance in clinical functional analysis.

    Science.gov (United States)

    Ahlers, M O; Jakstat, H A

    2002-10-01

    The use of computers in the dental practice has been primarily restricted to the acquisition of billing data. Additional possibilities for use of PCs exist in diagnostic data acquisition and evaluation; clinical functional analysis seems a particularly suitable application. Such software is now available: CMDfact. Dentally, it is based on a previously developed and published examination and documentation system, the graphic user interface of which is used in the newly developed software. After the examination data have been acquired by mouse click or numerical entry, these are available for evaluation. A special function, the "Diagnosis pilot" is integrated to support the user. This helps in the assignment of the appropriate "Initial diagnoses", since it brings together the individually existing principal symptoms and suitable diagnoses for the initial diagnosis in question and also states which diagnoses "would be appropriate" for this, but are not available. With 3D animation, the software also helps the dentist to explain aspects of CMD to patients. The software also assists the dentist with a detailed multimedia help system, which provides context-sensitive help for every examination step. These help functions explain the sense of the relevant examinations, their performance and evaluation in the form of short texts and explanatory photographs and videos.

  12. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    A prominent feature of Parkinson's disease (PD) is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment of PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.
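
    For readers unfamiliar with biochemical systems theory, the style of model referred to above represents each flux as a product of power laws. A toy single-pool sketch follows; all parameter values are invented and this is not the authors' dopamine model.

```python
# Toy power-law (S-system style) balance for one metabolite pool X:
# synthesis and degradation fluxes are each products of power laws.
from scipy.integrate import solve_ivp

alpha, g = 2.0, 0.5   # synthesis rate constant and kinetic order (invented)
beta, h = 1.0, 0.8    # degradation rate constant and kinetic order (invented)

def dxdt(t, x):
    return [alpha * x[0] ** g - beta * x[0] ** h]

sol = solve_ivp(dxdt, (0.0, 200.0), [0.1])
# Analytic steady state: (alpha / beta) ** (1 / (h - g)) ~ 10.08
print("steady state approx:", sol.y[0, -1])
```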

  13. Formation of refractory metal nuggets and their link to the history of CAIs

    Science.gov (United States)

    Schwander, D.; Kööp, L.; Berg, T.; Schönhense, G.; Heck, P. R.; Davis, A. M.; Ott, U.

    2015-11-01

    Ca, Al-rich inclusions (CAIs) often contain numerous refractory metal nuggets (RMNs), consisting of elements like Os, Ir, Mo, Pt and Ru. The nuggets are usually thought to have formed by equilibrium condensation from a gas of solar composition, simultaneously with or prior to oxide and silicate minerals. However, the exact mechanisms responsible for their extremely variable compositions, small sizes and associations with CAI minerals remain puzzling. Expanding on previous work on chemically separated RMNs, we have studied a large number of RMNs within their host CAIs from three different meteorite types, i.e., the highly primitive chondrite Acfer 094 (C2-ungrouped), Allende (CV3ox) and Murchison (CM2). Our results show several inconsistencies between the observed features and a direct condensation origin, including a lack of correlated abundance variations in the refractory metals that are expected from variations in condensation temperature. Instead, we show that most RMN features are consistent with RMN formation by precipitation from a CAI liquid enriched in refractory metals. This scenario is additionally supported by the common occurrence of RMNs in CAIs with clear melt crystallization textures as well as the occurrence of synthetic RMNs with highly variable compositions in run products from Schwander et al. (2015). In some cases, the sizes of meteoritic RMNs correlate with the sizes of their host minerals in CAIs, which indicates common cooling rates.

  14. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  15. CAI Design and Manufacture

    Institute of Scientific and Technical Information of China (English)

    汪岩; 汪鹰扬

    2011-01-01

    From the perspective of subject teachers, this paper discusses the development approach and production techniques of multimedia CAI courseware. Exploiting the characteristics and advantages of computer multimedia technology promotes the development of environmental education in higher education. Multimedia teaching is more vivid and its content richer, making environmental-education knowledge easier to absorb; multimedia courseware can use attractive graphical interfaces to raise users' interest in learning, and rich sound and video clips make learning more effective.

  16. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  17. Dynamic Stall Analysis Utilizing Interactive Computer Graphics

    Science.gov (United States)

    1988-03-01

    Blade-Vortex Interaction (BVI) studies. Solves the two-dimensional, unsteady, compressible Euler and Navier-Stokes equations in strong conservation... requirements, interactive computer graphics workstations have evolved to complement the supercomputer. Workstation capabilities, in terms of

  18. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous: they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As one of only a few researchers in that field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On the occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  19. Fault-Tolerant Postselected Quantum Computation: Threshold Analysis

    CERN Document Server

    Knill, E

    2004-01-01

    The schemes for fault-tolerant postselected quantum computation given in [Knill, Fault-Tolerant Postselected Quantum Computation: Schemes, http://arxiv.org/abs/quant-ph/0402171] are analyzed to determine their error-tolerance. The analysis is based on computer-assisted heuristics. It indicates that if classical and quantum communication delays are negligible, then scalable qubit-based quantum computation is possible with errors above 1% per elementary quantum gate.

  20. Introduction to numerical analysis and scientific computing

    CERN Document Server

    Nassif, Nabil

    2013-01-01

    Computer Number Systems and Floating Point Arithmetic: Introduction; Conversion from Base 10 to Base 2; Conversion from Base 2 to Base 10; Normalized Floating Point Systems; Floating Point Operations; Computing in a Floating Point System. Finding Roots of Real Single-Valued Functions: Introduction; How to Locate the Roots of a Function; The Bisection Method; Newton's Method; The Secant Method. Solving Systems of Linear Equations by Gaussian Elimination: Mathematical Preliminaries; Computer Storage for Matrices; Data Structures; Back Substitution for Upper Triangular Systems; Gauss Reduction; LU Decomposition. Polynomia...
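
    As a concrete taste of two of the root-finding chapters listed above, here is a minimal illustration (an independent sketch, not code from the book):

```python
# Minimal illustrations of two classic root-finding methods.
def bisection(f, a, b, tol=1e-10):
    """Assumes f(a) and f(b) have opposite signs."""
    while b - a > tol:
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0.0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)

def newton(f, df, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

f = lambda x: x * x - 2.0
print(bisection(f, 1.0, 2.0))             # ~ 1.4142135623
print(newton(f, lambda x: 2.0 * x, 1.0))  # converges to sqrt(2)
```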

  1. Granular computing analysis and design of intelligent systems

    CERN Document Server

    Pedrycz, Witold

    2013-01-01

    Information granules, as encountered in natural language, are implicit in nature. To make them fully operational so they can be effectively used to analyze and design intelligent systems, information granules need to be made explicit. An emerging discipline, granular computing focuses on formalizing information granules and unifying them to create a coherent methodological and developmental environment for intelligent system design and analysis. Granular Computing: Analysis and Design of Intelligent Systems presents the unified principles of granular computing along with its comprehensive algo

  2. Computational Notes on the Numerical Analysis of Galactic Rotation Curves

    CERN Document Server

    Scelza, G

    2014-01-01

    In this paper we present a brief discussion of the salient points of the computational analysis underlying the paper [StSc]. The computational and data analysis were performed with the Mathematica® software and presented at the Mathematica Italia User Group Meeting 2011.

  3. Computer-Assisted Linguistic Analysis of the Peshitta

    NARCIS (Netherlands)

    Roorda, D.; Talstra, Eep; Dyk, Janet; van Keulen, Percy; Sikkel, Constantijn; Bosman, H.J.; Jenner, K.D.; Bakker, Dirk; Volkmer, J.A.; Gutman, Ariel; van Peursen, Wido Th.

    2014-01-01

    CALAP (Computer-Assisted Linguistic Analysis of the Peshitta) was a joint research project of the Peshitta Institute Leiden and the Werkgroep Informatica at the Vrije Universiteit Amsterdam (1999-2005). CALAP concerned the computer-assisted analysis of the Peshitta to Kings (Janet Dyk and Percy van Keulen...

  4. Behavior computing modeling, analysis, mining and decision

    CERN Document Server

    2012-01-01

    Includes six case studies on behavior applications. Presents new techniques for capturing behavior characteristics in social media. First dedicated source of references for the theory and applications of behavior informatics and behavior computing.

  5. Computer analysis of slow vital capacity spirograms.

    Science.gov (United States)

    Primiano, F P; Bacevice, A E; Lough, M D; Doershuk, C F

    1982-01-01

    We have developed a digital computer program which evaluates the vital capacity and its subdivisions, expiratory reserve volume and inspiratory capacity. The algorithm examines the multibreath spirogram, a continuous record of quiet breathing interspersed among repeated slow, large volume maneuvers. Quiet breaths are recognized by comparing features of each breath to the respective average and variation of these features for all breaths. A self-scaling, iterative procedure is used to identify those end-tidal points that most likely represent the subject's functional residual capacity. A least-squared error baseline is then fit through these points to partition the vital capacity. Twenty-three spirograms from patients with documented pulmonary disease were independently analyzed by the computer, a pulmonary function technician, and the laboratory supervisor. No practical differences were found among the results. However, the computer's values, in contrast to those of the technician, were reproducible on repeated trials and free of computational and transcriptional errors.
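
    A rough sketch of the baseline idea described above, assuming a sampled volume signal; the synthetic signal, the extremum window, and the outlier rule are invented stand-ins for the paper's self-scaling iterative procedure.

```python
# Sketch: pick plausible end-tidal points from a volume trace and fit a
# least-squares baseline through them to partition the vital capacity.
import numpy as np
from scipy.signal import argrelextrema

t = np.linspace(0.0, 60.0, 6000)
volume = 0.5 * np.sin(2 * np.pi * 0.25 * t)                        # quiet breathing
volume[2000:2400] += 2.5 * np.sin(np.pi * np.linspace(0, 1, 400))  # large VC maneuver

# Candidate end-tidal (end-expiration) points: local minima of the volume.
minima = argrelextrema(volume, np.less, order=25)[0]

# Keep minima near the median level, i.e. likely quiet breaths
# (a one-pass stand-in for the paper's iterative self-scaling rule).
level, spread = np.median(volume[minima]), np.std(volume[minima])
quiet = minima[np.abs(volume[minima] - level) < max(spread, 0.05)]

# Least-squares baseline through the retained end-tidal points.
slope, intercept = np.polyfit(t[quiet], volume[quiet], 1)
print(f"FRC baseline: V(t) = {slope:.4f}*t + {intercept:.4f}")
```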

  6. Schottky signal analysis: tune and chromaticity computation

    CERN Document Server

    Chanon, Ondine

    2016-01-01

    Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, the different components of a Schottky signal are explained, then an algorithm to compute the betatron tune is presented, followed by some ideas for computing the machine chromaticity. The tests have been performed with offline and/or online LHC data.
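
    As an illustration of the tune computation, the fractional betatron tune can be read off as the dominant line in the signal's spectrum; the sketch below runs on a synthetic turn-by-turn signal rather than real Schottky data.

```python
# Sketch: estimate a fractional betatron tune as the strongest spectral
# line of a (synthetic) turn-by-turn beam signal.
import numpy as np

n_turns = 4096
q_true = 0.31                       # fractional tune used to build the toy signal
turns = np.arange(n_turns)
signal = np.cos(2 * np.pi * q_true * turns) + 0.5 * np.random.randn(n_turns)

spectrum = np.abs(np.fft.rfft(signal * np.hanning(n_turns)))
freqs = np.fft.rfftfreq(n_turns, d=1.0)   # in units of the revolution frequency

q_est = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(f"estimated fractional tune: {q_est:.4f}")
```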

  7. Secondary School Students' Attitudes towards Mathematics Computer--Assisted Instruction Environment in Kenya

    Science.gov (United States)

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.

    2012-01-01

    This paper reports the results of research conducted in six Form IV classes with 205 students, from which a sample of 94 respondents was drawn. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) the effectiveness of CAI in Mathematics instruction. The results indicated…

  8. Development and Evaluation of Computer Assisted Instruction for Navy Electronics Training. Two, Inductance.

    Science.gov (United States)

    Hurlock, Richard E.

    A computer-assisted instruction (CAI) curriculum module covering the area of electrical inductance was developed and evaluated. This module was a part of a program in which a series of CAI modules are being developed and tested for a Navy training course in basic electronics. After the module was written, it was given three tryout tests.…

  9. A Comparison of Computer-Assisted Instruction and Tutorials in Hematology and Oncology.

    Science.gov (United States)

    Garrett, T. J.; And Others

    1987-01-01

    A study comparing the effectiveness of computer-assisted instruction (CAI) and small group instruction found no significant difference in medical student achievement in oncology but higher achievement through small-group instruction in hematology. Students did not view CAI as more effective, but saw it as a supplement to traditional methods. (MSE)

  10. Computer Assisted Instruction in Mathematics Can Improve Students' Test Scores: A Study.

    Science.gov (United States)

    Brown, Frank

    This research assessed the academic impact of a computer-assisted instructional (CAI) software program to teach mathematics. The research hypothesis states that the use of the CAI program will produce superior academic achievement in mathematics for students who use the program compared to students instructed in mathematics without the program.…

  11. An Evaluation of Computer-Aided Instruction in an Introductory Biostatistics Course.

    Science.gov (United States)

    Forsythe, Alan B.; Freed, James R.

    1979-01-01

    Evaluates the effectiveness of computer assisted instruction for teaching biostatistics to first year students at the UCLA School of Dentistry. Results do not demonstrate the superiority of CAI but do suggest that CAI compares favorably to conventional lecture and programed instruction methods. (RAO)

  12. Automatization of Mathematics Skills via Computer-Assisted Instruction among Students with Mild Mental Handicaps.

    Science.gov (United States)

    Podell, David M.; And Others

    1992-01-01

    This evaluation study with 94 elementary students (50 with mild mental handicap) compared computer-assisted instruction (CAI) and paper-and-pencil practices in promoting automatization of basic addition and subtraction skills. Findings suggested CAI was more effective but that the students with mental handicap required more practice than…

  13. The Analysis of Some Contemporary Computer Mikrosystems

    Directory of Open Access Journals (Sweden)

    Angelė Kaulakienė

    2011-04-01

    In every language a twofold process can be observed: 1) a huge surge of new terms, and 2) a large part of these new terms making their way into the common language. The nucleus of the vocabulary and the grammatical system of the common language make up the essence of a language and its national originality. Because of such intensive development, terminological lexis can in the future become a basis of the common language, and it ought to be not a spontaneously formed sum of terminological lexis but an entirety of consciously created terms which meet the requirements of language, logic and terminology. Computer terminology, by comparison with the terminology of other fields, is being created in a slightly unusual way. The first computation institutions in Lithuania were established in the early sixties, and a decade later there were a few computation centres and a number of key-operated and punch machines working. Together with the new computational technology, many new devices, units, parts, phenomena and characteristics appeared which needed naming. Specialists faced an obvious shortage of Lithuanian terms for computing equipment. In 1971 this gap was partly filled by „Rusų-lietuvių-anglų kalbų skaičiavimo technikos žodynas“ (the Russian-Lithuanian-English dictionary of computing equipment), which for a long time (more than 20 years) was the only terminological dictionary in this field. Only during the nineties did a few dictionaries of different scope appear. Computer terminology from the ten dictionaries presently available shows that this 35-year period of computer terminology is a stage of its creation, the main features of which are reasonable synonymy (when both international and Lithuanian terms are used to name a concept) and variability. Such a state of Lithuanian computer terminology is predetermined by linguistic, interlinguistic and sociolinguistic factors. At present in Lithuania terminological dictionaries of various fields are being given to

  14. Computational Analysis of LDDMM for Brain Mapping

    Directory of Open Access Journals (Sweden)

    Can Ceritoglu

    2013-08-01

    One goal of computational anatomy is to develop tools to accurately segment brain structures in healthy and diseased subjects. In this paper, we examine the performance and complexity of such segmentation in the framework of the large deformation diffeomorphic metric mapping (LDDMM) registration method with reference to atlases and parameters. First we report the application of a multi-atlas segmentation approach to define basal ganglia structures in the brains of healthy and diseased children. The segmentation accuracy of the multi-atlas approach is compared with the single-atlas LDDMM implementation and two state-of-the-art segmentation algorithms - Freesurfer and FSL - by computing the overlap errors between automatic and manual segmentations of the six basal ganglia nuclei in healthy subjects as well as subjects with diseases including ADHD and Autism. The high accuracy of multi-atlas segmentation is obtained at the cost of increased computational complexity, because of the calculations necessary between the atlases and a subject. Second, we examine the effect of parameters on total LDDMM computation time and segmentation accuracy for basal ganglia structures. The single-atlas LDDMM method is used to automatically segment the structures in a population of 16 subjects using different sets of parameters. The results show that a cascade approach and using fewer time steps can reduce computational complexity by as much as five times while maintaining reliable segmentations.
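
    The label-fusion step of the multi-atlas approach can be illustrated independently of the (computationally heavy) LDDMM registrations. Below is a per-voxel majority-vote sketch, with toy label maps standing in for atlas segmentations already warped to the subject.

```python
# Sketch of multi-atlas label fusion: per-voxel majority vote over the
# label maps propagated from each atlas (registration omitted here).
import numpy as np

def majority_vote(label_maps):
    """label_maps: array-like of shape (n_atlases, *volume_shape), int labels."""
    stacked = np.asarray(label_maps)
    n_labels = stacked.max() + 1
    # Count votes per label at every voxel, then take the argmax.
    votes = np.stack([(stacked == k).sum(axis=0) for k in range(n_labels)])
    return votes.argmax(axis=0)

# Three toy 2x2 "atlas segmentations" of the same subject.
maps = [np.array([[0, 1], [1, 1]]),
        np.array([[0, 1], [0, 1]]),
        np.array([[0, 0], [1, 1]])]
print(majority_vote(maps))   # [[0 1] [1 1]]
```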

  15. Fine-Gained CAIs in Comet Samples: Moderate Refractory Character and Comparison to Small Refractory Inclusions in Chondrites

    Science.gov (United States)

    Joswiak, D. J.; Brownlee, D. E.; Nguyen, A. N.; Messenger, S.

    2017-01-01

    Examination of >200 comet Wild 2 particles collected by the Stardust (SD) mission shows that the CAI abundance of comet Wild 2's rocky material is near 1% and that nearly 50% of all bulbous tracks will contain at least one recognizable CAI fragment. A similar abundance to Wild 2 is found in a giant cluster IDP thought to be of cometary origin. The properties of these CAIs and their comparison with meteoritic CAIs provide important clues on the role of CAIs in the early Solar System (SS) and how they were transported to the edge of the solar nebula where Kuiper Belt comets formed. Previously, only two CAIs in comet Wild 2 had been identified and studied in detail. Here we present 2 new Wild 2 CAIs and 2 from a giant cluster cometary IDP, describe their mineralogical characteristics and show that they are most analogous to nodules in spinel-rich, fine-grained inclusions (FGIs) observed in CV3 and other chondrites. Additionally, we present new O isotope measurements from one CAI from comet Wild 2 and show that its oxygen isotopic composition is similar to some FGIs. This is only the second CAI from Wild 2 in which O isotopes have been measured.

  16. Adapting computational text analysis to social science (and vice versa)

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models trained on data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  17. Cloud Computing for Rigorous Coupled-Wave Analysis

    Directory of Open Access Journals (Sweden)

    N. L. Kazanskiy

    2012-01-01

    Design and analysis of complex nanophotonic and nanoelectronic structures require significant computing resources. Cloud computing infrastructure allows distributed parallel applications to achieve greater scalability and fault tolerance. The problems of effective use of high-performance computing systems for modeling and simulation of subwavelength diffraction gratings are considered. Rigorous coupled-wave analysis (RCWA) is adapted to the cloud computing environment. In order to accomplish this, the data flow of the RCWA is analyzed and CPU-intensive operations are converted to data-intensive operations. The generated data sets are structured in accordance with the requirements of MapReduce technology.
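
    A schematic of the map/reduce decomposition described above, with a stub standing in for the actual RCWA solve; the function rcwa_efficiency and its return values are invented placeholders.

```python
# Sketch: the per-wavelength RCWA solve becomes the "map" stage,
# collecting the efficiency spectrum the "reduce" stage.
from functools import reduce
from multiprocessing import Pool

def rcwa_efficiency(wavelength_nm):
    # Stand-in stub for a rigorous coupled-wave solve at one wavelength.
    return wavelength_nm, 0.5 + 0.1 * ((wavelength_nm % 100) / 100.0)

def merge(acc, item):
    wavelength, efficiency = item
    acc[wavelength] = efficiency
    return acc

if __name__ == "__main__":
    wavelengths = range(400, 701, 50)
    with Pool() as pool:
        results = pool.map(rcwa_efficiency, wavelengths)  # "map" stage
    spectrum = reduce(merge, results, {})                 # "reduce" stage
    print(spectrum)
```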

  18. Mineralogy and Petrology of EK-459-5-1, A Type B1 CAI from Allende

    Science.gov (United States)

    Jeffcoat, C. R.; Kerekgyarto, A. G.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2015-01-01

    Calcium-aluminum-rich inclusions (CAIs) are a type of coarse-grained clast composed of Ca-, Al-, and Mg-rich silicates and oxides found in chondrite meteorites. Type B CAIs are found exclusively in the CV chondrite meteorites and are the most well-studied type of inclusion found in chondritic meteorites. Type B1 CAIs are distinguished by a nearly monomineralic rim of melilite that surrounds an interior predominantly composed of melilite, fassaite (Ti- and Al-rich clinopyroxene), anorthite, and spinel, with varying amounts of other minor primary and secondary phases. The formation of Type B CAIs has received considerable attention in the course of CAI research, yet quantitative models, experimental results and observations from Type B inclusions remain largely in disagreement. Recent experimental results and quantitative models have shown that the formation of B1 mantles could have occurred by the evaporative loss of Si and Mg during the crystallization of these objects. However, comparative studies suggest that the lower bulk SiO2 compositions in B1s result in more melilite crystallization before the onset of fassaite and anorthite crystallization, leading to the formation of thick melilite-rich rims in B1 inclusions. Detailed petrographic and cosmochemical studies of these inclusions will further our understanding of these complex objects.

  19. Conversation Analysis of Computer-Mediated Communication

    Science.gov (United States)

    Gonzalez-Lloret, Marta

    2011-01-01

    The potential of computer-mediated communication (CMC) for language learning resides mainly in the possibility that learners have to engage with other speakers of the language, including L1 speakers. The inclusion of CMC in the L2 classroom provides an opportunity for students to utilize authentic language in real interaction, rather than the more…

  20. Computational and Physical Analysis of Catalytic Compounds

    Science.gov (United States)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important for exploiting their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as GAMESS and Chemcraft have been used in an effort to compute the efficiencies of catalytic compounds and the bonding-energy changes during optimization convergence. The results illustrate how the metal oxides stabilize and the steps involved. The plot of energy (kcal/mol) versus computation step (N) shows that the energy of the titania converges at the 7th iteration, faster than the silica, which converges at the 9th iteration.

  1. Affect and Learning : a computational analysis

    NARCIS (Netherlands)

    Broekens, Douwe Joost

    2007-01-01

    In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  3. Computational morphology a computational geometric approach to the analysis of form

    CERN Document Server

    Toussaint, GT

    1988-01-01

    Computational Geometry is a new discipline of computer science that deals with the design and analysis of algorithms for solving geometric problems. There are many areas of study in different disciplines which, while being of a geometric nature, have as their main component the extraction of a description of the shape or form of the input data. This notion is more imprecise and subjective than pure geometry. Such fields include cluster analysis in statistics, computer vision and pattern recognition, and the measurement of form and form-change in such areas as stereology and developmental biolo

  4. Computational analysis of ozonation in bubble columns

    Energy Technology Data Exchange (ETDEWEB)

    Quinones-Bolanos, E. [Univ. of Guelph, School of Engineering, Guelph, Ontario (Canada)]|[Univ. de Cartagena, Facultad de Ciencias e Ingenieria, Cartagena de Indias (Colombia); Zhou, H.; Otten, L. [Univ. of Guelph, School of Engineering, Guelph, Ontario (Canada)]. E-mail: hzhou@uoguelph.ca

    2002-06-15

    This paper presents a new computational ozonation model based on the principle of computational fluid dynamics along with the kinetics of ozone decay and microbial inactivation to predict the performance of ozone disinfection in fine bubble columns. The model can be represented using a mixture two-phase flow model to simulate the hydrodynamics of the water flow and using two transport equations to track the concentration profiles of ozone and microorganisms along the height of the column, respectively. The applicability of this model was then demonstrated by comparing the simulated ozone concentrations with experimental measurements obtained from a pilot scale fine bubble column. One distinct advantage of this approach is that it does not require the prerequisite assumptions such as plug flow condition, perfect mixing, tanks-in-series, uniform radial or longitudinal dispersion in predicting the performance of disinfection contactors without carrying out expensive and tedious tracer studies. (author)
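
    Reduced to a steady one-dimensional plug-flow column, the two transport equations described above take a simple form. The sketch below integrates first-order ozone decay and Chick-Watson-style inactivation along the column height; all constants are invented, and this is not the paper's CFD model.

```python
# Sketch: steady 1-D plug-flow column. Ozone decays first order; organisms
# are inactivated at a rate proportional to the ozone concentration.
import numpy as np
from scipy.integrate import solve_ivp

u = 0.05     # water velocity along the column, m/s (invented)
k_d = 1e-3   # first-order ozone decay constant, 1/s (invented)
k_i = 0.05   # Chick-Watson inactivation constant, L/(mg*s) (invented)

def rhs(z, y):
    c, n = y                                 # ozone (mg/L), organisms (per L)
    return [-k_d * c / u, -k_i * c * n / u]

sol = solve_ivp(rhs, (0.0, 3.0), [2.0, 1.0e6])   # 3 m tall column
c_out, n_out = sol.y[:, -1]
print(f"outlet ozone: {c_out:.2f} mg/L, "
      f"log inactivation: {np.log10(1.0e6 / n_out):.2f}")
```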

  5. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed using bibliometric approaches. This study aims to provide computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. To achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the time period 2004-2013. We ranked authors in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
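
    A minimal sketch of building a co-authorship network and ranking authors, assuming the networkx library; the article records below are invented, whereas the study used Scopus data.

```python
# Sketch: co-authorship graph from article author lists; edge weights count
# co-authored papers, and authors are ranked by degree centrality.
import itertools
import networkx as nx

articles = [
    ["Kim", "Lee", "Park"],   # invented records; real input would come
    ["Kim", "Lee"],           # from a bibliographic database
    ["Park", "Choi"],
]

G = nx.Graph()
for authors in articles:
    for a, b in itertools.combinations(authors, 2):
        w = G.get_edge_data(a, b, {"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

for author, c in sorted(nx.degree_centrality(G).items(), key=lambda x: -x[1]):
    print(f"{author}: degree centrality {c:.2f}")
```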

  6. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  7. CAI versus Paper and Pencil--Discrepancies in Students' Performance.

    Science.gov (United States)

    Hativa, Nira

    1988-01-01

    This study identified differences in elementary school students' performance of arithmetic tasks using paper and pencil and computer-assisted instruction. Many were found to perform more poorly using the computer, while others showed the opposite tendency. These findings challenge the validity of decisions made by the computer-based management…

  8. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  9. CAI Courseware Design Based on Constructivism

    Institute of Scientific and Technical Information of China (English)

    李春艳; 王凡; 张积东

    2001-01-01

    CAI (Computer-Assisted Instruction) is an effective and widely used aid in the field of instruction, and it will be used more and more. Meanwhile, constructivism is a learning theory that accords better with human cognitive regularities. The combination of the two is a beneficial exploration toward a new type of CAI courseware design.

  10. Arginine oscillation explains Na+ independence in the substrate/product antiporter CaiT.

    Science.gov (United States)

    Kalayil, Sissy; Schulze, Sabrina; Kühlbrandt, Werner

    2013-10-22

    Most secondary-active transporters transport their substrates using an electrochemical ion gradient. In contrast, the carnitine transporter (CaiT) is an ion-independent, l-carnitine/γ-butyrobetaine antiporter belonging to the betaine/carnitine/choline transporter family of secondary transporters. Recently determined crystal structures of CaiT from Escherichia coli and Proteus mirabilis revealed an inverted five-transmembrane-helix repeat similar to that in the amino acid/Na(+) symporter LeuT. The ion independence of CaiT makes it unique in this family. Here we show that mutations of arginine 262 (R262) make CaiT Na(+)-dependent. The transport activity of R262 mutants increased by 30-40% in the presence of a membrane potential, indicating substrate/Na(+) cotransport. Structural and biochemical characterization revealed that R262 plays a crucial role in substrate binding by stabilizing the partly unwound TM1' helix. Modeling CaiT from P. mirabilis in the outward-open and closed states on the corresponding structures of the related symporter BetP reveals alternating orientations of the buried R262 sidechain, which mimic sodium binding and unbinding in the Na(+)-coupled substrate symporters. We propose that a similar mechanism is operative in other Na(+)/H(+)-independent transporters, in which a positively charged amino acid replaces the cotransported cation. The oscillation of the R262 sidechain in CaiT indicates how a positive charge triggers the change between outward-open and inward-open conformations as a unifying critical step in LeuT-type transporters.

  11. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    Science.gov (United States)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) on a 0-5 scale, based on a qualitative lexicon describing the ultrasound appearance of a breast lesion. The purpose of the research is to assess and select one of the automated LOS scoring quantitative methods developed during preliminary studies in benign biopsy reduction. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on the masses with lower levels of suspicion, rather than increasing the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) is developed by Almen Laboratories and was used to achieve the results.
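
    The internals of the paper's "First Order Ranking" method are not given here, so as an illustration of the kind of correlation analysis described, a rank-correlation sketch with invented scores:

```python
# Sketch: rank correlation between an automated 0-5 LOS score and
# radiologist scores for lesions with known biopsy results.
from scipy.stats import spearmanr

radiologist_los = [1, 2, 2, 3, 4, 5, 0, 3, 4, 1]   # invented scores
automated_los   = [1, 1, 2, 3, 5, 5, 0, 2, 4, 2]   # invented scores

rho, p = spearmanr(radiologist_los, automated_los)
print(f"Spearman rho = {rho:.2f} (p = {p:.3g})")
```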

  12. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  13. Error computation for adaptive finite element analysis

    CERN Document Server

    Khan, A A; Memon, I R; Ming, X Y

    2002-01-01

    The paper gives a simple numerical procedure for computing the errors generated by the discretisation process of the finite element method. The procedure is based on the ZZ error estimator, which is believed to be reasonably accurate and thus can be readily implemented in any existing finite element code. The devised procedure not only estimates the global energy-norm error but also evaluates the local errors in individual elements. In the example, the given procedure is combined with an adaptive refinement procedure, which provides guidance for optimal mesh design and allows the user to obtain a desired accuracy within a limited number of iterations. (author)
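
    A one-dimensional schematic of the recovery idea behind ZZ-type estimators (not the paper's implementation): the piecewise-constant finite element gradient is "recovered" by nodal averaging, and the per-element indicator measures the mismatch between raw and recovered gradients.

```python
# 1-D sketch of gradient recovery and a ZZ-style element error indicator.
import numpy as np

x = np.linspace(0.0, 1.0, 6)      # nodes
u = x ** 3                        # nodal values of some FE solution
h = np.diff(x)
grad_elem = np.diff(u) / h        # constant gradient on each element

# Recovered nodal gradient: average of the two adjacent element gradients.
grad_node = np.empty_like(x)
grad_node[1:-1] = 0.5 * (grad_elem[:-1] + grad_elem[1:])
grad_node[0], grad_node[-1] = grad_elem[0], grad_elem[-1]

# Element indicator: mismatch between recovered and raw gradients.
recovered_mid = 0.5 * (grad_node[:-1] + grad_node[1:])
eta = np.sqrt(h) * np.abs(recovered_mid - grad_elem)
print("element error indicators:", np.round(eta, 4))
```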

  14. Computer-aided Analysis of Physiological Systems

    Directory of Open Access Journals (Sweden)

    Balázs Benyó

    2007-12-01

    This paper presents the recent biomedical engineering research activity of the Medical Informatics Laboratory at the Budapest University of Technology and Economics. The research projects are carried out in the fields as follows: computer-aided identification of physiological systems; diabetic management and blood glucose control; remote patient monitoring and diagnostic system; automated system for analyzing cardiac ultrasound images; single-channel hybrid ECG segmentation; event recognition and state classification to detect brain ischemia by means of EEG signal processing; detection of breathing disorders like apnea and hypopnea; molecular biology studies with DNA-chips; evaluation of the cry of normal hearing and hard of hearing infants.

  15. Military Standard Common APSE (Ada Programming Support Environment) Interface Set (CAIS).

    Science.gov (United States)

    1985-01-01

    Package FILE_IMPORT_EXPORT: The CAIS allows a particular CAIS implementation to maintain files separately from files maintained by the host file system. This... function EXACT (LIST : in LIST_TYPE; NAME : in TOKEN_TYPE) return NUMBER; Purpose: This function locates... NAME_STRING) RESULT : LIST_TEXT(1..10); begin null; -- should be defined by implementor end DELETE; procedure DELETE (LIST : in out LIST_TYPE; NAME

  16. Germination of white radish, buckwheat and qing-geng-cai under low pressure in closed environment.

    Science.gov (United States)

    Hinokuchi, Tsutomu; Oshima, Satoshi; Hashimoto, Hirofumi

    2004-11-01

    In order to cultivate plants under low pressure in a closed environment, the germination rate of white radish seeds was investigated under low pressure, low oxygen partial pressure, and pure-oxygen conditions. The results of these experiments showed that the germination rate was affected by the oxygen partial pressure. From this fact, it is possible to lower the total pressure by using only pure oxygen during germination. Furthermore, the germination rates of buckwheat and qing-geng-cai seeds were also investigated in pure oxygen for comparison. Consequently, although the tendency in germination rate of white radish was similar to that of qing-geng-cai, it differed from that of buckwheat.

  17. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
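
    A minimal sketch of generating geometry realizations from per-node displacement vectors, as the abstract describes; the mesh, the displacement mode, and the amplitude distribution are invented placeholders.

```python
# Sketch: a realization of the geometry is the nominal mesh plus a random
# amplitude times precomputed per-node displacement vectors.
import numpy as np

rng = np.random.default_rng(0)

nominal_nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
# One displacement "mode" per node (e.g., from mean-value coordinates).
displacement = np.array([[0.0, 0.0], [0.1, 0.0], [0.1, 0.05], [0.0, 0.05]])

def realization(amplitude):
    return nominal_nodes + amplitude * displacement

for _ in range(3):
    a = rng.normal(0.0, 1.0)        # random perturbation amplitude
    print(realization(a))
```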

  18. Two computer programs for the analysis of marine magnetic data

    Digital Repository Service at National Institute of Oceanography (India)

    Rao, M.M.M.; Lakshminarayana, S.; Murthy, K.S.R.; Subrahmanyam, A.S.

    Computers & Geosciences, Vol. 19, No. 5, pp. 657-672, 1993: Two Computer Programs for the Analysis of Marine Magnetic Data.

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference, where a large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and infrastructure operations have been active with several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  20. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  1. Thermal and chemical evolution in the early solar system as recorded by FUN CAIs: Part I - Petrology, mineral chemistry, and isotopic composition of Allende FUN CAI CMS-1

    Science.gov (United States)

    Williams, C. D.; Ushikubo, T.; Bullock, E. S.; Janney, P. E.; Hines, R. R.; Kita, N. T.; Hervig, R. L.; MacPherson, G. J.; Mendybaev, R. A.; Richter, F. M.; Wadhwa, M.

    2017-03-01

    Detailed petrologic, geochemical and isotopic analyses of a new FUN CAI from the Allende CV3 meteorite (designated CMS-1) indicate that it formed by extensive melting and evaporation of primitive precursor material(s). The precursor material(s) condensed in a 16O-rich region (δ17O and δ18O ∼ -49‰) of the inner solar nebula dominated by gas of solar composition at total pressures of ∼10-3-10-6 bar. Subsequent melting of the precursor material(s) was accompanied by evaporative loss of magnesium, silicon and oxygen resulting in large mass-dependent isotope fractionations in these elements (δ25Mg = 30.71-39.26‰, δ29Si = 14.98-16.65‰, and δ18O = -41.57 to -15.50‰). This evaporative loss resulted in a bulk composition similar to that of compact Type A and Type B CAIs, but very distinct from the composition of the original precursor condensate(s). Kinetic fractionation factors and the measured mass-dependent fractionation of silicon and magnesium in CMS-1 suggest that ∼80% of the silicon and ∼85% of the magnesium were lost from its precursor material(s) through evaporative processes. These results suggest that the precursor material(s) of normal and FUN CAIs condensed in similar environments, but subsequently evolved under vastly different conditions such as total gas pressure. The chemical and isotopic differences between normal and FUN CAIs could be explained by sorting of early solar system materials into distinct physical and chemical regimes, in conjunction with discrete heating events, within the protoplanetary disk.
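
    The quoted evaporative losses can be related to the measured isotope fractionations through a Rayleigh law. A worked example under an assumed kinetic fractionation factor follows; the alpha value and the representative delta are assumptions for illustration, and the authors' exact treatment may differ.

```latex
% Rayleigh fractionation of the evaporation residue, with f the fraction
% of Mg remaining and \alpha the kinetic fractionation factor:
\[
  \delta^{25}\mathrm{Mg} \approx 1000\,\bigl(f_{\mathrm{Mg}}^{\;\alpha-1}-1\bigr).
\]
% Assuming \alpha = \sqrt{24/25} \approx 0.9798 and a measured
% \delta^{25}Mg of about 35 permil (mid-range of the values above):
\[
  f_{\mathrm{Mg}} = \Bigl(1+\tfrac{35}{1000}\Bigr)^{1/(\alpha-1)}
  \approx 1.035^{-49.5} \approx 0.18,
\]
% i.e. roughly 82% of the magnesium evaporated, of the same order as the
% ~85% loss quoted in the abstract.
```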

  2. Batch Computed Tomography Analysis of Projectiles

    Science.gov (United States)

    2016-05-01

    resolution. 2.2 CT Image Analysis: An algorithm was developed in MATLAB that performed image analysis on each individual cross-sectional image of the... be obtained with a minimum number of descriptor parameters, effectively reducing... a single core of an Intel Xeon X5650 processor operating at 2.67 GHz. To batch process the 210 projectiles, a MATLAB script was written to parallelize

  3. Stable Magnesium Isotope Variation in Melilite Mantle of Allende Type B1 CAI EK 459-5-1

    Science.gov (United States)

    Kerekgyarto, A. G.; Jeffcoat, C. R.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2014-01-01

    Ca-Al-rich inclusions (CAIs) are the earliest-formed crystalline material in our solar system and they record early Solar System processes. Here we present petrographic and δ25Mg data for melilite mantles in a Type B1 CAI that record early solar nebular processes.

  4. Revision of the Oriental leafhopper genus Destinoides Cai & He (Hemiptera: Cicadellidae: Ledrinae), with a new synonym and two new combinations.

    Science.gov (United States)

    Sun, Jing; Webb, Michael D; Zhang, Yalin

    2014-01-01

    The leafhopper genus Destinoides Cai & He is revised to include two species D. latifrons (Walker 1851, Ledra) n. comb. and D. conspicuus (Distant 1907, Petalocephala) n. comb. Destinoides fasciata Cai & He, 2000 is placed as a junior synonym of D. latifrons, syn. nov. These two species are redescribed and illustrated in detail and a key is given based on the males.

  5. Gender Role, Gender Identity and Sexual Orientation in CAIS ("XY-Women") Compared With Subfertile and Infertile 46,XX Women.

    Science.gov (United States)

    Brunner, Franziska; Fliegner, Maike; Krupp, Kerstin; Rall, Katharina; Brucker, Sara; Richter-Appelt, Hertha

    2016-01-01

    The perception of gender development of individuals with complete androgen insensitivity syndrome (CAIS) as unambiguously female has recently been challenged in both qualitative data and case reports of male gender identity. The aim of the mixed-method study presented was to examine the self-perception of CAIS individuals regarding different aspects of gender and to identify commonalities and differences in comparison with subfertile and infertile XX-chromosomal women with diagnoses of Mayer-Rokitansky-Küster-Hauser syndrome (MRKHS) and polycystic ovary syndrome (PCOS). The study sample comprised 11 participants with CAIS, 49 with MRKHS, and 55 with PCOS. Gender identity was assessed by means of a multidimensional instrument, which showed significant differences between the CAIS group and the XX-chromosomal women. Other-than-female gender roles and neither-female-nor-male sexes/genders were reported only by individuals with CAIS. The percentage with a not exclusively androphile sexual orientation was unexceptionally high in the CAIS group compared to the prevalence in "normative" women and the clinical groups. The findings support the assumption made by Meyer-Bahlburg ( 2010 ) that gender outcome in people with CAIS is more variable than generally stated. Parents and professionals should thus be open to courses of gender development other than typically female in individuals with CAIS.

  6. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science attaches great importance to conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their papers with peers. Previous work on knowledge mapping focused on the map of all sciences or of a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and workshop proceedings, which results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at the journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...

  7. Evaluation of Imagine Learning English, a Computer-Assisted Instruction of Language and Literacy for Kindergarten Students

    Science.gov (United States)

    Longberg, Pauline Oliphant

    2012-01-01

    As computer assisted instruction (CAI) becomes increasingly sophisticated, its appeal as a viable method of literacy intervention with young children continues despite limited evidence of effectiveness. The present study sought to assess the impact of one such CAI program, "Imagine Learning English" (ILE), on both the receptive…

  8. Efficacy of Teachtown: Basics Computer-Assisted Intervention for the Intensive Comprehensive Autism Program in Los Angeles Unified School District

    Science.gov (United States)

    Whalen, Christina; Moss, Debbie; Ilan, Aaron B.; Vaupel, Manya; Fielding, Paul; MacDonald, Kevin; Cernich, Shannon; Symon, Jennifer

    2010-01-01

    Computer Assisted Instruction (CAI) has shown increased popularity recently and there are many studies showing promise for this approach for children with Autism Spectrum Disorders (ASD). However, there are no between-subject studies to date assessing the efficacy of CAI with this population. In this study, 47 preschool and K-1 students in ASD…

  9. Classification and Analysis of Computer Network Traffic

    DEFF Research Database (Denmark)

    Bujlow, Tomasz

    2014-01-01

    various classification modes (decision trees, rulesets, boosting, softening thresholds) regarding the classification accuracy and the time required to create the classifier. We showed how to use our VBS tool to obtain per-flow, per-application, and per-content statistics of traffic in computer networks...... classification (as by using transport layer port numbers, Deep Packet Inspection (DPI), statistical classification) and assessed their usefulness in particular areas. We found that the classification techniques based on port numbers are not accurate anymore as most applications use dynamic port numbers, while...... DPI is relatively slow, requires a lot of processing power, and causes a lot of privacy concerns. Statistical classifiers based on Machine Learning Algorithms (MLAs) were shown to be fast and accurate. At the same time, they do not consume a lot of resources and do not cause privacy concerns. However...
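
    The statistical, MLA-based classification that the thesis finds fast and accurate can be sketched as follows. The per-flow features, the synthetic data, and the choice of a decision tree are illustrative assumptions, not the exact feature set or training data used in the thesis.

```python
# Minimal sketch of statistical per-flow traffic classification.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def synth_flows(n, mean_pkt, mean_iat, label):
    """Toy per-flow statistics (mean packet size, mean inter-arrival time)."""
    pkt = rng.normal(mean_pkt, 50.0, n)       # bytes
    iat = rng.exponential(mean_iat, n)        # milliseconds
    return np.column_stack([pkt, iat]), np.full(n, label)

X_bulk, y_bulk = synth_flows(500, 1200.0, 5.0, 0)   # e.g. bulk transfer
X_chat, y_chat = synth_flows(500, 200.0, 40.0, 1)   # e.g. interactive traffic
X = np.vstack([X_bulk, X_chat])
y = np.concatenate([y_bulk, y_chat])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```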

  10. Computer Graphics in ChE Education.

    Science.gov (United States)

    Reklaitis, G. V.; And Others

    1983-01-01

    Examines current uses and future possibilities of computer graphics in chemical engineering, discussing equipment needs, maintenance/manpower costs, and plan to implement computer graphics into existing programs. The plan involves matching fund equipment grants, grants for development of computer assisted instructional (CAI) software, chemical…

  11. COMPUTER DATA ANALYSIS AND MODELING: COMPLEX STOCHASTIC DATA AND SYSTEMS

    OpenAIRE

    2010-01-01

    This collection of papers includes proceedings of the Ninth International Conference “Computer Data Analysis and Modeling: Complex Stochastic Data and Systems” organized by the Belarusian State University and held in September 2010 in Minsk. The papers are devoted to the topical problems: robust and nonparametric data analysis; statistical analysis of time series and forecasting; multivariate data analysis; design of experiments; statistical signal and image processing...

  12. Local spatial frequency analysis for computer vision

    Science.gov (United States)

    Krumm, John; Shafer, Steven A.

    1990-01-01

    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.
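
    To make the combined space/frequency representation concrete, the sketch below computes the local spectrum of a small window centred on one image point; the window size, the Hanning taper, and the synthetic grating are illustrative choices, not the authors' algorithm.

```python
# Local space/frequency sketch: spectrum of a windowed patch at one point.
import numpy as np

def local_spectrum(image, y, x, win=32):
    """Magnitude spectrum of a win x win tapered patch centred at (y, x)."""
    half = win // 2
    patch = image[y - half:y + half, x - half:x + half].astype(float)
    patch = patch * np.hanning(win)[:, None] * np.hanning(win)[None, :]
    return np.abs(np.fft.fftshift(np.fft.fft2(patch)))

# Synthetic texture: a vertical grating with a period of 8 pixels.
yy, xx = np.mgrid[0:128, 0:128]
img = np.sin(2 * np.pi * xx / 8.0)

spec = local_spectrum(img, 64, 64)
ky, kx = np.unravel_index(np.argmax(spec), spec.shape)
# The peak sits 4 frequency bins from DC: 32-sample window / 8-pixel period.
print("peak offset from DC:", (ky - 16, kx - 16))
```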

  13. Acoustic analysis of a computer cooling fan

    Science.gov (United States)

    Huang, Lixi; Wang, Jian

    2005-10-01

    Noise radiated by a typical computer cooling fan is investigated experimentally and analyzed within the framework of rotor-stator interaction noise using point source formulation. The fan is 9 cm in rotor casing diameter and its design speed is 3000 rpm. The main noise sources are found and quantified; they are (a) the inlet flow distortion caused by the sharp edges of the incomplete bellmouth due to the square outer framework, (b) the interaction of rotor blades with the downstream struts which hold the motor, and (c) the extra size of one strut carrying electrical wiring. Methods are devised to extract the rotor-strut interaction noise, (b) and (c), radiated by the component forces of drag and thrust at the leading and higher order spinning pressure modes, as well as the leading edge noise generated by (a). By re-installing the original fan rotor in various casings, the noises radiated by the three features of the original fan are separated, and details of the directivity are interpreted. It is found that the inlet flow distortion and the unequal set of four struts make about the same amount of noise. Their corrections show a potential of around 10-dB sound power reduction.

  14. Computational Analysis of Safety Injection Tank Performance

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Oan; Nietiadia, Yohanes Setiawan; Lee, Jeong Ik [KAIST, Daejeon (Korea, Republic of); Addad, Yacine; Yoon, Ho Joon [Khalifa University of Science Technology and Research, Abu Dhabi (United Arab Emirates)

    2015-10-15

    The APR 1400 is a large pressurized water reactor (PWR). Just like many other water reactors, it has an emergency core cooling system (ECCS). One of the most important components in the ECCS is the safety injection tank (SIT). Inside the SIT, a fluidic device is installed, which passively controls the mass flow of the safety injection and eliminates the need for low pressure safety injection pumps. As more passive safety mechanisms are being pursued, it has become more important to understand the flow structure and the loss mechanism within the fluidic device. Current computational fluid dynamics (CFD) calculations have had limited success in predicting the fluid flow accurately. This study proposes to find a more exact result using CFD and more realistic modeling. The SIT of the APR1400 was analyzed using MARS and CFD. A CFD calculation was executed first to obtain the form loss factor. Using two form loss factors, one from the vendor and one from the CFD calculation, MARS calculations were then performed for comparison with experiment. The accumulator model in MARS was quite accurate in predicting the water level, while the pipe model showed some differences from the experimental water-level data.

  15. Computer software for process hazards analysis.

    Science.gov (United States)

    Hyatt, N

    2000-10-01

    Computerized software tools are assuming major significance in conducting HAZOPs. This is because they have the potential to offer better online presentations and performance to HAZOP teams, as well as better documentation and downstream tracking. The chances of something being "missed" are greatly reduced. We know, only too well, that HAZOP sessions can be like the industrial equivalent of a trip to the dentist. Sessions can (and usually do) become arduous and painstaking. To make the process easier for all those involved, we need all the help computerized software can provide. In this paper I have outlined the challenges addressed in the production of Windows software for performing HAZOP and other forms of PHA. The object is to produce more "intelligent", more user-friendly software for performing HAZOP where technical interaction between team members is of key significance. HAZOP techniques, having already proven themselves, are extending into the field of computer control and human error. This makes further demands on HAZOP software and emphasizes its importance.

  16. Analysis of computational vulnerabilities in digital repositories

    Directory of Open Access Journals (Sweden)

    Valdete Fernandes Belarmino

    2015-04-01

    Objective. Demonstrates the results of research that aimed to analyze the computational vulnerabilities of digital repositories in public universities. Argues the relevance of information in contemporary societies as an invaluable resource, emphasizing scientific information as an essential element of scientific progress. Characterizes the emergence of digital repositories and highlights their use in the academic environment to preserve, promote, disseminate and encourage scientific production. Describes the main software for the construction of digital repositories. Method. The investigation identified and analyzed the vulnerabilities to which digital repositories are exposed by running penetration tests, discriminating the levels of risk and the types of vulnerabilities. Results. From a sample of 30 repositories, 20 could be examined; 5% of the repositories had critical vulnerabilities, 85% high-level, 25% medium-level and 100% low-level ones. Conclusions. This demonstrates the necessity of actions that promote information security in these environments, minimizing the incidence of external and/or internal attacks on the systems.

  17. Computational Understanding: Analysis of Sentences and Context

    Science.gov (United States)

    1974-05-01

    to take English texts, disambiguate the words and semantic relationships involved, and settle questions like anaphoric reference, to the point... rather than what the word in isolation might mean. The theory of text analysis also stresses binding by predictions. To assume that a word is... cluster is basically the bundle of predictions and structures, knowledge that can bind a text into a unit. The cluster has much the same theoretical

  18. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    Science.gov (United States)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a cluster architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of the INTEGRAL data. AVES is a modular system that uses the SLURM software resource manager and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs able to reach a computing power of 300 gigaflops (300x10^9 floating-point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage memory in UFS configuration, plus 6 TB for the users' area. AVES was designed and built to solve the growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB and increasing every year). The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload on the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained with a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of on-line data storing. The AVES software package consists of about 50 specific programs. The overall computing time, compared to that of a single-processor personal computer, has thus been improved by up to a factor of 70.
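
    The job-splitting idea, dividing one analysis into N independent jobs dispatched to N cores, can be sketched as below. `analyze_scw` is a hypothetical placeholder for invoking the per-observation analysis executable; it is not part of the actual AVES or OSA software.

```python
# Embarrassingly parallel dispatch of per-observation analysis jobs.
from multiprocessing import Pool

def analyze_scw(scw_id):
    """Placeholder for running the analysis of one science window."""
    return scw_id, f"result-for-{scw_id}"

if __name__ == "__main__":
    science_windows = [f"scw{i:04d}" for i in range(100)]
    with Pool(processes=8) as pool:           # one worker per available core
        for scw, result in pool.imap_unordered(analyze_scw, science_windows):
            print(scw, result)
```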

  19. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  20. Consumption of fa cai Nostoc soup: a potential for BMAA exposure from Nostoc cyanobacteria in China?

    Science.gov (United States)

    Roney, Britton R; Renhui, Li; Banack, Sandra Anne; Murch, Susan; Honegger, Rosmarie; Cox, Paul Alan

    2009-01-01

    Grown in arid regions of western China, the cyanobacterium Nostoc flagelliforme--called fa cai in Mandarin and fat choy in Cantonese--is wild-harvested and used to make soup consumed during New Year's celebrations. High prices, up to $125 USD/kg, led to overharvesting in Inner Mongolia, Ningxia, Gansu, Qinghai, and Xinjiang. Degradation of arid ecosystems, desertification, and conflicts between Nostoc harvesters and Mongol herdsmen concerned the Chinese environmental authorities, leading to a government ban of Nostoc commerce. This ban stimulated increased marketing of a substitute made from starch. We analysed samples purchased throughout China as well as in Chinese markets in the United States and the United Kingdom. Some were counterfeits consisting of dyed starch noodles. A few samples from California contained Nostoc flagelliforme but were adulterated with starch noodles. Other samples, including those from the United Kingdom, consisted of pure Nostoc flagelliforme. A recent survey of markets in Chengdu showed no real Nostoc flagelliforme being marketed. Real and artificial fa cai differ in the presence of beta-N-methylamino-L-alanine (BMAA). Given its status as a high-priced luxury food, the government ban on collection and marketing, and the replacement of real fa cai with starch substitutes consumed only on special occasions, it is anticipated that dietary exposure to BMAA from fa cai will be reduced in the future in China.

  1. Calcium-aluminum-rich inclusions with fractionation and unknown nuclear effects (FUN CAIs)

    DEFF Research Database (Denmark)

    Krot, Alexander N.; Nagashima, Kazuhide; Wasserburg, Gerald J.

    2014-01-01

    and gas-melt oxygen-isotope exchange in a 16O-poor gaseous reservoir that resulted in crystallization of 16O-depleted fassaite, melilite and plagioclase. The final oxygen isotopic compositions of melilite and plagioclase in the CV FUN CAIs may have been established on the CV parent asteroid as a result...

  2. Computer-Aided Qualitative Data Analysis with Word

    Directory of Open Access Journals (Sweden)

    Bruno Nideröst

    2002-05-01

    Despite some fragmentary references in the literature about qualitative methods, it is fairly unknown that Word can be successfully used for computer-aided Qualitative Data Analyses (QDA). Based on several Word standard operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to gain first experience with computer-aided analysis before investing time and money in a specialized QDA program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations towards computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225

  3. A computer analysis of the Schreber Memoirs.

    Science.gov (United States)

    Klein, R H

    1976-06-01

    With the aid of a computerized system for content analysis, WORDS, the complete Schreber Memoirs was subjected to various multivariate reduction techniques in order to investigate the major content themes of this document. The findings included the prevalence of somatic concerns throughout the Memoirs, clear references to persecutory ideas and to Schreber's assumption of a redemptive role, complex encapsulated concerns about Schreber's relationship with God, a lack of any close relationship of sexuality and sexual transformation to themes of either castration or procreation, and the fact that neither the sun, God, nor Flechsig was significantly associated with clusters concerning gender, sexuality, or castration. These findings are discussed in relation to psychodynamic interpretations furnished by prior investigators who employed different research methods.

  4. A Computational Discriminability Analysis on Twin Fingerprints

    Science.gov (United States)

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical twins and fraternal twins) recently collected from a twin population, using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of fingerprints of twins is significantly different from that between genuine prints of the same finger. Twin fingerprints are discriminable, with a 1.5%-1.7% higher equal error rate (EER) than non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.
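
    For reference, an equal error rate can be computed from genuine and impostor score distributions as sketched below; the synthetic scores are illustrative and unrelated to the twin data set.

```python
# Sketch: equal error rate (EER) from match-score distributions.
import numpy as np

def eer(genuine, impostor):
    """Find the threshold where false accept and false reject rates meet."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    far = np.array([(impostor >= t).mean() for t in thresholds])
    frr = np.array([(genuine < t).mean() for t in thresholds])
    i = int(np.argmin(np.abs(far - frr)))
    return (far[i] + frr[i]) / 2.0

rng = np.random.default_rng(1)
genuine = rng.normal(0.8, 0.1, 1000)    # same-finger comparison scores
impostor = rng.normal(0.5, 0.1, 1000)   # different-finger comparison scores
print(f"EER = {eer(genuine, impostor):.3f}")
```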

  5. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.

  6. High-Throughput Proteomic Approaches to the Elucidation of Potential Biomarkers of Chronic Allograft Injury (CAI)

    Directory of Open Access Journals (Sweden)

    Hilary Cassidy

    2013-09-01

    This review focuses on the role of OMICs technologies, concentrating in particular on proteomics, in biomarker discovery in chronic allograft injury (CAI). CAI is the second most prevalent cause of allograft dysfunction and loss in the first decade post-transplantation, after death with functioning graft (DWFG). The term CAI, sometimes referred to as chronic allograft nephropathy (CAN), describes the deterioration of renal allograft function and structure as a result of immunological processes (chronic antibody-mediated rejection) and other non-immunological factors such as calcineurin inhibitor (CNI)-induced nephrotoxicity, hypertension and infection. Current methods for assessing allograft function are costly, insensitive and invasive; traditional kidney function measurements such as serum creatinine and glomerular filtration rate (GFR) display poor predictive abilities, while the current "gold standard" involving histological diagnosis with a renal biopsy presents its own inherent risks to the overall health of the allograft. As early as two years post-transplantation, protocol biopsies have shown more than 50% of allograft recipients have mild CAN; ten years post-transplantation more than 50% of the allograft recipients have progressed to severe CAN, which is associated with diminishing graft function. Thus, there is a growing medical requirement for minimally invasive biomarkers capable of identifying the early stages of the disease, which would allow for timely intervention. Proteomics involves the study of the expression, localization, function and interaction of the proteome. Proteomic technologies may be powerful tools used to identify novel biomarkers which would predict CAI in susceptible individuals. In this paper we will review the use of proteomics in the elucidation of novel predictive biomarkers of CAI in clinical, animal and in vitro studies.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  8. COMPUTING

    CERN Document Server

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per RAW event. The central collisions are more complex and...

  9. Computer programs for analysis of geophysical data

    Energy Technology Data Exchange (ETDEWEB)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches to seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution.
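
    The heart of the noise-array idea, coherently stacking continuous records over a grid of candidate source positions, can be illustrated with a delay-and-sum beamformer; the geometry, wave speed, and synthetic traces below are invented for illustration and do not reproduce the project's codes.

```python
# Delay-and-sum scan for a noise-like source; all data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
fs, v, n = 100.0, 2.0, 2000        # sample rate (Hz), speed (km/s), samples
sensors = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)  # km
true_src = np.array([0.6, 0.3])

# Build traces: a common noise waveform delayed by travel time plus local noise.
wave = rng.normal(size=n + 200)
traces = []
for s in sensors:
    d = int(round(np.linalg.norm(s - true_src) / v * fs))
    traces.append(wave[200 - d:200 - d + n] + 0.5 * rng.normal(size=n))
traces = np.array(traces)

def stack_power(p):
    """Power of the delay-corrected stack for candidate source position p."""
    stack = np.zeros(n - 100)
    for s, tr in zip(sensors, traces):
        d = int(round(np.linalg.norm(s - np.asarray(p)) / v * fs))
        stack += tr[d:d + n - 100]
    return float((stack ** 2).mean())

# Coherent stacking peaks near the true source position.
grid = [(x, y) for x in np.linspace(0, 1, 21) for y in np.linspace(0, 1, 21)]
print("estimated source:", max(grid, key=stack_power))
```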

  10. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  11. Parallel computation of seismic analysis of high arch dam

    Institute of Scientific and Technical Information of China (English)

    Chen Houqun; Ma Huaifa; Tu Jin; Cheng Guangqing; Tang Juzhen

    2008-01-01

    Parallel computation programs are developed for three-dimensional meso-mechanics analysis of fully-graded dam concrete and seismic response analysis of high arch dams (ADs), based on the Parallel Finite Element Program Generator (PFEPG). The computational algorithms of the numerical simulation of the meso-structure of concrete specimens were studied. Taking into account damage evolution, static preload, strain rate effect, and the heterogeneity of the meso-structure of dam concrete, the fracture processes of damage evolution and configuration of the cracks can be directly simulated. In the seismic response analysis of ADs, all the following factors are involved, such as the nonlinear contact due to the opening and slipping of the contraction joints, energy dispersion of the far-field foundation, dynamic interactions of the dam-foundation-reservoir system, and the combining effects of seismic action with all static loads. The correctness, reliability and efficiency of the two parallel computational programs are verified with practical illustrations.

  12. Large-scale temporal analysis of computer and information science

    Science.gov (United States)

    Soos, Sandor; Kampis, George; Gulyás, László

    2013-09-01

    The main aim of the project reported in this paper was twofold. One of the primary goals was to produce an extensive source of network data for bibliometric analyses of field dynamics in the case of Computer and Information Science. To this end, we rendered the raw material of the DBLP computer and infoscience bibliography into a comprehensive collection of dynamic network data, promptly available for further statistical analysis. The other goal was to demonstrate the value of our data source via its use in mapping Computer and Information Science (CIS). An analysis of the evolution of CIS was performed in terms of collaboration (co-authorship) network dynamics. Dynamic network analysis covered three-quarters of the 20th century (76 years, from 1936 to date). Network evolution was described both at the macro and the meso level (in terms of community characteristics). Results show that the development of CIS followed what appears to be a universal pattern of growing into a "mature" discipline.

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  14. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  15. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  16. Computational Analysis of the SRS Phase III Salt Disposition Alternatives

    Energy Technology Data Exchange (ETDEWEB)

    Dimenna, R.A.

    1999-10-07

    Completion of the Phase III evaluation and comparison of salt disposition alternatives was supported with enhanced computer models and analysis for each case on the "short list" of four options. SPEEDUP(TM) models and special purpose models describing mass and energy balances and flow rates were developed and used to predict performance and production characteristics for each of the options. Results from the computational analysis were a key part of the input used to select a primary and an alternate salt disposition alternative.

  17. First Experiences with LHC Grid Computing and Distributed Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fisk, Ian

    2010-12-01

    This presentation describes the experiences of the LHC experiments using grid computing, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. At the end, the expected evolution and future plans are outlined.

  18. Rigorous computer analysis of the Chow-Robbins game

    CERN Document Server

    Häggström, Olle

    2012-01-01

    Flip a coin repeatedly, and stop whenever you want. Your payoff is the proportion of heads, and you wish to maximize this payoff in expectation. This so-called Chow-Robbins game is amenable to computer analysis, but while simple-minded number crunching can show that it is best to continue in a given position, establishing rigorously that stopping is optimal seems at first sight to require "backward induction from infinity". We establish a simple upper bound on the expected payoff in a given position, allowing efficient and rigorous computer analysis of positions early in the game. In particular we confirm that with 5 heads and 3 tails, stopping is optimal.
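
    The "simple-minded number crunching" half of the argument is easy to reproduce: a bounded-depth lookahead gives a lower bound on the value of continuing, so whenever it exceeds h/(h+t) continuing is provably optimal. The sketch below is a generic dynamic program written for this note, not the paper's code; the paper's upper bound, which is what makes a rigorous proof that stopping is optimal possible, is not reproduced here.

```python
# Bounded-depth lookahead: a lower bound on the Chow-Robbins value of (h, t).
from functools import lru_cache

@lru_cache(maxsize=None)
def lookahead(h, t, depth):
    """Optimal expected payoff if at most `depth` further flips are allowed."""
    stop = h / (h + t)
    if depth == 0:
        return stop
    go = 0.5 * (lookahead(h + 1, t, depth - 1) + lookahead(h, t + 1, depth - 1))
    return max(stop, go)

# At 2 heads / 2 tails the bound exceeds 0.5, proving continuing is optimal.
# At 5 heads / 3 tails it stays at 5/8, consistent with (but not by itself
# proving) the paper's conclusion that stopping is optimal there.
print(lookahead(2, 2, 20), lookahead(5, 3, 20))
```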

  19. Effects of Individual versus Paired/Cooperative Computer-Assisted Instruction on the Effectiveness and Efficiency of an In-Service Training Lesson.

    Science.gov (United States)

    Makuch, Joseph R.; And Others

    1992-01-01

    Describes a study that compared individual computer-assisted instruction (CAI) with paired/cooperative CAI as a method of providing inservice training for cooperative extension agents on the topic of proper water well location and construction. Cognitive achievement and time spent on the lesson are investigated. (18 references) (LRW)

  20. Advances in computational design and analysis of airbreathing propulsion systems

    Science.gov (United States)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  1. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Computational intelligence approaches form a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, four types of probabilistic convergence are given for the solution-set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  2. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white. This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  4. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  5. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  7. Finite element dynamic analysis on CDC STAR-100 computer

    Science.gov (United States)

    Noor, A. K.; Lambiotte, J. J., Jr.

    1978-01-01

    Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.
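
    For reference, the explicit central difference scheme named above can be sketched for the undamped semi-discrete system M u'' + K u = f(t); the lumped (diagonal) mass assumption and the single-degree-of-freedom test are illustrative and do not reproduce the paper's STAR-100 implementation.

```python
# Explicit central-difference time integration, assuming a diagonal mass matrix.
import numpy as np

def central_difference(M, K, f, u0, v0, dt, steps):
    minv = 1.0 / np.diag(M)                      # lumped mass -> fully explicit
    a0 = minv * (f(0.0) - K @ u0)
    u_prev = u0 - dt * v0 + 0.5 * dt ** 2 * a0   # fictitious step u_{-1}
    u = u0.copy()
    for k in range(steps):
        a = minv * (f(k * dt) - K @ u)
        u, u_prev = 2 * u - u_prev + dt ** 2 * a, u
    return u

# 1-DOF harmonic oscillator test; stability requires dt < 2 / omega_max.
M = np.array([[1.0]])
K = np.array([[1.0]])
u = central_difference(M, K, lambda t: np.zeros(1), np.array([1.0]),
                       np.zeros(1), dt=0.01, steps=628)
print(u)   # ~cos(6.28), i.e. close to 1.0
```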

  8. Computer analysis of shells of revolution using asymptotic results

    Science.gov (United States)

    Steele, C. R.; Ranjan, G. V.; Goto, C.; Pulliam, T. H.

    1979-01-01

    It is suggested that asymptotic results for the behavior of thin shells can be incorporated in a general computer code for the analysis of a complex shell structure. The advantage when compared to existing finite difference or finite element codes is a substantial reduction in computational labor with the capability of working to a specified level of accuracy. A reduction in user preparation time and dependence on user judgment is also gained, since mesh spacing can be internally generated. The general theory is described in this paper, as well as the implementation in the computer code FAST 1 (Functional Algorithm for Shell Theory) for the analysis of the general axisymmetric shell structure with axisymmetric loading.

  9. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or c

  10. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)]

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  11. Computing support for advanced medical data analysis and imaging

    CERN Document Server

    Wiślicki, W; Białas, P; Czerwiński, E; Kapłon, Ł; Kochanowski, A; Korcyl, G; Kowal, J; Kowalski, P; Kozik, T; Krzemień, W; Molenda, M; Moskal, P; Niedźwiecki, S; Pałka, M; Pawlik, M; Raczyński, L; Rudy, Z; Salabura, P; Sharma, N G; Silarski, M; Słomski, A; Smyrski, J; Strzelecki, A; Wieczorek, A; Zieliński, M; Zoń, N

    2014-01-01

    We discuss computing issues for the data analysis and image reconstruction of a PET-TOF medical scanner and other medical scanning devices producing large volumes of data. A service architecture based on the grid and cloud concepts for distributed processing is proposed and critically discussed.

  12. Studies on Purification and Activity of Antibacterial Substances Derived from Soil Streptomyces sp. CaiF1

    Institute of Scientific and Technical Information of China (English)

    杨辉; 张茜; 曾建民; 李少华; 谭志远

    2011-01-01

    [Objective] To purify antibacterial substances from the fermentation broth of the soil bacterium Streptomyces sp. CaiF1 and to study their activity. [Method] The antibacterial substances were separated and purified by ethyl acetate extraction, macroporous adsorptive resin, silica gel chromatography and preparative high-performance liquid chromatography (HPLC), with Staphylococcus aureus and powdery mildew as indicator organisms in the activity assays. [Result] The antibacterial substances were purified. Stability analysis of the extracts from the Streptomyces sp. CaiF1 fermentation broth showed that they were stable at pH 2.0-10.0 and at 100 °C, and that their activity changed very little under UV treatment for 24 h. The inhibition rate against powdery mildew was 98.37%. [Conclusion] The purified antibacterial substances showed good stability, providing a theoretical foundation for their structural identification and future applications.

  13. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Science.gov (United States)

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  14. CAI Ding-fang: An Outstanding Neurologist of Integrative Medicine

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Prof. CAI Ding-fang, Ph.D., was born on Nov. 8, 1956 in Shanghai. In 1988, Prof. CAI received his Ph.D. degree from the Nanjing University of Traditional Chinese Medicine. He built a strong academic foundation in Chinese medicine (CM) and integrative medicine research under Prof. SHEN Zi-yin, an academician of the Chinese Academy of Sciences, completing a training program that provided a solid and broad grounding in basic theory as well as deep and systematic expertise. He then worked as a visiting scholar in the Medical Center of the University of Tokushima, Japan from July 1990 to July 1991, and as a postdoctoral researcher at Toyama Medical and Pharmaceutical University from September 1994 to September 1995.

  15. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    Science.gov (United States)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.
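
    Two of the primitive operations mentioned, a neighborhood (focal) characterization and a reclassification, can be sketched as follows; the raster values and the numpy/scipy implementation are illustrative, not the Yale map analysis package itself.

```python
# Map-algebra primitives: focal mean over a 3x3 window, then reclassify.
import numpy as np
from scipy.ndimage import uniform_filter

elevation = np.array([[10, 12, 13, 15],
                      [11, 14, 16, 18],
                      [12, 15, 19, 21],
                      [13, 16, 20, 24]], dtype=float)

# Neighborhood operation: each cell becomes the mean of its 3x3 neighborhood.
smoothed = uniform_filter(elevation, size=3, mode="nearest")

# Reclassify operation: map the continuous layer into two categories.
classes = np.where(smoothed > 15.0, "high", "low")
print(smoothed.round(1))
print(classes)
```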

  16. Computer Vision-Based Image Analysis of Bacteria.

    Science.gov (United States)

    Danielsen, Jonas; Nordenfelt, Pontus

    2017-01-01

    Microscopy is an essential tool for studying bacteria, but is today mostly used in a qualitative or possibly semi-quantitative manner often involving time-consuming manual analysis. It also makes it difficult to assess the importance of individual bacterial phenotypes, especially when there are only subtle differences in features such as shape, size, or signal intensity, which is typically very difficult for the human eye to discern. With computer vision-based image analysis - where computer algorithms interpret image data - it is possible to achieve an objective and reproducible quantification of images in an automated fashion. Besides being a much more efficient and consistent way to analyze images, this can also reveal important information that was previously hard to extract with traditional methods. Here, we present basic concepts of automated image processing, segmentation and analysis that can be relatively easy implemented for use with bacterial research.
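
    A minimal sketch of the threshold-segment-measure workflow that underlies such automated analysis is shown below; the synthetic image and the particular scikit-image calls are illustrative choices, not the authors' pipeline.

```python
# Threshold, label connected components, and measure per-object properties.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

# Synthetic "micrograph": two bright blobs on a dark, noisy background.
img = np.zeros((64, 64))
img[10:20, 10:18] = 1.0
img[40:52, 30:38] = 0.8
img += np.random.default_rng(0).normal(0.0, 0.05, img.shape)

mask = img > threshold_otsu(img)     # global Otsu threshold
labels = label(mask)                 # connected components = candidate cells
for region in regionprops(labels):
    print(f"object {region.label}: area={region.area} px, "
          f"eccentricity={region.eccentricity:.2f}")
```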

  17. Computer Analysis Of ILO Standard Chest Radiographs Of Pneumoconiosis

    Science.gov (United States)

    Li, C. C.; Shu, David B. C.; Tai, H. T.; Hou, W.; Kunkle, G. A.; Wang, Y.; Hoy, R. J.

    1982-11-01

    This paper presents a study of the computer analysis of the 1980 ILO standard chest radiographs of pneumoconiosis. Algorithms developed for the detection of individual small rounded and irregular opacities have been tested and evaluated on these standard radiographs. The density, shape, and size distribution of the detected objects in the lung field, in spite of false positives, can be used as indicators of the onset of pneumoconiosis. This approach is potentially useful in a computer-assisted screening and early-detection process in which each worker's annual chest radiograph is compared with his or her own previously obtained normal radiograph.

  19. EST analysis pipeline: use of distributed computing resources.

    Science.gov (United States)

    González, Francisco Javier; Vizcaíno, Juan Antonio

    2011-01-01

    This chapter describes how a pipeline for the analysis of expressed sequence tag (EST) data can be implemented, based on our previous experience generating ESTs from Trichoderma spp. We focus on key steps in the workflow, such as the processing of raw data from the sequencers, the clustering of ESTs, and the functional annotation of the sequences using BLAST, InterProScan, and BLAST2GO. Some of the steps require the use of intensive computing power. Since these resources are not available for small research groups or institutes without bioinformatics support, an alternative will be described: the use of distributed computing resources (local grids and Amazon EC2).
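
    An early step of such a pipeline, filtering raw reads before clustering, can be sketched in a few lines of Python. The 100 bp cutoff and the file names are illustrative assumptions rather than the published pipeline's parameters.

    ```python
    # Minimal raw-data processing step: drop reads shorter than a cutoff.
    def read_fasta(path):
        """Yield (header, sequence) pairs from a FASTA file."""
        header, seq = None, []
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if line.startswith(">"):
                    if header is not None:
                        yield header, "".join(seq)
                    header, seq = line[1:], []
                else:
                    seq.append(line)
            if header is not None:
                yield header, "".join(seq)

    def filter_short_reads(in_path, out_path, min_len=100):
        """Write only reads of at least min_len bases to out_path."""
        kept = 0
        with open(out_path, "w") as out:
            for header, seq in read_fasta(in_path):
                if len(seq) >= min_len:
                    out.write(f">{header}\n{seq}\n")
                    kept += 1
        return kept

    # Usage (assumes an input file 'raw_ests.fasta' exists):
    # n = filter_short_reads("raw_ests.fasta", "clean_ests.fasta")
    ```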

  20. Computation system for nuclear reactor core analysis. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system of computer codes, organized as modules, developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code, treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  1. Experimental investigation of CAI combustion in a two-stroke poppet valve DI engine

    OpenAIRE

    Zhang, Yan

    2015-01-01

    This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. Due to their ability to simultaneously reduce fuel consumption and NOx emissions, Controlled Auto Ignition (CAI) and HCCI combustion processes have been extensively researched over the last decade and adopted on prototype gasoline engines. These combustion processes were initially achieved on conventional two-stroke ported gasoline engines, but there have been significantly fewer stu...

  2. Dilution effects on the controlled auto-ignition (CAI) combustion of hydrocarbon and alcohol fuels

    OpenAIRE

    Oakley, A.; Zhao, H.; Ma, T.; Ladommatos, N

    2001-01-01

    Copyright © 2001 SAE International. This paper is posted on this site with permission from SAE International; further use is not permitted without permission from SAE. This paper presents results from an experimental programme researching the in-cylinder conditions necessary to obtain homogeneous CAI (or HCCI) combustion in a 4-stroke engine. The fuels under investigation include three blends of Unleaded Gasoline, a 95 RON Primary Reference Fuel, Methanol, and Ethanol. This wor...

  3. CAR: A MATLAB Package to Compute Correspondence Analysis with Rotations

    Directory of Open Access Journals (Sweden)

    Urbano Lorenzo-Seva Rovira

    2009-09-01

    Correspondence analysis (CA) is a popular method that can be used to analyse relationships between categorical variables. Like principal component analysis, CA solutions can be rotated both orthogonally and obliquely to simple structure without affecting the total amount of explained inertia. We describe a MATLAB package for computing CA. The package includes orthogonal and oblique rotation of axes. It is designed not only for advanced users of MATLAB but also for beginners. Analysis can be done using a user-friendly interface, or by using command lines. We illustrate the use of CAR with one example.
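
    The unrotated CA decomposition that underlies such a package reduces to an SVD of the standardized residuals of a contingency table. A minimal numpy sketch follows; the toy table is invented, and CAR's rotation step (which leaves the total inertia unchanged) is omitted.

    ```python
    # Correspondence analysis via SVD of standardized residuals.
    import numpy as np

    N = np.array([[20, 10, 5],
                  [ 5, 15, 10],
                  [10,  5, 20]], dtype=float)   # hypothetical contingency table

    P = N / N.sum()                 # correspondence matrix
    r = P.sum(axis=1)               # row masses
    c = P.sum(axis=0)               # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals

    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sv) / np.sqrt(r)[:, None]   # principal row coordinates
    col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]

    total_inertia = (sv ** 2).sum()
    print("explained inertia per axis:", sv ** 2 / total_inertia)
    ```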

  4. Formation of Refractory Metal Alloys and Their Occurrence in CAIs

    Science.gov (United States)

    Schwander, D.; Berg, T.; Ott, U.; Schönhense, G.; Palme, H.

    2012-09-01

    At the conference we will give an overview of the current state of our research on refractory metal nuggets (RMNs) from Murchison, Allende and Acfer 094, including statistical analysis of their compositions and structures in relation to condensation calculations.

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  8. Numerical investigation of CAI Combustion in the Opposed- Piston Engine with Direct and Indirect Water Injection

    Science.gov (United States)

    Pyszczek, R.; Mazuro, P.; Teodorczyk, A.

    2016-09-01

    This paper is focused on CAI combustion control in a turbocharged 2-stroke Opposed-Piston (OP) engine. The barrel-type OP engine arrangement is of particular interest to the authors because of its robust design, high mechanical efficiency and relatively easy incorporation of a Variable Compression Ratio (VCR). The other advantage of such a design is that the combustion chamber is formed between two moving pistons - there is no additional cylinder head to be cooled, which directly results in increased thermal efficiency. Furthermore, engine operation in a Controlled Auto-Ignition (CAI) mode at high compression ratios (CR) raises the possibility of reaching even higher efficiencies and very low emissions. In order to control CAI combustion, such measures as VCR and water injection were considered for indirect ignition timing control. Numerical simulations of the scavenging and combustion processes were performed with the multipurpose 3D CFD AVL Fire solver. Numerous cases were calculated with different engine compression ratios and different amounts of directly and indirectly injected water. The influence of the VCR and water injection on the ignition timing and engine performance was determined and their application in a real engine was discussed.

  9. CFD Analysis and Design Optimization Using Parallel Computers

    Science.gov (United States)

    Martinelli, Luigi; Alonso, Juan Jose; Jameson, Antony; Reuther, James

    1997-01-01

    A versatile and efficient multi-block method is presented for the simulation of both steady and unsteady flow, as well as aerodynamic design optimization of complete aircraft configurations. The compressible Euler and Reynolds Averaged Navier-Stokes (RANS) equations are discretized using a high resolution scheme on body-fitted structured meshes. An efficient multigrid implicit scheme is implemented for time-accurate flow calculations. Optimum aerodynamic shape design is achieved at very low cost using an adjoint formulation. The method is implemented on parallel computing systems using the MPI message passing interface standard to ensure portability. The results demonstrate that, by combining highly efficient algorithms with parallel computing, it is possible to perform detailed steady and unsteady analysis as well as automatic design for complex configurations using the present generation of parallel computers.

  10. Automatic behaviour analysis system for honeybees using computer vision

    DEFF Research Database (Denmark)

    Tu, Gang Jun; Hansen, Mikkel Kragh; Kryger, Per

    2016-01-01

    The system runs on a low-cost embedded computer with very limited computational resources compared to an ordinary PC. It succeeds in counting honeybees, identifying their position and measuring their in-and-out activity. Our algorithm uses a background subtraction method to segment the images; after the segmentation stage, the methods are primarily based on statistical analysis and inference. The regression statistics (i.e., R2) of the comparisons of system predictions and manual counts are 0.987 for counting honeybees, and 0.953 and 0.888 for measuring in-activity and out-activity, respectively. The experimental results demonstrate that this system can be used as a tool to detect the behaviour of honeybees and assess their state at the beehive entrance. The computation times also show that the Raspberry Pi is a viable platform for such a real-time video processing system.
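
    The background-subtraction-and-counting idea can be sketched as follows in Python; the frame source, learning rate and thresholds are illustrative assumptions, not the authors' tuned values.

    ```python
    # Running-average background model, frame differencing, blob counting.
    import numpy as np
    from scipy import ndimage

    def count_bees(frames, alpha=0.05, diff_thresh=30, min_area=20):
        """Return per-frame object counts from a sequence of grayscale frames."""
        background = frames[0].astype(float)
        counts = []
        for frame in frames[1:]:
            diff = np.abs(frame.astype(float) - background)
            mask = diff > diff_thresh                  # foreground pixels
            labels, n = ndimage.label(mask)
            areas = np.bincount(labels.ravel())[1:]    # size of each blob
            counts.append(int((areas >= min_area).sum()))
            # Update the background model slowly (running average).
            background = (1 - alpha) * background + alpha * frame
        return counts

    # Usage with synthetic frames:
    rng = np.random.default_rng(1)
    frames = [rng.integers(0, 20, (48, 48)) for _ in range(5)]
    frames[3][10:16, 10:16] += 100                     # a fake bee appears
    print(count_bees(frames))
    ```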

  11. wolfPAC: building a high-performance distributed computing network for phylogenetic analysis using 'obsolete' computational resources.

    Science.gov (United States)

    Reeves, Patrick A; Friedman, Philip H; Richards, Christopher M

    2005-01-01

    wolfPAC is an AppleScript-based software package that facilitates the use of numerous, remotely located Macintosh computers to perform computationally-intensive phylogenetic analyses using the popular application PAUP* (Phylogenetic Analysis Using Parsimony). It has been designed to utilise readily available, inexpensive processors and to encourage sharing of computational resources within the worldwide phylogenetics community.

  12. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  13. Emerging Trends and Statistical Analysis in Computational Modeling in Agriculture

    Directory of Open Access Journals (Sweden)

    Sunil Kumar

    2015-03-01

    In this paper the authors describe emerging trends in computational modelling used in agriculture. Agricultural computational modelling, which uses intelligence techniques to estimate agricultural output from minimal input data, is gaining momentum because it saves time by cutting down multi-locational field trials as well as labour and other inputs. Development of locally suitable integrated farming systems (IFS) is the utmost need of the day, particularly in India, where about 95% of farms are small and marginal holdings. Optimizing the size and number of the various enterprises in an IFS model for a particular agro-climate is an essential component of research to sustain agricultural productivity, not only to feed the country's burgeoning population but also to enhance nutritional security, farm returns and quality of life. The paper reviews the literature on emerging trends in computational modelling applied to agriculture in order to explain its mechanisms, behaviour and applications. Computational modelling is increasingly effective for designing and analysing systems; it is an important tool for analysing the effects of different scenarios of climate and management options on farming systems and their interactions. The authors also highlight applications of computational modelling in integrated farming systems, crops, weather, soil, climate, horticulture and statistics used in agriculture, which can guide agricultural researchers and the rural farming community in replacing some traditional techniques.

  14. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    Science.gov (United States)

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues, but profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visualization-based analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visualization-based approach is effective in identifying trends and anomalies of the systems.
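
    The similarity computation underlying such behavioral lines can be sketched as pairwise distances between normalized per-node profile series; the synthetic profiles below stand in for sampled CPU/memory/network data.

    ```python
    # Pairwise behavioral similarity between compute nodes.
    import numpy as np

    rng = np.random.default_rng(4)
    # 6 nodes x 100 time steps x 3 metrics (e.g., CPU, memory, network).
    profiles = rng.normal(size=(6, 100, 3))
    profiles[3:] += np.linspace(0, 2, 100)[None, :, None]   # anomalous trend

    # z-normalize per node and metric, then flatten to one vector per node.
    mean = profiles.mean(axis=1, keepdims=True)
    std = profiles.std(axis=1, keepdims=True)
    vectors = ((profiles - mean) / std).reshape(6, -1)

    # Pairwise Euclidean distances between node behaviors; similar nodes
    # (small distances) would be laid out as neighboring lines.
    dist = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=-1)
    print(np.round(dist, 1))
    ```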

  15. Assessing computer waste generation in Chile using material flow analysis.

    Science.gov (United States)

    Steubing, Bernhard; Böni, Heinz; Schluep, Mathias; Silva, Uca; Ludwig, Christian

    2010-03-01

    The quantities of e-waste are expected to increase sharply in Chile. The purpose of this paper is to provide a quantitative data basis on generated e-waste quantities. A material flow analysis was carried out assessing the generation of e-waste from computer equipment (desktop and laptop PCs as well as CRT and LCD-monitors). Import and sales data were collected from the Chilean Customs database as well as from publications by the International Data Corporation. A survey was conducted to determine consumers' choices with respect to storage, re-use and disposal of computer equipment. The generation of e-waste was assessed in a baseline as well as upper and lower scenarios until 2020. The results for the baseline scenario show that about 10,000 and 20,000 tons of computer waste may be generated in the years 2010 and 2020, respectively. The cumulative e-waste generation will be four to five times higher in the upcoming decade (2010-2019) than during the current decade (2000-2009). By 2020, the shares of LCD-monitors and laptops will increase more rapidly replacing other e-waste including the CRT-monitors. The model also shows the principal flows of computer equipment from production and sale to recycling and disposal. The re-use of computer equipment plays an important role in Chile. An appropriate recycling scheme will have to be introduced to provide adequate solutions for the growing rate of e-waste generation.
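
    The sales-plus-lifetime logic behind such a material flow analysis can be sketched as a delay model: units sold in year t become waste after a lifetime drawn from a distribution. All figures below (sales, lifetime probabilities, unit mass) are invented placeholders, not the paper's Chilean data.

    ```python
    # Delay model: waste(t) = sum over k of sales(t - k) * P(lifetime = k).
    import numpy as np

    years = np.arange(2000, 2021)
    sales = np.linspace(50_000, 400_000, len(years))   # hypothetical units/year

    # Probability that a device becomes waste k years after sale.
    lifetime_pmf = np.array([0.0, 0.05, 0.10, 0.20, 0.25, 0.20, 0.15, 0.05])

    waste_units = np.zeros(len(years))
    for t, sold in enumerate(sales):
        for k, p in enumerate(lifetime_pmf):
            if t + k < len(years):
                waste_units[t + k] += sold * p

    mass_per_unit_kg = 9.9                              # assumed average mass
    waste_tons = waste_units * mass_per_unit_kg / 1000
    for idx in (10, 20):
        print(int(years[idx]), round(float(waste_tons[idx])), "t")
    ```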

  16. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  17. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  18. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    Science.gov (United States)

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.

    2017-04-01

    Computational singular perturbation (CSP) is a useful method for the analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at the micro- or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.

  19. Towards Advanced Data Analysis by Combining Soft Computing and Statistics

    CERN Document Server

    Gil, María; Sousa, João; Verleysen, Michel

    2013-01-01

    Soft computing, as an engineering science, and statistics, as a classical branch of mathematics, emphasize different aspects of data analysis. Soft computing focuses on obtaining working solutions quickly, accepting approximations and unconventional approaches. Its strength lies in its flexibility to create models that suit the needs arising in applications. In addition, it emphasizes the need for intuitive and interpretable models, which are tolerant to imprecision and uncertainty. Statistics is more rigorous and focuses on establishing objective conclusions based on experimental data by analyzing the possible situations and their (relative) likelihood. It emphasizes the need for mathematical methods and tools to assess solutions and guarantee performance. Combining the two fields enhances the robustness and generalizability of data analysis methods, while preserving the flexibility to solve real-world problems efficiently and intuitively.

  20. Computers in activation analysis and gamma-ray spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Carpenter, B. S.; D'Agostino, M. D.; Yule, H. P. [eds.]

    1979-01-01

    Seventy-three papers are included under the following session headings: analytical and mathematical methods for data analysis; software systems for gamma-ray and x-ray spectrometry; gamma-ray spectra treatment, peak evaluation; least squares; IAEA intercomparison of methods for processing spectra; computer and calculator utilization in spectrometer systems; and applications in safeguards, fuel scanning, and environmental monitoring. Separate abstracts were prepared for 72 of those papers. (DLC)

  1. Analysis of diabetic retinopathy biomarker VEGF gene by computational approaches

    OpenAIRE

    Jayashree Sadasivam; Ramesh, N.; K. Vijayalakshmi; Vinni Viridi; Shiva prasad

    2012-01-01

    Diabetic retinopathy, the most common diabetic eye disease, is caused by changes in the blood vessels of the retina and remains a major cause of blindness. It is characterized by vascular permeability and increased tissue ischemia and angiogenesis. The Vascular Endothelial Growth Factor (VEGF) gene has been identified as one biomarker for diabetic retinopathy by computational analysis. VEGF is a sub-family of growth factors, belonging to the platelet-derived growth factor family of cystine-knot growth factors...

  2. Vector Field Visual Data Analysis Technologies for Petascale Computational Science

    Energy Technology Data Exchange (ETDEWEB)

    Garth, Christoph; Deines, Eduard; Joy, Kenneth I.; Bethel, E. Wes; Childs, Hank; Weber, Gunther; Ahern, Sean; Pugmire, Dave; Sanderson, Allen; Johnson, Chris

    2009-11-13

    State-of-the-art computational science simulations generate large-scale vector field data sets. Visualization and analysis is a key aspect of obtaining insight into these data sets and represents an important challenge. This article discusses possibilities and challenges of modern vector field visualization and focuses on methods and techniques developed in the SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) and deployed in the open-source visualization tool, VisIt.

  3. Comparative Analysis of the Techniques Used by Men's Badminton Doubles Players Cai Yun/Fu Haifeng and Zheng Zaicheng/Lee Yong Dae

    Institute of Scientific and Technical Information of China (English)

    HUANG Zhuo

    2012-01-01

    Using video observation and mathematical statistics, this article compares the technical characteristics of the Chinese men's doubles pair (Cai Yun/Fu Haifeng) and the South Korean men's doubles pair (Zheng Zaicheng/Lee Yong Dae) across six international badminton tournaments in 2011. The research shows that opponents have prepared tactics specifically against Fu Haifeng's power smash, reducing their use of lifts and similar shots and thereby weakening Fu Haifeng's attacking threat. At the net the two pairs differ little in average points won; notably, however, in net spins and drops the Chinese pair trails its opponents in shot quality, in variation of line and placement, and in anticipating the opponent's returns. Zheng Zaicheng/Lee Yong Dae show strong defensive counter-attacking awareness, with fast flat drives and blocks, deceptive placement and varied shot lines.

  4. Trend Analysis of the Brazilian Scientific Production in Computer Science

    Directory of Open Access Journals (Sweden)

    TRUCOLO, C. C.

    2014-12-01

    The growth in the volume and diversity of scientific information brings new challenges for understanding the reasons, the processes and the underlying forces that propel this growth. Such information can be used as the basis for developing strategies and public policies to improve education and innovation services. Trend analysis is one step in this direction. In this work, a trend analysis of the Brazilian scientific production of graduate programs in computer science is carried out to identify the main subjects studied by these programs, both overall and individually.

  5. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  6. Analysis and computation of microstructure in finite plasticity

    CERN Document Server

    Hackl, Klaus

    2015-01-01

    This book addresses the need for a fundamental understanding of the physical origin, the mathematical behavior, and the numerical treatment of models which include microstructure. Leading scientists present their efforts involving mathematical analysis, numerical analysis, computational mechanics, material modelling and experiment. The mathematical analyses are based on methods from the calculus of variations, while in the numerical implementation global optimization algorithms play a central role. The modeling covers all length scales, from the atomic structure up to macroscopic samples. The development of the models was guided by experiments on single crystals and polycrystals, and results are checked against experimental data.

  7. Computer Use, Confidence, Attitudes, and Knowledge: A Causal Analysis.

    Science.gov (United States)

    Levine, Tamar; Donitsa-Schmidt, Smadar

    1998-01-01

    Introduces a causal model which links measures of computer experience, computer-related attitudes, computer-related confidence, and perceived computer-based knowledge. The causal model suggests that computer use has a positive effect on perceived computer self-confidence, as well as on computer-related attitudes. Questionnaires were administered…

  8. Computer vision analysis of image motion by variational methods

    CERN Document Server

    Mitiche, Amar

    2014-01-01

    This book presents a unified view of image motion analysis under the variational framework. Variational methods, rooted in physics and mechanics, but appearing in many other domains, such as statistics, control, and computer vision, address a problem from an optimization standpoint, i.e., they formulate it as the optimization of an objective function or functional. The methods of image motion analysis described in this book use the calculus of variations to minimize (or maximize) an objective functional which transcribes all of the constraints that characterize the desired motion variables. The book addresses the four core subjects of motion analysis: Motion estimation, detection, tracking, and three-dimensional interpretation. Each topic is covered in a dedicated chapter. The presentation is prefaced by an introductory chapter which discusses the purpose of motion analysis. Further, a chapter is included which gives the basic tools and formulae related to curvature, Euler Lagrange equations, unconstrained de...

  9. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  10. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  12. A Lumped Computational Model for Sodium Sulfur Battery Analysis

    Science.gov (United States)

    Wu, Fan

    Due to the cost of materials and time-consuming testing procedures, development of new batteries is a slow and expensive practice. The purpose of this study is to develop a computational model and assess the capabilities of such a model designed to aid in the design process and control of sodium sulfur batteries. To this end, a transient lumped computational model derived from an integral analysis of the transport of species, energy and charge throughout the battery has been developed. The computation processes are coupled with the use of Faraday's law, and solutions for the species concentrations, electrical potential and current are produced in a time-marching fashion. Properties required for solving the governing equations are calculated and updated as a function of time based on the composition of each control volume. The proposed model is validated against multi-dimensional simulations and experimental results from the literature, and simulation results using the proposed model are presented and analyzed. The computational model and electrochemical model used to solve the equations for the lumped model are compared with similar ones found in the literature. The results obtained from the current model compare favorably with those from experiments and other models.
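
    The time-marching idea the abstract describes can be reduced to a minimal sketch: advance a species inventory with Faraday's law at each step. The constant current and initial sodium inventory below are illustrative assumptions; the actual model also updates energy, potential and properties per control volume.

    ```python
    # Euler time marching of a single species balance via Faraday's law.
    F = 96485.0            # Faraday constant, C/mol

    def discharge(n_na=2.0, current=10.0, dt=1.0, t_end=3600.0):
        """Track moles of sodium consumed by a constant discharge current."""
        t, history = 0.0, []
        while t < t_end and n_na > 0.0:
            dn = current * dt / F          # mol Na+ transferred this step (z = 1)
            n_na -= dn
            t += dt
            history.append((t, n_na))
        return history

    final_t, final_n = discharge()[-1]
    print(f"after {final_t:.0f} s: {final_n:.4f} mol Na remaining")
    ```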

  13. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource for developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and their spatial evolution to be followed over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that best matches experimental data. Our computational analysis produces for the first time a two-dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulation taking tissue movement into account and to investigate PD patterning hypotheses.

  14. Design and Implementation of CAI Courseware for Discrete Mathematics

    Institute of Scientific and Technical Information of China (English)

    YAN Fu; YUE Li-ming

    2001-01-01

    Although discrete mathematics is a foundational course for computer science students, it is abstract and difficult to understand. The authors developed computer-aided instruction (CAI) software for discrete mathematics to strengthen students' comprehension of the course. This article discusses the self-adaptive behaviour of the teaching software, mainly from the perspective of the hierarchical structure of its courseware.

  15. Computational analysis of RNA structures with chemical probing data.

    Science.gov (United States)

    Ge, Ping; Zhang, Shaojie

    2015-06-01

    RNAs play various roles, not only as the genetic code used to synthesize proteins, but also as direct participants in biological functions determined by their underlying high-order structures. Although many computational methods have been proposed for analyzing RNA structures, their accuracy and efficiency are limited, especially when applied to large RNAs and genome-wide data sets. Recently, advances in parallel sequencing and high-throughput chemical probing technologies have prompted the development of numerous new algorithms, which can incorporate auxiliary structural information obtained from those experiments. Their potential has been revealed by the secondary structure prediction of ribosomal RNAs and genome-wide ncRNA function annotation. In this review, the existing probing-directed computational methods for RNA secondary and tertiary structure analysis are discussed.

  16. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    Science.gov (United States)

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work has been reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer-aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system and operating parameters such as the mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate-to-steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on system performance has also been analysed. The multiparameter computer-aided design (CAD) software packages prepared are thus highly versatile and permit an optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, confirming the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed using the most up-to-date design correlations and computer software.
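
    The del factor itself is straightforward to compute numerically: it is the integral of the first-order thermal death rate k(T) = A exp(-E/(R T)) over the temperature-time profile, i.e. ln(N0/N). The sketch below uses assumed Arrhenius constants and a linear heating ramp, not values from the paper.

    ```python
    # Numerical del factor for an assumed heating profile.
    import numpy as np

    A = 1.0e36            # frequency factor, 1/s (assumed, spore-like magnitude)
    E = 2.83e5            # activation energy, J/mol (assumed)
    R = 8.314             # gas constant, J/(mol K)

    t = np.linspace(0.0, 1800.0, 1801)               # 30 min profile, 1 s steps
    T = 373.0 + (394.0 - 373.0) * t / t[-1]          # ramp 100 -> 121 deg C, in K

    k = A * np.exp(-E / (R * T))                     # specific death rate k(T)
    # Trapezoidal integration of k over time gives the del factor ln(N0/N).
    del_factor = float((((k[1:] + k[:-1]) / 2.0) * np.diff(t)).sum())
    print(f"del factor = {del_factor:.2f}")
    ```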

  17. Analysis and computational dissection of molecular signature multiplicity.

    Directory of Open Access Journals (Sweden)

    Alexander Statnikov

    2010-05-01

    Molecular signatures are computational or mathematical models created to diagnose disease and other phenotypes and to predict clinical outcomes and response to treatment. It is widely recognized that molecular signatures constitute one of the most important translational and basic science developments enabled by recent high-throughput molecular assays. A perplexing phenomenon that characterizes high-throughput data analysis is the ubiquitous multiplicity of molecular signatures. Multiplicity is a special form of data analysis instability in which different analysis methods used on the same data, or different samples from the same population lead to different but apparently maximally predictive signatures. This phenomenon has far-reaching implications for biological discovery and development of next generation patient diagnostics and personalized treatments. Currently the causes and interpretation of signature multiplicity are unknown, and several, often contradictory, conjectures have been made to explain it. We present a formal characterization of signature multiplicity and a new efficient algorithm that offers theoretical guarantees for extracting the set of maximally predictive and non-redundant signatures independent of distribution. The new algorithm identifies exactly the set of optimal signatures in controlled experiments and yields signatures with significantly better predictivity and reproducibility than previous algorithms in human microarray gene expression datasets. Our results shed light on the causes of signature multiplicity, provide computational tools for studying it empirically and introduce a framework for in silico bioequivalence of this important new class of diagnostic and personalized medicine modalities.

  18. Computing the surveillance error grid analysis: procedure and examples.

    Science.gov (United States)

    Kovatchev, Boris P; Wakeman, Christian A; Breton, Marc D; Kost, Gerald J; Louie, Richard F; Tran, Nam K; Klonoff, David C

    2014-07-01

    The surveillance error grid (SEG) analysis is a tool for analysis and visualization of blood glucose monitoring (BGM) errors, based on the opinions of 206 diabetes clinicians who rated 4 distinct treatment scenarios. Resulting from this large-scale inquiry is a matrix of 337 561 risk ratings, 1 for each pair of (reference, BGM) readings ranging from 20 to 580 mg/dl. The computation of the SEG is therefore complex and in need of automation. The SEG software introduced in this article automates the task of assigning a degree of risk to each data point for a set of measured and reference blood glucose values so that the data can be distributed into 8 risk zones. The software's 2 main purposes are to (1) distribute a set of BG Monitor data into 8 risk zones ranging from none to extreme and (2) present the data in a color coded display to promote visualization. Besides aggregating the data into 8 zones corresponding to levels of risk, the SEG computes the number and percentage of data pairs in each zone and the number/percentage of data pairs above/below the diagonal line in each zone, which are associated with BGM errors creating risks for hypo- or hyperglycemia, respectively. To illustrate the action of the SEG software we first present computer-simulated data stratified along error levels defined by ISO 15197:2013. This allows the SEG to be linked to this established standard. Further illustration of the SEG procedure is done with a series of previously published data, which reflect the performance of BGM devices and test strips under various environmental conditions. We conclude that the SEG software is a useful addition to the SEG analysis presented in this journal, developed to assess the magnitude of clinical risk from analytically inaccurate data in a variety of high-impact situations such as intensive care and disaster settings.
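
    The zone-assignment step the software automates can be sketched as a lookup into a risk matrix followed by binning. In the sketch below a random placeholder surface stands in for the published 337,561-rating matrix, and the eight zone cut points are assumed.

    ```python
    # Bin (reference, measured) pairs into eight risk zones.
    import numpy as np

    rng = np.random.default_rng(2)
    risk_matrix = rng.uniform(0.0, 4.0, size=(581, 581))  # placeholder surface

    zone_edges = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])  # assumed cuts

    def seg_zones(reference, measured):
        """Return a zone index 0..7 for each (reference, measured) pair."""
        scores = risk_matrix[np.asarray(reference), np.asarray(measured)]
        return np.digitize(scores, zone_edges)

    ref = rng.integers(20, 581, size=1000)     # reference readings, mg/dl
    bgm = np.clip(ref + rng.integers(-40, 41, size=1000), 0, 580)
    zones = seg_zones(ref, bgm)
    counts = np.bincount(zones, minlength=8)
    print("pairs per zone:", counts)
    print("below diagonal (BGM reads low):", int((bgm < ref).sum()))
    ```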

  19. Constraints on the Origin of Chondrules and CAIs from Short-Lived and Long-Lived Radionuclides

    Energy Technology Data Exchange (ETDEWEB)

    Kita, N T; Huss, G R; Tachibana, S; Amelin, Y; Nyquist, L E; Hutcheon, I D

    2005-10-24

    The high-time-resolution Pb-Pb ages and short-lived-nuclide-based relative ages for CAIs and chondrules are reviewed. The solar system started at 4567.2 ± 0.6 Ma, inferred from the high-precision Pb-Pb ages of CAIs. The time scales of CAIs (≤0.1 Myr), chondrules (1-3 Myr), and early asteroidal differentiation (≥3 Myr) inferred from 26Al relative ages are comparable to the time scales estimated from astronomical observations of young stars: protostars, classical T Tauri stars, and weak-lined T Tauri stars, respectively. Pb-Pb ages of chondrules also indicate that chondrule formation occurred within 1-3 Myr after CAIs. Mn-Cr isochron ages of chondrules are similar, lying within 2 Myr after CAI formation. Chondrules from different classes of chondrites show the same range of 26Al ages in spite of their different oxygen isotopes, indicating that chondrules formed in localized environments. The 26Al ages of chondrules in each chondrite class show a hint of correlation with their chemical compositions, which implies a process of elemental fractionation during chondrule-formation events.

  20. New Evidence for 26Al in CAI and Chondrules from Type 3 Ordinary Chondrites

    Science.gov (United States)

    Srinivasan, G.; Russell, S. S.; MacPherson, G. J.; Huss, G. R.; Wasserburg, G. J.

    1996-03-01

    We have known since 1976 that 26Al (t1/2 = 7.2 x 10^5 yr) was alive in the early solar system, at a level of (26Al/27Al)0 ~ 5 x 10^-5 in calcium-aluminum inclusions (CAIs). However, several outstanding questions remain. Little evidence for 26Al has been found in other chondritic material, and none has been found in differentiated meteorites. These results might imply that 26Al was heterogeneously distributed in the nebula or by mineralogic site in nebular dust, or they might reflect differences in time of formation. There are strict limitations on finding evidence of 26Al in normal chondrules with bulk Al/Mg ~ 0.1, since even quenched, perfectly preserved, late-stage glasses would have low Al/Mg. Primary plagioclase crystals provide the only possibility, but these crystallize only rarely in melts within the compositional range of normal chondrules. Also, metamorphism can erase the evidence in high-Al/Mg phases. To address these issues, we have conducted a search for chondrules and CAIs with high-Al/Mg phases suitable for ion-probe measurement in type 3 ordinary chondrites. Previous work has revealed evidence for 26Al in a plagioclase-bearing olivine-pyroxene clast from Semarkona (LL3.0; (26Al/27Al)0 = 7.7±2.1 x 10^-6), a plagioclase-rich object from Bovedy (L3.7?; 2.5±1.2 x 10^-7), in separated plagioclase from St. Marguerite (H4; 2.0±0.6 x 10^-7), an isolated hibonite grain from Dhajala (H3.8; 8.4±0.5 x 10^-6), and in Al2O3 and hibonite grains ((26Al/27Al)0 = 2-5 x 10^-5; [GRH, unpublished]) from acid residues of Semarkona, Bishunpur (LL3.1), and Krymka (LL3.1). We have identified and measured Al-Mg isotope systematics in two CAIs and seven chondrules from ordinary chondrites of low metamorphic grade and have found clear evidence for 26Al in both CAIs and in two chondrules.

  1. The Clinical Experiences of Dr.CAI Gan in Treating Chronic Constipation

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zheng-li; ZHU Mei-ping; LIU Qun; LEI Yun-xia

    2009-01-01

    Prof. CAI Gan (蔡淦) is an academic leader in the TCM treatment of spleen and stomach diseases. He maintains that liver depression, spleen deficiency and poor nourishment of the intestines are the core of the pathogenesis of chronic constipation. Therefore he often treats the disease by strengthening the spleen, relieving the depressed liver, nourishing yin and moistening the intestines. Meanwhile he attaches great importance to syndrome differentiation and comprehensive regulation and treatment. As a result, good therapeutic effects are often achieved. The authors have summarized his methods for treating chronic constipation into the following 10 approaches, which are introduced below.

  2. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating their merits and weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  3. Vortex dominated flows. Analysis and computation for multiple scale phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Ting, L. [New York Univ., NY (United States). Courant Inst. of Mathematical Sciences; Klein, R. [Freie Univ. Berlin (Germany). Fachbereich Mathematik und Informatik; Knio, O.M. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Mechanical Engineering

    2007-07-01

    This monograph provides in-depth analyses of vortex dominated flows via matched and multiscale asymptotics, and demonstrates how insight gained through these analyses can be exploited in the construction of robust, efficient, and accurate numerical techniques. The book explores the dynamics of slender vortex filaments in detail, including fundamental derivations, compressible core structure, weakly non-linear limit regimes, and associated numerical methods. Similarly, the volume covers asymptotic analysis and computational techniques for weakly compressible flows involving vortex-generated sound and thermoacoustics. The book is addressed to both graduate students and researchers. (orig.)

  4. Computational geometry assessment for morphometric analysis of the mandible.

    Science.gov (United States)

    Raith, Stefan; Varga, Viktoria; Steiner, Timm; Hölzle, Frank; Fischer, Horst

    2017-01-01

    This paper presents a fully automated algorithm for geometry assessment of the mandible. Anatomical landmarks could be reliably detected and distances were statistically evaluated with principal component analysis. The method makes it possible for the first time to generate a mean mandible shape, with statistically valid geometrical variations, based on a large set of 497 CT scans of human mandibles. The data may be used in bioengineering for designing novel oral implants, for planning computer-guided surgery, and for improving biomechanical models, as it is shown that commercially available mandible replicas differ significantly from the mean of the investigated population.

  5. Parameter estimation and error analysis in environmental modeling and computation

    Science.gov (United States)

    Kalmaz, E. E.

    1986-01-01

    A method for the estimation of parameters and error analysis in the development of nonlinear modeling for environmental impact assessment studies is presented. The modular computer program can interactively fit different nonlinear models to the same set of data, dynamically changing the error structure associated with observed values. Parameter estimation techniques and sequential estimation algorithms employed in parameter identification and model selection are first discussed. Then, least-square parameter estimation procedures are formulated, utilizing differential or integrated equations, and are used to define a model for association of error with experimentally observed data.
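
    The least-squares parameter estimation step described here can be sketched with scipy on a simple nonlinear decay model; the model form, synthetic data and the linearized error analysis are illustrative, not the program's actual formulation.

    ```python
    # Nonlinear least-squares fit plus a crude linearized error analysis.
    import numpy as np
    from scipy.optimize import least_squares

    def model(params, t):
        c0, k = params
        return c0 * np.exp(-k * t)              # simple environmental decay

    def residuals(params, t, observed):
        return model(params, t) - observed

    t = np.linspace(0.0, 10.0, 25)
    rng = np.random.default_rng(3)
    observed = model([5.0, 0.4], t) + rng.normal(0.0, 0.1, t.size)

    fit = least_squares(residuals, x0=[1.0, 1.0], args=(t, observed))
    print("estimated c0, k:", fit.x)

    # Error analysis from the residual Jacobian at the solution (linearized).
    dof = t.size - fit.x.size
    s2 = (fit.fun ** 2).sum() / dof
    cov = s2 * np.linalg.inv(fit.jac.T @ fit.jac)
    print("parameter standard errors:", np.sqrt(np.diag(cov)))
    ```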

  6. Computational issue in the analysis of adaptive control systems

    Science.gov (United States)

    Kosut, Robert L.

    1989-01-01

    Adaptive systems under slow parameter adaptation can be analyzed by the method of averaging. This provides a means to assess the stability (and instability) properties of most adaptive systems, either continuous-time or (more importantly for practice) discrete-time, as well as providing an estimate of the region of attraction. Although the method of averaging is conceptually straightforward, even simple examples are well beyond hand calculations. Specific software tools are proposed which can provide the basis for a user-friendly environment to perform the necessary computations involved in the averaging analysis.

  7. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structure analysis of biomacromolecules. The problem of reconstructing a picture from identical samples corrupted by colored noise is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.

  8. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  9. Analysis of Network Performance for Computer Communication Systems with Benchmark

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper introduces a performance evaluation approach for computer communication systems based on simulation and measurement technology, and discusses its evaluation models. The results of our experiment showed that the outcome of practical measurement on an Ethernet LAN fitted well with the theoretical analysis. The approach we present can be used to conveniently define various kinds of artificially simulated load models, build all kinds of network application environments in a flexible way, and fully exploit the widely used, high-precision features of traditional simulation technology together with the realism, reliability and adaptability of measurement technology.

  10. Plans for a sensitivity analysis of bridge-scour computations

    Science.gov (United States)

    Dunn, David D.; Smith, Peter N.

    1993-01-01

    Plans for an analysis of the sensitivity of Level 2 bridge-scour computations are described. Cross-section data from 15 bridge sites in Texas are modified to reflect four levels of field effort ranging from no field surveys to complete surveys. Data from United States Geological Survey (USGS) topographic maps will be used to supplement incomplete field surveys. The cross sections are used to compute the water-surface profile through each bridge for several T-year recurrence-interval design discharges. The effect of determining the downstream energy grade-line slope from topographic maps is investigated by systematically varying the starting slope of each profile. The water-surface profile analyses are then used to compute potential scour resulting from each of the design discharges. The planned results will be presented in the form of exceedance-probability versus scour-depth plots with the maximum and minimum scour depths at each T-year discharge presented as error bars.

  11. Applying DNA computation to intractable problems in social network analysis.

    Science.gov (United States)

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields, such as Web 2.0 for Web applications and product development in industry. However, some definitions of SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities, discussed in the paper, demonstrate that DNA computing can be used to facilitate the development of SNA.
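
    To see why these problems motivate massively parallel approaches, the brute-force sketch below enumerates all cliques of a small graph by testing every vertex subset, a computation whose cost grows as 2^n; the adjacency matrix is an illustrative example, and the DNA-computing encoding itself is not reproduced here.

        import itertools
        import numpy as np

        # Brute-force clique search: try every subset of vertices and keep the
        # fully connected ones. Exponential in n, hence infeasible for large networks.
        A = np.array([[0, 1, 1, 1, 0],
                      [1, 0, 1, 0, 0],
                      [1, 1, 0, 1, 0],
                      [1, 0, 1, 0, 1],
                      [0, 0, 0, 1, 0]])

        def is_clique(nodes):
            return all(A[i, j] for i, j in itertools.combinations(nodes, 2))

        n = A.shape[0]
        cliques = [c for k in range(2, n + 1)
                     for c in itertools.combinations(range(n), k) if is_clique(c)]
        print(cliques)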

  12. Computational analysis of light scattering from collagen fiber networks

    Science.gov (United States)

    Arifler, Dizem; Pavlova, Ina; Gillenwater, Ann; Richards-Kortum, Rebecca

    2007-07-01

    Neoplastic progression in epithelial tissues is accompanied by structural and morphological changes in the stromal collagen matrix. We used the Finite-Difference Time-Domain (FDTD) method, a popular computational technique for full-vector solution of complex problems in electromagnetics, to establish a relationship between structural properties of collagen fiber networks and light scattering, and to analyze how neoplastic changes alter stromal scattering properties. To create realistic collagen network models, we acquired optical sections from the stroma of fresh normal and neoplastic oral cavity biopsies using fluorescence confocal microscopy. These optical sections were then processed to construct three-dimensional collagen networks of different sizes as FDTD model input. Image analysis revealed that the volume fraction of collagen fibers in the stroma decreases with neoplastic progression, and computed statistical texture features suggest that fibers tend to be more disconnected in neoplastic stroma. The FDTD modeling results showed that neoplastic fiber networks have smaller scattering cross-sections compared to normal networks of the same size, whereas high-angle scattering probabilities tend to be higher for neoplastic networks. Characterization of stromal scattering is expected to provide a basis to better interpret spectroscopic optical signals and to develop more reliable computational models to describe photon propagation in epithelial tissues.
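
    As a minimal illustration of the FDTD update at the heart of such simulations, the sketch below advances a 1-D Yee-style leapfrog scheme in free space with a Gaussian source; it is a didactic reduction, in normalized units with a Courant number of 0.5, of the full-vector 3-D computations used in the study.

        import numpy as np

        # Minimal 1-D FDTD leapfrog update: H from the curl of E, then E from the
        # curl of H, with a soft Gaussian source injected at one grid point.
        nz, nt = 400, 600
        ez = np.zeros(nz)
        hy = np.zeros(nz)

        for t in range(nt):
            hy[:-1] += 0.5 * (ez[1:] - ez[:-1])       # update H field
            ez[1:]  += 0.5 * (hy[1:] - hy[:-1])       # update E field
            ez[nz // 4] += np.exp(-0.5 * ((t - 60) / 15.0) ** 2)  # Gaussian source

        print("peak |Ez| after propagation:", np.abs(ez).max())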

  13. A Type of Computer-Assisted Instruction.

    Science.gov (United States)

    Ruplin, Ferdinand A.; Russell, John R.

    1968-01-01

    The experimental use of computer assisted instruction (CAI) to replace conventional laboratory sessions for first-year German students at the State University of New York, Stony Brook, is described. Materials used in the program are outlined, including hardware, pre-text program, text program, student manual, and diagnostics. Pedagogical…

  14. Intelligent CAI.

    Science.gov (United States)

    1975-10-01


  15. Analysis of CERN computing infrastructure and monitoring data

    Science.gov (United States)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of the LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments collect a large multitude of logs and performance probes, which are already used successfully for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal of bringing together data sources from different services and different abstraction levels and of implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single-service boundaries and for the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting an efficient storage format for MapReduce and external access, and describes the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between the CPU/wall fraction, the latency/throughput constraints of network and disk, and the effective job throughput. In this contribution we first describe the design of the shared analysis infrastructure and then present a summary of the first analysis results from the combined data sources.
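
    A hedged sketch of the kind of aggregation such a repository enables is shown below, computing per-site CPU/wall fractions with pandas; the field names and values are hypothetical and do not reflect CERN's actual monitoring schema.

        import pandas as pd

        # Hypothetical job-monitoring records; column names are illustrative only.
        jobs = pd.DataFrame({
            "site":   ["T1_A", "T1_A", "T2_B", "T2_B"],
            "cpu_s":  [3600, 2400, 1800, 2700],
            "wall_s": [4000, 4800, 2000, 5400],
        })
        jobs["cpu_wall_fraction"] = jobs["cpu_s"] / jobs["wall_s"]

        # Aggregate per site, as a cleaned repository table might be built.
        summary = jobs.groupby("site")["cpu_wall_fraction"].agg(["mean", "min", "max"])
        print(summary)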

  16. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...
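
    As a rough illustration of extracting a contraction frequency from a 'Motility Plot'-like signal, the sketch below recovers the dominant frequency of a synthetic diameter-versus-time trace with an FFT; the sampling rate, frequency and noise level are illustrative assumptions, not values from the thesis.

        import numpy as np

        # Toy motility trace: diameter at one gut position sampled over time;
        # the contraction frequency is recovered as the dominant FFT peak.
        fs = 2.0                        # frames per second
        t = np.arange(0, 120, 1 / fs)   # 2 minutes of imaging
        freq_true = 0.05                # Hz, ~3 contractions per minute
        signal = 1.0 + 0.2 * np.sin(2 * np.pi * freq_true * t)
        signal += 0.05 * np.random.default_rng(3).standard_normal(t.size)

        spec = np.abs(np.fft.rfft(signal - signal.mean()))
        freqs = np.fft.rfftfreq(t.size, d=1 / fs)
        print("estimated contraction frequency (Hz):", freqs[spec.argmax()])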

  17. Dynamic analysis of spur gears using computer program DANST

    Science.gov (United States)

    Oswald, Fred B.; Lin, Hsiang H.; Liou, Chuen-Huei; Valco, Mark J.

    1993-06-01

    DANST is a computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the effect on dynamic load and tooth bending stress of spur gears due to operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratio ranging from one to three. It was designed to be easy to use, and it is extensively documented by comments in the source code. This report describes the installation and use of DANST. It covers input data requirements and presents examples. The report also compares DANST predictions for gear tooth loads and bending stress to experimental and finite element results.

  18. Critical Data Analysis Precedes Soft Computing Of Medical Data

    DEFF Research Database (Denmark)

    Keyserlingk, Diedrich Graf von; Jantzen, Jan; Berks, G.

    2000-01-01

    Five factors were extracted, with different relationships (loadings) to the symptoms. Although the factors were gained only by computation, they seemed to express some modular features of the language disturbances. This phenomenon, that factors represent superior aspects of the data, is well known in factor analysis. Factor I mediates the overall severity of the disturbance; factor II points to the expressive versus comprehensive character of the language disorder; factor III represents the granularity of the phonetic mistakes; factor IV accentuates the patient's awareness of his disease; and factor V exposes the deficits in communication. Sets of symptoms corresponding to the traditional symptoms in Broca and Wernicke aphasia may be represented in the factors, but a factor itself does not represent a syndrome. It is assumed that this kind of data analysis shows a new approach to the understanding of language...
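
    A minimal sketch of the underlying technique, assuming synthetic data rather than the clinical scores used in the study, is shown below using scikit-learn's FactorAnalysis; the five-factor clinical structure described above is not reproduced.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        # Synthetic 'symptom scores' (patients x symptoms) driven by two hidden traits.
        rng = np.random.default_rng(4)
        latent = rng.standard_normal((100, 2))         # two hidden severity traits
        loadings = rng.standard_normal((2, 8))         # how traits map to 8 symptoms
        scores = latent @ loadings + 0.3 * rng.standard_normal((100, 8))

        fa = FactorAnalysis(n_components=2).fit(scores)
        print(fa.components_.round(2))  # estimated loadings of each factor on each symptom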

  19. Analysis on Phase Transformation (ATP) Using Computational Thermal Principles (CTP)

    Institute of Scientific and Technical Information of China (English)

    N.Alagurmurthi; K.Palaniradja; V. Soundararajan

    2004-01-01

    Computer analysis based on computational thermal principles to predict the transformation kinetics in steels at varying temperatures is of great practical importance in different areas of heat treatment. As a result, using the theory of transient-state heat conduction with convective boundary conditions, an efficient program named "ATP" (Analysis on Phase Transformation) has been developed to determine the temperature distribution under different quenching conditions for different geometries such as plates, cylinders and spheres. In addition, the microstructures and the corresponding hardness developed during quenching are predicted using the Time Temperature Transformation (TTT) diagram incorporated in the analysis. To validate the work, dilation curves, Heisler charts and time-temperature history curves have been generated. This paper deals with the basic objective of the program (ATP), the determination of temperature, microstructure and hardness distributions, and also includes an online prediction of the austenite-pearlite and austenite-martensite transformations in steels along with the corresponding retained fractions. The quenching of a cylinder in gases, liquids and liquid metals is analyzed to show the non-linear effect of cylinder diameter on the temperature and microstructures. Further, a typical 1080 steel cylinder quenched in water is considered for predicting and comparing the program results with experimental values, and the approach can be extended to other grades of steel. The numerical results of the program are found to be in good agreement with the experimental data obtained. Finally, the quenching process analysis described in the study appears to be a promising tool for the design of heat-treatment process parameters for steels.
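
    The class of computation behind such a program can be sketched with an explicit finite-difference solution of transient conduction in a quenched slab with a convective surface, as below; the material properties, geometry and quench conditions are illustrative assumptions, not those used in ATP.

        import numpy as np

        # Explicit finite-difference transient conduction in a quenched slab.
        # Property values are illustrative, not those of 1080 steel.
        L, n = 0.02, 21                         # half-thickness (m), nodes
        alpha, k_c, h = 1e-5, 40.0, 1000.0      # diffusivity, conductivity, film coeff.
        dx = L / (n - 1)
        dt = 0.4 * dx**2 / alpha                # stable explicit time step
        T = np.full(n, 850.0)                   # initial temperature (C)
        T_inf = 30.0                            # quenchant temperature (C)

        for _ in range(2000):
            Tn = T.copy()
            Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
            Tn[0] = Tn[1]                                        # symmetry at midplane
            Tn[-1] = (k_c / dx * Tn[-2] + h * T_inf) / (k_c / dx + h)  # convective surface
            T = Tn

        print("midplane and surface temperature:", T[0], T[-1])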

  20. The CDF Computing and Analysis System:First Experience

    Institute of Scientific and Technical Information of China (English)

    Rick Colombo; Fedor Ratnikov; et al.

    2001-01-01

    The Collider Detector at Fermilab (CDF) collaboration records and analyses proton anti-proton interactions with a center-of-mass energy of 2 TeV at the Tevatron. A new collider run, Run II, of the Tevatron started in April. During its more than two-year duration the CDF experiment expects to record about 1 PetaByte of data. With its multi-purpose detector and center-of-mass energy at the frontier, the experimental program is large and versatile. The over 500 scientists of CDF will engage in searches for new particles, like the Higgs boson or supersymmetric particles, precision measurements of electroweak parameters, like the mass of the W boson, measurements of top quark parameters, and a large spectrum of B physics. The experiment has taken data and analysed them in previous runs. For Run II, however, the computing model was changed to incorporate new methodologies, the file format was switched, and both the data handling and analysis systems were redesigned to cope with the increased demands. This paper (4-036 at CHEP 2001) gives an overview of the CDF Run II computing system with emphasis on areas where the current system does not match initial estimates and projections. For the data handling and analysis system a more detailed description is given.

  1. Wind energy conversion system analysis model (WECSAM) computer program documentation

    Energy Technology Data Exchange (ETDEWEB)

    Downey, W T; Hendrick, P L

    1982-07-01

    Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation. Thus, any user-supplied data for WECS performance, application load, utility rates, or wind resource may be entered into the scratch file to override the default data-base value. After the model and the inputs required from the user and derived from the data base are described, the model output and the various output options that can be exercised by the user are detailed. The general operation is set forth and suggestions are made for efficient modes of operation. Sample listings of various input, output, and data-base files are appended. (LEW)

  2. Computational Particle Physics for Event Generators and Data Analysis

    CERN Document Server

    Perret-Gallix, Denis

    2013-01-01

    High-energy physics data analysis relies heavily on the comparison between experimental and simulated data, as stressed lately by the Higgs search at the LHC and the recent identification of a Higgs-like new boson. The first link in the full simulation chain is event generation, both for background and for expected signals. Nowadays event generators are based on the automatic computation of the matrix element or amplitude for each process of interest. Moreover, recent analysis techniques based on the matrix element likelihood method assign probabilities for every event to belong to any of a given set of possible processes. This method, originally used for the top mass measurement, although computing-intensive, has shown its power at the LHC in extracting the new boson signal from the background. Serving both needs, the automatic calculation of matrix elements is therefore more than ever of prime importance for particle physics. Initiated in the eighties, the techniques have matured for the lowest order calculations (tree-le...

  3. Ca-Fe and Alkali-Halide Alteration of an Allende Type B CAI: Aqueous Alteration in Nebular or Asteroidal Settings

    Science.gov (United States)

    Ross, D. K.; Simon, J. I.; Simon, S. B.; Grossman, L.

    2012-01-01

    Ca-Fe and alkali-halide alteration of CAIs is often attributed to aqueous alteration by fluids circulating on asteroidal parent bodies after the various chondritic components have been assembled, although debate continues about the roles of asteroidal vs. nebular modification processes [1-7]. Here we report detailed observations of alteration products in a large Type B2 CAI, TS4 from Allende, one of the oxidized subgroup of CV3s, and propose a speculative model for aqueous alteration of CAIs in a nebular setting. Ca-Fe alteration in this CAI consists predominantly of end-member hedenbergite, end-member andradite, and compositionally variable, magnesian high-Ca pyroxene. These phases are strongly concentrated in an unusual "nodule" enclosed within the interior of the CAI (Fig. 1). The Ca-, Fe-rich nodule superficially resembles a clast that pre-dated and was engulfed by the CAI, but closer inspection shows that relic spinel grains are enclosed in the nodule, and corroded CAI primary phases interfinger with the Fe-rich phases at the nodule's margins. This CAI also contains abundant sodalite and nepheline (alkali-halide) alteration that occurs around the rims of the CAI but also penetrates more deeply into it. The two types of alteration (Ca-Fe and alkali-halide) are adjacent, and very fine-grained Fe-rich phases are associated with sodalite-rich regions. Both types of alteration appear to be replacive; if that is true, it would require substantial introduction of Fe, transport of elements (Ti, Al and Mg) out of the nodule, and introduction of Na and Cl into the alkali-halide-rich zones. Parts of the CAI have been extensively metasomatized.

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  5. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow, Rutgers University/Georgia Institute of Technology, Michael Schatz, Georgia Institute of Technology, William Kalies, Florida Atlantic University, Thomas Wanner,George Mason University

    2010-05-19

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields from complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure

  7. Caiçaras, caboclos and natural resources: rules and scale patterns

    Directory of Open Access Journals (Sweden)

    Alpina Begossi

    1999-12-01

    One important question concerning the sustainability of local or native populations refers to their interaction with local and global institutions. We should expect that populations with the capacity to interact economically and politically with institutions show a better chance of ecological and cultural continuity, as well as of maintaining their systems of trade and subsistence. Following concepts from ecology, the ecological and social interactions of local populations occur on different scales: for example, from the territories of individual fishermen on the Atlantic Forest coast to organized community Extractive Reserves in the Amazon. The scale of organization (individual/family/community) may influence the capacity to deal with institutions. This study analyses how Brazilian native populations, especially caiçaras of the Atlantic Forest coast and caboclos of the Amazon, have interacted with regional, national and global institutions concerning environmental demands. Concepts such as common management, natural capital, resilience and sustainability are useful for understanding these illustrative cases.

  8. Computational analysis of bacterial RNA-Seq data.

    Science.gov (United States)

    McClure, Ryan; Balasubramanian, Divya; Sun, Yan; Bobrovskyy, Maksym; Sumby, Paul; Genco, Caroline A; Vanderpool, Carin K; Tjaden, Brian

    2013-08-01

    Recent advances in high-throughput RNA sequencing (RNA-seq) have enabled tremendous leaps forward in our understanding of bacterial transcriptomes. However, computational methods for analysis of bacterial transcriptome data have not kept pace with the large and growing data sets generated by RNA-seq technology. Here, we present new algorithms, specific to bacterial gene structures and transcriptomes, for analysis of RNA-seq data. The algorithms are implemented in an open source software system called Rockhopper that supports various stages of bacterial RNA-seq data analysis, including aligning sequencing reads to a genome, constructing transcriptome maps, quantifying transcript abundance, testing for differential gene expression, determining operon structures and visualizing results. We demonstrate the performance of Rockhopper using 2.1 billion sequenced reads from 75 RNA-seq experiments conducted with Escherichia coli, Neisseria gonorrhoeae, Salmonella enterica, Streptococcus pyogenes and Xenorhabdus nematophila. We find that the transcriptome maps generated by our algorithms are highly accurate when compared with focused experimental data from E. coli and N. gonorrhoeae, and we validate our system's ability to identify novel small RNAs, operons and transcription start sites. Our results suggest that Rockhopper can be used for efficient and accurate analysis of bacterial RNA-seq data, and that it can aid with elucidation of bacterial transcriptomes.

  9. The future of computer-aided sperm analysis

    Directory of Open Access Journals (Sweden)

    Sharon T Mortimer

    2015-01-01

    Computer-aided sperm analysis (CASA) technology was developed in the late 1980s for analyzing sperm movement characteristics or kinematics, and has been highly successful in enabling this field of research. CASA has also been used with great success for measuring semen characteristics such as sperm concentration and proportions of progressive motility in many animal species, including wide application in domesticated animal production laboratories and reproductive toxicology. However, attempts to use CASA for human clinical semen analysis have largely met with poor success due to the inherent difficulties presented by many human semen samples, caused by sperm clumping and heavy background debris that, until now, have precluded accurate digital image analysis. The authors review the improved capabilities of two modern CASA platforms (Hamilton Thorne CASA-II and Microptic SCA6) and consider their current and future applications, with particular reference to directing our focus towards using this technology to assess functional rather than simple descriptive characteristics of spermatozoa. Specific requirements for validating CASA technology as a semi-automated system for human semen analysis are also provided, with particular reference to the accuracy and uncertainty of measurement expected of a robust medical laboratory test for implementation in clinical laboratories operating according to modern accreditation standards.

  10. Computer-Assisted Intervention for Children with Low Numeracy Skills

    Science.gov (United States)

    Rasanen, Pekka; Salminen, Jonna; Wilson, Anna J.; Aunio, Pirjo; Dehaene, Stanislas

    2009-01-01

    We present results of a computer-assisted intervention (CAI) study on number skills in kindergarten children. Children with low numeracy skill (n = 30) were randomly allocated to two treatment groups. The first group played a computer game (The Number Race) which emphasized numerical comparison and was designed to train number sense, while the…

  11. Nutritional status and menu adequacy in the CAIs of Villa Gesell

    OpenAIRE

    2014-01-01

    The Centros de Atención Integral (CAI) are units belonging to PROMIN that combine pedagogical and stimulation services with supplementary feeding in areas with a high concentration of structural poverty. Objectives: to evaluate the nutritional status of the children attending the CAIs of Villa Gesell and the adequacy of the menu provided there to their nutritional needs. Materials and methods: anthropometric measurements were taken, such as...

  12. Gender Differences in Computer-Related Attitudes and Behavior: A Meta-Analysis.

    Science.gov (United States)

    Whitley, Bernard E., Jr.

    1997-01-01

    A meta-analysis of studies of gender differences in computer attitudes and behavior found that males exhibited greater sex-role stereotyping of computers, higher computer self-efficacy, and more positive attitudes toward computers than females. Most differences in attitudes and behavior were small, with the largest found in high school students.…

  13. Computational Analysis on Stent Geometries in Carotid Artery: A Review

    Science.gov (United States)

    Paisal, Muhammad Sufyan Amir; Taib, Ishkrizat; Ismail, Al Emran

    2017-01-01

    This paper reviews the work done by previous researchers in order to gather the information for the current study, which concerns the computational analysis of stent geometries in the carotid artery. The implantation of stents in the carotid artery has become a popular treatment for arterial diseases such as stenosis, thrombosis, atherosclerosis and embolization, reducing the rates of mortality and morbidity. For the stenting of an artery, previous researchers built many types of mathematical models in which the physiological variables of the artery are analogized to electrical variables, enabling computational fluid dynamics (CFD) analyses of the artery, which previous researchers have also carried out. This leads to the current study's aim of finding the hemodynamic characteristics due to artery stenting, such as the wall shear stress (WSS) and wall shear stress gradient (WSSG). Another objective of this study is to evaluate present-day stent configurations for full optimization in reducing arterial side effects such as the restenosis rate in the weeks after stenting. The evaluation of a stent is based on decreasing the strut-strut intersections, decreasing the strut width and increasing the strut-strut spacing. Existing stent configurations are good enough at widening the narrowed arterial wall, but diseases such as thrombosis still occur in the early and late stages after stent implantation. Thus, the outcome of this study is a prediction of the reduction of the restenosis rate, and the WSS distribution is expected to classify which stent configuration is best.
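
    As a sanity-check companion to such CFD analyses, the sketch below evaluates the Poiseuille estimate of wall shear stress, tau = 4*mu*Q/(pi*r^3), for a few lumen radii; the viscosity, flow rate and radii are illustrative assumptions.

        import numpy as np

        # Back-of-envelope wall shear stress from Poiseuille flow; values illustrative.
        mu = 3.5e-3        # blood viscosity, Pa*s
        Q = 6.0e-6         # volumetric flow rate, m^3/s (~360 mL/min, carotid-like)
        for r_mm in (3.0, 2.5, 2.0):   # healthy vs progressively stenosed radius
            r = r_mm * 1e-3
            tau = 4 * mu * Q / (np.pi * r**3)
            print(f"r = {r_mm} mm -> WSS = {tau:.2f} Pa")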

  14. Novel computational approaches for the analysis of cosmic magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Saveliev, Andrey [Universitaet Hamburg, Hamburg (Germany); Keldysh Institut, Moskau (Russian Federation)

    2016-07-01

    In order to give a consistent picture of cosmic, i.e. galactic and extragalactic, magnetic fields, different approaches are possible and often even necessary. Here we present three of them: First, a semianalytic analysis of the time evolution of primordial magnetic fields, from which their properties and, subsequently, the nature of present-day intergalactic magnetic fields may be deduced. Second, the use of high-performance computing infrastructure, by developing powerful algorithms for (magneto-)hydrodynamic simulations and applying them to astrophysical problems. We are currently developing a code which applies kinetic schemes in massively parallel computing on high-performance multiprocessor systems in a new way, to calculate both hydro- and electrodynamic quantities. Finally, as a third approach, astroparticle physics might be used, as magnetic fields leave imprints of their properties on charged particles traversing them. Here we focus on electromagnetic cascades by developing software based on CRPropa which simulates the propagation of particles from such cascades through the intergalactic medium in three dimensions. This may in particular be used to obtain information about the helicity of extragalactic magnetic fields.

  15. Computational Approach to Dendritic Spine Taxonomy and Shape Transition Analysis

    Science.gov (United States)

    Bokota, Grzegorz; Magnowska, Marta; Kuśmierczyk, Tomasz; Łukasik, Michał; Roszkowska, Matylda; Plewczynski, Dariusz

    2016-01-01

    The common approach in morphological analysis of dendritic spines of mammalian neuronal cells is to categorize spines into subpopulations based on whether they are stubby, mushroom, thin, or filopodia shaped. The corresponding cellular models of synaptic plasticity, long-term potentiation, and long-term depression associate the synaptic strength with either spine enlargement or spine shrinkage. Although a variety of automatic spine segmentation and feature extraction methods were developed recently, no approaches allowing for an automatic and unbiased distinction between dendritic spine subpopulations and detailed computational models of spine behavior exist. We propose an automatic and statistically based method for the unsupervised construction of spine shape taxonomy based on arbitrary features. The taxonomy is then utilized in the newly introduced computational model of behavior, which relies on transitions between shapes. Models of different populations are compared using supplied bootstrap-based statistical tests. We compared two populations of spines at two time points. The first population was stimulated with long-term potentiation, and the other in the resting state was used as a control. The comparison of shape transition characteristics allowed us to identify the differences between population behaviors. Although some extreme changes were observed in the stimulated population, statistically significant differences were found only when whole models were compared. The source code of our software is freely available for non-commercial use. Contact: d.plewczynski@cent.uw.edu.pl. PMID:28066226
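
    A minimal sketch of unsupervised shape grouping in this spirit, assuming synthetic two-feature data rather than the paper's spine descriptors or its bootstrap tests, is shown below using Gaussian mixtures with BIC-based model selection.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # Synthetic spine 'shape features'; feature names are hypothetical.
        rng = np.random.default_rng(5)
        # columns: head diameter, neck length (arbitrary units), two clusters
        features = np.vstack([rng.normal([1.0, 0.4], 0.1, (50, 2)),   # mushroom-like
                              rng.normal([0.4, 1.2], 0.1, (50, 2))])  # thin-like

        # Fit mixtures with 1-4 components and keep the one with the lowest BIC.
        best = min((GaussianMixture(k, random_state=0).fit(features) for k in (1, 2, 3, 4)),
                   key=lambda m: m.bic(features))
        print("chosen number of shape classes:", best.n_components)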

  16. Application of Computer Integration Technology for Fire Safety Analysis

    Institute of Scientific and Technical Information of China (English)

    SHI Jianyong; LI Yinqing; CHEN Huchuan

    2008-01-01

    With the development of information technology, the fire safety assessment of whole structures or regions based on computer simulation has become a hot topic. Traditionally, however, the relevant studies are performed separately for different objectives, which makes an overall evaluation difficult. A new multi-dimensional integration model and methodology for fire safety assessment are presented, and two newly developed integrated systems are introduced to demonstrate the function of integrated simulation technology in this paper. The first is the analysis of the fire-resistant behavior of a whole structure under real fire loads. The second is the study of fire evaluation and emergency rescue on a campus based on geographic information system (GIS) technology. Some practical examples are presented to illustrate the advantages of computer integration technology for fire safety assessment and to emphasize some problems in the simulation. The results show that the multi-dimensional integration model offers a new way and platform for the integrated fire safety assessment of whole structures or regions, and that the integrated software developed is a useful engineering tool for cost-saving and safe design.

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  18. Multiresolution analysis over simple graphs for brain computer interfaces

    Science.gov (United States)

    Asensio-Cubero, J.; Gan, J. Q.; Palaniappan, R.

    2013-08-01

    Objective. Multiresolution analysis (MRA) offers a useful framework for signal analysis in the temporal and spectral domains, although commonly employed MRA methods may not be the best approach for brain computer interface (BCI) applications. This study aims to develop a new MRA system for extracting tempo-spatial-spectral features for BCI applications based on wavelet lifting over graphs. Approach. This paper proposes a new graph-based transform for wavelet lifting and a tailored simple graph representation for electroencephalography (EEG) data, which results in an MRA system where temporal, spectral and spatial characteristics are used to extract motor imagery features from EEG data. The transformed data is processed within a simple experimental framework to test the classification performance of the new method. Main Results. The proposed method can significantly improve the classification results obtained by various wavelet families using the same methodology. Preliminary results using common spatial patterns as feature extraction method show that we can achieve comparable classification accuracy to more sophisticated methodologies. From the analysis of the results we can obtain insights into the pattern development in the EEG data, which provide useful information for feature basis selection and thus for improving classification performance. Significance. Applying wavelet lifting over graphs is a new approach for handling BCI data. The inherent flexibility of the lifting scheme could lead to new approaches based on the hereby proposed method for further classification performance improvement.
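
    The lifting scheme the paper builds on can be illustrated with one split/predict/update step of a Haar-like transform on a plain 1-D signal, as below; the generalization to graphs of EEG electrodes is the paper's contribution and is not reproduced here.

        import numpy as np

        # One level of a Haar-like lifting transform (split / predict / update).
        def lifting_step(signal):
            even, odd = signal[0::2], signal[1::2]
            detail = odd - even           # predict: even neighbor predicts odd sample
            approx = even + detail / 2    # update: preserve the running average
            return approx, detail

        x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 3.0])
        approx, detail = lifting_step(x)
        print("approximation:", approx)   # coarse, half-rate version of the signal
        print("detail:", detail)          # high-frequency residual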

  19. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    Science.gov (United States)

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence; hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect on acetylcholinesterase (AChE) induced by inhibitors, including OPs and carbamates, a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that the yellow intensity weakened gradually as the concentration of dichlorvos increased. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had good predictive ability between training sets and predictive sets. Real cabbage samples containing dichlorvos were detected by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). Experiments on accuracy, precision and repeatability revealed good performance for the detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications to real samples containing OPs and carbamates because of its high selectivity and sensitivity.
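
    A common RGB-to-CMYK formulation that could be used to read out the yellow channel is sketched below; the conversion formula and test pixel are illustrative assumptions, as the paper does not specify its exact implementation.

        import numpy as np

        # RGB -> CMYK conversion (a common formulation), assuming RGB in [0, 1].
        def rgb_to_cmyk(rgb):
            k = 1.0 - rgb.max(axis=-1)
            denom = np.where(k < 1.0, 1.0 - k, 1.0)   # avoid dividing by zero on black
            c, m, y = ((1.0 - rgb[..., i] - k) / denom for i in range(3))
            return c, m, y, k

        pixel_block = np.array([[[0.9, 0.8, 0.1]]])   # yellowish test pixel
        c, m, y, k = rgb_to_cmyk(pixel_block)
        print("yellow density:", float(y.mean()))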

  20. Importance sampling. I. Computing multimodel p values in linkage analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kong, A.; Frigge, M.; Irwin, M.; Cox, N. (Univ. of Chicago, IL (United States))

    1992-12-01

    In linkage analysis, when the lod score is maximized over multiple genetic models, the standard asymptotic approximation of the significance level does not apply. Monte Carlo methods can be used to estimate the p value, but procedures currently used are extremely inefficient. The authors propose a Monte Carlo procedure based on the concept of importance sampling, which can be thousands of times more efficient than current procedures. With a reasonable amount of computing time, extremely accurate estimates of the p values can be obtained. Both theoretical results and an example of maturity-onset diabetes of the young (MODY) are presented to illustrate the efficiency of their method. Relations between single-model and multimodel p values are explored. The new procedure is also used to investigate the performance of asymptotic approximations in a single-model situation. 22 refs., 6 figs., 1 tab.
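
    The core idea can be illustrated on a toy tail-probability problem: sample from a proposal shifted toward the rare region and reweight by the density ratio, as in the hedged sketch below; the Gaussian setting is illustrative and far simpler than the genetic models of the paper.

        import numpy as np

        # Importance sampling for p = P(Z > 4), Z ~ N(0,1): draw from the shifted
        # proposal N(4,1) and reweight by the density ratio phi(z)/phi(z - 4).
        rng = np.random.default_rng(6)
        n, shift = 10_000, 4.0
        z = rng.normal(shift, 1.0, n)                 # proposal draws
        w = np.exp(-shift * z + shift**2 / 2)         # density ratio weights
        p_hat = np.mean((z > 4.0) * w)
        print(p_hat)   # ~3.17e-5; naive sampling would need millions of draws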

  1. Meta-Analysis and Computer-Mediated Communication.

    Science.gov (United States)

    Taylor, Alan M

    2016-04-01

    Because of the use of human participants and differing contextual variables, research in second language acquisition often produces conflicting results, leaving practitioners confused and unsure of the effectiveness of specific treatments. This article provides insight into a recent seminal meta-analysis on the effectiveness of computer-mediated communication, providing further statistical evidence of the importance of its results. The significance of the study is examined by looking at the p values included in the references, to demonstrate how results can easily be misconstrued by practitioners and researchers. Lin's conclusion regarding the research setting of the study reports is also evaluated. In doing so, other possible explanations of what may be influencing the results can be proposed.

  2. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on the Adaptive Neuro-Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier-2 center GoeGrid. Ten-fold cross-validation results demonstrate the high efficiency of both approaches in comparison to known methods.

  3. Computational analysis of azine-N-oxides as energetic materials

    Energy Technology Data Exchange (ETDEWEB)

    Ritchie, J.P.

    1994-05-01

    A BKW equation of state in a 1-dimensional hydrodynamic simulation of the cylinder test can be used to estimate the performance of explosives. Using this approach, the novel explosive 1,4-diamino-2,3,5,6-tetrazine-2,5-dioxide (TZX) was analyzed. Despite a high detonation velocity and a predicted CJ pressure comparable to that of RDX, TZX performs relatively poorly in the cylinder test. Theoretical and computational analysis shows this to be the result of a low heat of detonation. A conceptual strategy is proposed to remedy this problem. In order to predict the required heats of formation, new ab initio group equivalents were developed. Crystal structure calculations are also described that show hydrogen-bonding is important in determining the density of TZX and related compounds.

  4. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Existing solutions to the problem of finding valuable information on the Web suffer from several limitations, such as simplified query languages, out-of-date information and arbitrary sorting of results. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web page content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text-processing algorithms in JavaScript. In this paper the architecture of the Web page content analysis system is presented, details concerning the implementation of the system and the text-processing algorithms are described, and test results are provided.

  5. Data analysis using the Gnu R system for statistical computation

    Energy Technology Data Exchange (ETDEWEB)

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  6. Computational Fluid Dynamics Analysis of an Evaporative Cooling System

    Directory of Open Access Journals (Sweden)

    Kapilan N.

    2016-11-01

    The use of chlorofluorocarbon-based refrigerants in air-conditioning systems increases global warming and causes climate change. Climate change is expected to present a number of challenges for the built environment, and an evaporative cooling system is one of the simplest and most environmentally friendly cooling systems. The evaporative cooling system is most widely used in summer and in the rural and urban areas of India for human comfort. In an evaporative cooling system, the addition of water into air reduces the temperature of the air, as the energy needed to evaporate the water is taken from the air. Computational fluid dynamics (CFD), a numerical analysis technique, was used to analyse the evaporative cooling system. The CFD results match the experimental results.
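
    The basic sizing relation behind a direct evaporative cooler, outlet temperature = dry-bulb minus effectiveness times the wet-bulb depression, can be sketched as below; the effectiveness and the inlet conditions are illustrative assumptions, not results of the CFD analysis.

        # Direct evaporative cooling estimate: T_out = T_dry - eff * (T_dry - T_wet).
        def outlet_temp(t_dry, t_wet, effectiveness=0.85):
            return t_dry - effectiveness * (t_dry - t_wet)

        for t_dry, t_wet in [(40.0, 24.0), (35.0, 22.0)]:   # hot, fairly dry conditions (C)
            print(f"{t_dry:.0f} C dry / {t_wet:.0f} C wet-bulb -> "
                  f"{outlet_temp(t_dry, t_wet):.1f} C out")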

  7. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of society, as we are faced with streams of data coming from numerous sensors, data feeds, and recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge of developing models that are both accurate and user-friendly (interpretable). The volume aims to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume fully reflect the wealth of CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  8. Triclosan Computational Conformational Chemistry Analysis for Antimicrobial Properties in Polymers.

    Science.gov (United States)

    Petersen, Richard C

    2015-03-01

    Triclosan is a diphenyl ether antimicrobial that has been analyzed by computational conformational chemistry for an understanding of Mechanomolecular Theory. Subsequent energy-profile analysis, combined with easily visualized three-dimensional structure models of the nonpolar molecule Triclosan, shows how single-bond rotations can alternate rapidly at a polar and nonpolar interface. Bond rotations about the central ether oxygen atom of the two aromatic rings then expose or hide the nonbonding lone-pair electrons of the oxygen atom, depending on the polar nature of the immediate local molecular environment. Rapid bond movements can subsequently produce fluctuations as vibration energy. Consequently, related mechanical molecular movements, calculated as energy relationships of forces acting through different bond positions, can help improve current Mechanomolecular Theory. A previous controversy, reported as a discrepancy in the literature, contends that bacteria may develop resistance to Triclosan. However, clinical findings carefully documented in government reports have not identified a single case of bacterial resistance to Triclosan in over 40 years. As a result, Triclosan is recommended whenever there is a health benefit, consistent with a number of approvals for the use of Triclosan in healthcare devices. Since Triclosan is the most researched antimicrobial ever, meta-analysis of the literature with computational chemistry can best describe new molecular conditions that were previously impossible to examine by conventional chemistry methods. Triclosan vibrational energy can now explain the molecular disruption of bacterial membranes. Further, Triclosan mechanomolecular movements help illustrate its use in polymer matrix composites as an antimicrobial with two new additive properties: a toughening agent to improve matrix fracture toughness against microcracking, and a hydrophobic wetting agent to help incorporate strengthening fibers. Interrelated

  9. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Science.gov (United States)

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reductions in the size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders. CASA cannot accurately predict the 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system, whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible period of time; (3) software should enable grouping of individual sperm based on one or more attributes so outputs reflect subpopulations or clusters of similar sperm with unique

  10. Summary of research in applied mathematics, numerical analysis, and computer sciences

    Science.gov (United States)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  11. Linking CAI abundance to polarimetric response in a population of ancient asteroids

    Science.gov (United States)

    Devogele, Maxime; Tanga, Paolo; Bendjoya, Philippe; Rivet, Jean-Pierre; Surdej, Jean; Bus, Schelte J.; Sunshine, Jessica M.; Cellino, Alberto; Campins, Humberto; Licandro, Javier; Pinilla-Alonso, Noemi; Carry, Benoit

    2016-10-01

    Polarimetry constitutes one of the fundamental tools for characterizing the surface texture and composition of airless Solar System bodies. In 2006, polarimetric observations led to the discovery of a new type of asteroid, which displays a peculiar polarimetric response. These asteroids are collectively known as "Barbarians", after (234) Barbara, the first one discovered. The most commonly accepted explanation for this peculiar polarization response seems to be the presence of a high percentage of fluffy-type Calcium-Aluminium-rich Inclusions (CAIs), whose optical properties could produce the observed polarization. Their reflectance spectra also exhibit an absorption feature in the near-infrared around 2.1-2.2 microns that is characteristic of this peculiar group. Based on these results, we organized a systematic polarimetric and near-infrared observational campaign of known Barbarians or candidate asteroids. These campaigns include members of the families of 1040 Klumpkea, 2085 Henan and 729 Watsonia, which are known to contain Barbarian and/or L-type asteroids also suspected of such polarimetric behaviour. We made use of the ToPo polarimeter at the 1m telescope of the Centre pédagogique Planète et Univers (C2PU, Observatoire de la Côte d'Azur, France). The spectroscopic observations in the near-infrared were obtained with the SpeX instrument at NASA's InfraRed Telescope Facility (IRTF). By combining polarimetry and spectroscopy we find a correlation between the abundance of CAIs and the inversion angle of the phase-polarization curve of Barbarian asteroids. This is the first time that a direct link has been established between a specific polarimetric response and the surface composition of asteroids. In addition, we find considerable variety in CAI abundance from one object to another, consistent with a wide range of possible albedos. Since these asteroids constitute a reservoir of primitive Solar System material, understanding their origin can

  12. The Impact of Different Forms of Multimedia CAI on Students' Science Achievement.

    Science.gov (United States)

    Chang, Chun-Yen

    2002-01-01

    Describes a study that explored the effects of teacher-centered versus student-centered multimedia computer-assisted instruction on the science achievement of tenth-grade students in Taiwan. Results of analysis of covariance on pretest-posttest scores showed the teacher-centered approach was more effective in promoting students' science…

  13. Quantitative Computed Tomography and image analysis for advanced muscle assessment

    Directory of Open Access Journals (Sweden)

    Kyle Joseph Edmunds

    2016-06-01

    Full Text Available Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the use of a wide variety of techniques to non-invasively capture and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies capture changes in tissue composition within muscles, as visualized by the association of tissue types with specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration.
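
    The tissue-composition step described above reduces to counting voxels within HU windows inside the segmented muscle. A minimal sketch follows; the HU thresholds are illustrative assumptions, not the study-specific values.

        import numpy as np

        # Illustrative HU windows only; the thresholds separating fat, loose
        # connective/atrophic tissue, and normal muscle are study-specific.
        HU_WINDOWS = {
            "fat": (-200, -10),
            "loose_connective_or_atrophic": (-9, 40),
            "normal_muscle": (41, 100),
        }

        def muscle_composition(hu_volume, muscle_mask):
            """Percent composition of a segmented muscle, per HU window."""
            values = hu_volume[muscle_mask]
            return {name: 100.0 * np.count_nonzero((values >= lo) & (values <= hi)) / values.size
                    for name, (lo, hi) in HU_WINDOWS.items()}

        # Toy 3D array standing in for a spiral-CT volume and a whole-volume mask.
        vol = np.random.default_rng(1).integers(-250, 150, size=(40, 64, 64))
        mask = np.ones_like(vol, dtype=bool)
        print(muscle_composition(vol, mask))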

  14. Consequence analysis in LPG installation using an integrated computer package.

    Science.gov (United States)

    Ditali, S; Colombi, M; Moreschini, G; Senni, S

    2000-01-07

    This paper presents a prototype computer code, Atlantide, developed to assess the consequences of accidental events that can occur in an LPG storage plant. Atlantide is designed to be simple enough, yet adequate, to cope with consequence analysis as required by Italian legislation in fulfilling the Seveso Directive. The application of Atlantide is appropriate for LPG storage/transfer installations. The models and correlations implemented in the code cover flashing liquid releases, heavy gas dispersion and other typical phenomena such as BLEVE/Fireball. On the basis of the operating/design characteristics, the code allows the study of the relevant accidental events from the evaluation of the release rate (liquid, gaseous and two-phase) in the unit involved, through the analysis of the subsequent evaporation and dispersion, up to the assessment of the final phenomena of fire and explosion. This is done with reference to simplified Event Trees which describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations and other features typical of an LPG installation. The limited input data required and the automatic linking between the single models, which are activated in a defined sequence depending on the accidental event selected, minimize both the time required for the risk analysis and the possibility of errors. The models and equations implemented in Atlantide have been selected from the public literature or in-house developed software and tailored to be easy to use and fast to run, while nevertheless providing realistic simulation of the accidental event as well as reliable results, in terms of physical effects and hazardous areas. The results have been compared with those of other internationally recognized codes and with the criteria adopted by Italian authorities to verify the Safety Reports for LPG
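
    As an illustration of the kind of source-term evaluation such a code chains into dispersion and fire/explosion models, a textbook orifice-discharge relation for a liquid release is sketched below. The discharge coefficient, hole size and conditions are hypothetical, and this is not Atlantide's actual model.

        from math import pi, sqrt

        def liquid_release_rate(cd, hole_diameter_m, rho_kg_m3, dp_pa):
            """Mass flow (kg/s) of liquid through a sharp-edged orifice:
            m_dot = Cd * A * sqrt(2 * rho * dP)."""
            area = pi * (hole_diameter_m / 2.0) ** 2
            return cd * area * sqrt(2.0 * rho_kg_m3 * dp_pa)

        # Hypothetical 25 mm hole, LPG density ~500 kg/m3, 5 bar overpressure.
        print(f"{liquid_release_rate(0.61, 0.025, 500.0, 5e5):.1f} kg/s")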

  15. Changes in flavour and microbial diversity during natural fermentation of suan-cai, a traditional food made in Northeast China.

    Science.gov (United States)

    Wu, Rina; Yu, Meiling; Liu, Xiaoyu; Meng, Lingshuai; Wang, Qianqian; Xue, Yating; Wu, Junrui; Yue, Xiqing

    2015-10-15

    We measured changes in the main physical and chemical properties, flavour compounds and microbial diversity in suan-cai during natural fermentation. The results showed that the pH and concentration of soluble protein initially decreased but were then maintained at a stable level; the concentration of nitrite increased in the initial fermentation stage and, after reaching a peak, decreased significantly to a low level by the end of fermentation. Suan-cai was rich in 17 free amino acids, all of which increased in concentration to different degrees except histidine. Total free amino acids reached their highest levels in the mid-fermentation stage. The 17 volatile flavour components identified at the start of fermentation increased to 57 by the mid-fermentation stage; esters and aldehydes showed the greatest diversity and abundance, contributing most to the aroma of suan-cai. Bacteria were more abundant and diverse than fungi in suan-cai; 14 bacterial species were identified from the genera Leuconostoc, Bacillus, Pseudomonas and Lactobacillus. The predominant fungal species identified were Debaryomyces hansenii, Candida tropicalis and Penicillium expansum.

  16. From Corporate Social Responsibility, through Entrepreneurial Orientation, to Knowledge Sharing: A Study in Cai Luong (Renovated Theatre) Theatre Companies

    Science.gov (United States)

    Tuan, Luu Trong

    2015-01-01

    Purpose: This paper aims to examine the role of antecedents such as corporate social responsibility (CSR) and entrepreneurial orientation in the chain effect to knowledge sharing among members of Cai Luong theatre companies in the Vietnamese context. Knowledge sharing contributes to the depth of the knowledge pool of both the individuals and the…

  17. Analysis on Cloud Computing Information Security Problems and the Countermeasures

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Cloud computing is one of the most popular terms in the present IT industry, as well as one of its most promising technologies. This paper introduces the concept, principles and characteristics of cloud computing, analyzes the information security problems resulting from cloud computing, and puts forward corresponding solutions.

  18. Two years since SSAMS: Status of ¹⁴C AMS at CAIS

    Energy Technology Data Exchange (ETDEWEB)

    Ravi Prasad, G.V.; Cherkinsky, Alexander; Culp, Randy A.; Dvoracek, Doug K.

    2015-10-15

    The NEC 250 kV single stage AMS accelerator (SSAMS) was installed two years ago at the Center for Applied Isotope Studies (CAIS), University of Georgia. The accelerator is primarily being used for radiocarbon measurements to test the authenticity of natural and bio-based samples, while all other samples, such as geological, atmospheric, marine and archaeological ones, are run on the 500 kV NEC 1.5SDH-1 model tandem accelerator, which has been operating since 2001. The data obtained over a six-month period for OXI, OXII, ANU sucrose and FIRI-D are discussed. The mean value of ANU sucrose was observed to be slightly lower than the consensus value. The processed blanks on SSAMS produce a lower apparent age compared to the tandem accelerator, as expected.
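
    The link between a processed blank's residual activity and its apparent age follows the conventional radiocarbon age equation, t = -8033 ln(Fm), with 8033 yr the Libby mean life. A small sketch with hypothetical blank values:

        from math import log

        def conventional_age(fraction_modern):
            """Conventional radiocarbon age (yr BP), t = -8033 * ln(Fm),
            with 8033 yr the Libby mean life."""
            return -8033.0 * log(fraction_modern)

        # Hypothetical blank values: a higher residual Fm (more background)
        # translates into a lower (younger) apparent age.
        for fm in (0.0020, 0.0045):
            print(f"Fm = {fm:.4f} -> apparent age ~ {conventional_age(fm):,.0f} yr BP")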

  19. Computational electromagnetic analysis of plasmonic effects in interdigital photodetectors

    Science.gov (United States)

    Hill, Avery M.; Nusir, Ahmad I.; Nguyen, Paul V.; Manasreh, Omar M.; Herzog, Joseph B.

    2014-09-01

    Plasmonic nanostructures have been shown to act as optical antennas that enhance optical devices. This study focuses on computational electromagnetic (CEM) analysis of GaAs photodetectors with gold interdigital electrodes. Experiments have shown that the photoresponse of the devices depends greatly on the electrode spacing and the polarization of the incident light: smaller electrode spacing and transverse polarization give rise to a larger photoresponse. This computational study simulates the optical properties of these devices to determine what plasmonic properties and optical enhancement they may have. The models solve Maxwell's equations with a finite element method (FEM) algorithm provided by the software COMSOL Multiphysics 4.4. The preliminary results gathered from the simulations follow the same trends seen in the experimental data: the spectral response increases as the electrode spacing decreases. The simulations also show that incident light with the electric field polarized transversely across the electrodes produces a larger photocurrent than longitudinal polarization, a dependency similar to other plasmonic devices. The simulation results compare well with the experimental data. This work will also model enhancement effects in nanostructure devices with dimensions smaller than the current samples, to lead the way for future nanoscale devices. Understanding the potential effects of decreased spacing opens the door to a new set of devices on a smaller scale, potentially with a higher level of enhancement. In addition, precise modeling and understanding of the effects of the parameters provides avenues to optimize the enhancement of these structures, making more efficient photodetectors. Similar structures could also potentially be used for enhanced photovoltaics.

  20. Genome Assembly and Computational Analysis Pipelines for Bacterial Pathogens

    KAUST Repository

    Rangkuti, Farania Gama Ardhina

    2011-06-01

    Pathogens lie behind the deadliest pandemics in history. To date, the AIDS pandemic has resulted in more than 25 million fatal cases, while tuberculosis and malaria annually claim more than 2 million lives. Comparative genomic analyses are needed to gain insights into the molecular mechanisms of pathogens, but the abundance of biological data dictates that such studies cannot be performed without the assistance of computational approaches. This explains the significant need for computational pipelines for genome assembly and analyses. The aim of this research is to develop such pipelines. This work utilizes various bioinformatics approaches to analyze the high-throughput genomic sequence data that has been obtained from several strains of bacterial pathogens. A pipeline has been compiled for quality control for sequencing and assembly, and several protocols have been developed to detect contaminations. Visualizations have been generated of genomic data in various formats, in addition to alignment, homology detection and sequence variant detection. We have also implemented a metaheuristic algorithm that significantly improves bacterial genome assemblies compared to other known methods. Experiments on Mycobacterium tuberculosis H37Rv data showed that our method resulted in an improvement of the N50 value of up to 9697% while consistently maintaining high accuracy, covering around 98% of the published reference genome. Other improvement efforts were also implemented, consisting of iterative local assemblies and iterative correction of contiguated bases. Our result expedites the genomic analysis of virulence genes down to single base pair resolution. It is also applicable to virtually every pathogenic microorganism, propelling further research in the control of and protection from pathogen-associated diseases.
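
    N50, the statistic quoted above, is the contig length at which the cumulative sum of descending-sorted contig lengths first reaches half of the assembly total. A minimal sketch with hypothetical contig sets:

        def n50(contig_lengths):
            """N50: the largest contig length L such that contigs >= L
            contain at least half of all assembled bases."""
            lengths = sorted(contig_lengths, reverse=True)
            half, running = sum(lengths) / 2.0, 0
            for length in lengths:
                running += length
                if running >= half:
                    return length

        # Hypothetical contig sets before and after an assembly-improvement step.
        draft = [40_000, 30_000, 20_000, 10_000] * 25
        improved = [900_000, 600_000, 400_000, 300_000, 250_000]
        print(n50(draft), "->", n50(improved))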

  1. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer, the ubiquitous portal of work and personal lives. At this point, the computer is so common we hardly notice it in our view. It is difficult to envision that, not that long ago, it was a gigantic, room-sized structure accessible to only a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  2. Process model of new idea generation for product conceptual design driven by CAI

    Institute of Scientific and Technical Information of China (English)

    张建辉; 檀润华; 张鹏; 曹国忠

    2013-01-01

    Generation of creative ideas is critical in the product conceptual design process. The obstacle to idea generation in this process is that designers do not make full use of knowledge from different fields. Computer-aided Innovation Systems (CAIs) based on the Theory of Inventive Problem Solving (TRIZ) provide a platform for applying knowledge from different fields. Principles of creative idea generation driven by Computer-aided Innovation (CAI) are put forward: the inventive problem is solved in the design scenario of a CAI system, and an extended solution space is built from the Unexpected Discoveries (UXD) implied in the source design in order to drive the generation of creative ideas. An integrated process model of creative idea generation for product conceptual design driven by CAI is then developed. Idea generation for a safety isolation butterfly valve was carried out using the process model and demonstrated its feasibility.

  3. Methodology for Benefit Analysis of CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing) in USN Shipyards.

    Science.gov (United States)

    1984-03-01

    benefits of CAD/CAM and of the next generation technology, CIDER. The CADOS study [Ref. 13] offers a method to measure the intangibles of CAD/CAM ... a methodology that measures both tangible and intangible benefits of present CAD technology. This method would be hard to extend to CIDER technology because of ...

  4. Design Principles for Computer-Assisted Instruction in Histology Education: An Exploratory Study

    Science.gov (United States)

    Deniz, Hasan; Cakir, Hasan

    2006-01-01

    The purpose of this paper is to describe the development process and the key components of a computer-assisted histology learning material. The material is designed to supplement traditional histology education in a large Midwestern university. Usability information on the computer-assisted instruction (CAI) material was obtained…

  5. Dietary Changes over Time in a Caiçara Community from the Brazilian Atlantic Forest

    Directory of Open Access Journals (Sweden)

    Priscila L. MacCord

    2006-12-01

    Full Text Available Because they are occurring at an accelerated pace, changes in the livelihoods of local coastal communities, including nutritional aspects, have been a subject of interest in human ecology. The aim of this study is to explore the dietary changes, particularly in the consumption of animal protein, that have taken place in Puruba Beach, a rural community of caiçaras on the São Paulo Coast, Brazil, over the 10-yr period from 1992-1993 to 2002-2003. Data were collected during six months in 1992-1993 and during the same months in 2002-2003 using the 24-hr recall method. We found an increasing dependence on external products in the most recent period, along with a reduction in fish consumption and in the number of fish species eaten. These changes, possibly associated with other nonmeasured factors such as overfishing and unplanned tourism, may cause food delocalization and a reduction in the use of natural resources. Although the consequences for conservation efforts in the Atlantic Forest and the survival of the caiçaras must still be evaluated, these local inhabitants may be finding a way to reconcile both the old and the new dietary patterns by keeping their houses in the community while looking for sources of income other than natural resources. The prospect shown here may reveal facets that can influence the maintenance of this and other communities undergoing similar processes by, for example, shedding some light on the ecological and economical processes that may occur within their environment and in turn affect the conservation of the resources upon which the local inhabitants depend.

  6. Trident: scalable compute archives: workflows, visualization, and analysis

    Science.gov (United States)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Kotulla, Ralf; Henschel, Robert; Harbeck, Daniel

    2016-08-01

    The Astronomy scientific community has embraced Big Data processing challenges, e.g. those associated with time-domain astronomy, and come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era requires new and equally efficient tools: modern user interfaces for searching, identifying and viewing data online without direct access to the data; tracking of data provenance; searching, plotting and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal user overhead or expertise, even for novice computing users. The Trident project at Indiana University offers a comprehensive web and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic about handling/visualizing their domain's data products and about executing their pipelines and application workflows. Trident's microservices architecture is made up of light-weight services connected by a REST API and/or a message bus; web interface elements are built using the NodeJS, AngularJS, and HighCharts JavaScript libraries, among others, while backend services are written in NodeJS, PHP/Zend, and Python. The software suite currently consists of (1) a simple workflow execution framework to integrate, deploy, and execute pipelines and applications, (2) a progress service to monitor workflows and sub

  7. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well established classical... linear methods. Different strategies for selecting projections (linear combinations) of multivariate images are presented. An exploratory, iterative method for finding interesting projections, originating in data analysis, is compared to principal components. A method for introducing spatial context...
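
    Principal components, the classical baseline mentioned above, can be computed for a multivariate image by flattening it to a pixels-by-bands matrix and taking an SVD. A minimal numpy sketch on a toy cube; shapes and data are illustrative assumptions, not the thesis material.

        import numpy as np

        def image_principal_components(cube):
            """Project a (rows, cols, bands) image onto its principal components."""
            rows, cols, bands = cube.shape
            x = cube.reshape(-1, bands).astype(float)
            x -= x.mean(axis=0)
            # Right singular vectors of the centered pixel matrix are the
            # band-space projection directions, ordered by explained variance.
            _, _, vt = np.linalg.svd(x, full_matrices=False)
            return (x @ vt.T).reshape(rows, cols, bands)

        cube = np.random.default_rng(2).random((32, 32, 6))  # toy 6-band image
        pcs = image_principal_components(cube)
        print(pcs.shape, pcs[..., 0].var() >= pcs[..., 1].var())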

  8. MMA, A Computer Code for Multi-Model Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.
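
    Three of the four default criteria (AIC, AICc, BIC) and the usual conversion of criterion differences into model weights can be sketched compactly; KIC is omitted here, and the log-likelihoods below are hypothetical, not MMA's internal computation.

        import numpy as np

        def aic(log_lik, k):
            return -2.0 * log_lik + 2.0 * k

        def aicc(log_lik, k, n):  # second-order (small-sample) correction
            return aic(log_lik, k) + 2.0 * k * (k + 1) / (n - k - 1)

        def bic(log_lik, k, n):
            return -2.0 * log_lik + k * np.log(n)

        def model_weights(criteria):
            """w_i = exp(-delta_i/2) / sum_j exp(-delta_j/2), with delta the
            criterion difference from the best (lowest-scoring) model."""
            d = np.asarray(criteria, dtype=float)
            d = d - d.min()
            w = np.exp(-d / 2.0)
            return w / w.sum()

        # Three hypothetical calibrated models of one system, same 50 observations.
        n = 50
        fits = [(-120.3, 4), (-118.9, 6), (-118.5, 9)]  # (log-likelihood, parameters)
        print(np.round(model_weights([aicc(ll, k, n) for ll, k in fits]), 3))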

  9. Computer-aided pulmonary image analysis in small animal models

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J. [Center for Infectious Disease Imaging (CIDI), Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Bagci, Ulas, E-mail: ulasbagci@gmail.com [Center for Research in Computer Vision (CRCV), University of Central Florida (UCF), Orlando, Florida 32816 (United States); Kramer-Marek, Gabriela [The Institute of Cancer Research, London SW7 3RP (United Kingdom); Luna, Brian [Microfluidic Laboratory Automation, University of California-Irvine, Irvine, California 92697-2715 (United States); Kubler, Andre [Department of Medicine, Imperial College London, London SW7 2AZ (United Kingdom); Dey, Bappaditya; Jain, Sanjay [Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Foster, Brent [Department of Biomedical Engineering, University of California-Davis, Davis, California 95817 (United States); Papadakis, Georgios Z. [Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Camp, Jeremy V. [Department of Microbiology and Immunology, University of Louisville, Louisville, Kentucky 40202 (United States); Jonsson, Colleen B. [National Institute for Mathematical and Biological Synthesis, University of Tennessee, Knoxville, Tennessee 37996 (United States); Bishai, William R. [Howard Hughes Medical Institute, Chevy Chase, Maryland 20815 and Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Udupa, Jayaram K. [Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

    2015-07-15

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.
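
    The Dice similarity coefficient used for validation above is straightforward to compute from two binary masks. A minimal sketch with toy masks standing in for the automatic and reference segmentations:

        import numpy as np

        def dice(seg_a, seg_b):
            """Dice similarity coefficient: DSC = 2|A & B| / (|A| + |B|)."""
            a, b = np.asarray(seg_a, bool), np.asarray(seg_b, bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        # Toy masks standing in for automatic and reference lung segmentations.
        auto = np.zeros((64, 64), bool)
        auto[10:50, 12:52] = True
        ref = np.zeros((64, 64), bool)
        ref[12:52, 10:50] = True
        print(f"DSC = {dice(auto, ref):.3f}")  # ~0.9, i.e. high overlap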

  10. Análisis cualitativo asistido por computadora (Computer-assisted qualitative analysis)

    Directory of Open Access Journals (Sweden)

    César A. Cisneros Puebla

    2003-01-01

    Full Text Available The aims of this article are: on the one hand, to present an approximation to the Hispano-American experience on Computer-Assisted Qualitative Data Analysis (CAQDAS), grouping as a systematization exercise the works carried out by several colleagues from related disciplines. Although attempting to be exhaustive and thorough, as in any attempt at systematizing experiences, this exercise presents clear lacks and omissions. On the other hand, to introduce some theoretical reflections about the role played by CAQDAS in the development of qualitative investigation after that systematization, with a specific focus on data generation.

  11. Design of airborne wind turbine and computational fluid dynamics analysis

    Science.gov (United States)

    Anbreen, Faiqa

    Wind energy is a promising alternative to depleting non-renewable sources. The height of conventional wind turbines becomes a constraint on their efficiency. An airborne wind turbine can reach much higher altitudes and produce higher power due to the high wind velocity and energy density there. The focus of this thesis is to design a shrouded airborne wind turbine capable of generating 70 kW to propel a leisure boat with a capacity of 8-10 passengers. The idea of designing an airborne turbine is to take advantage of the higher velocities in the atmosphere. The SolidWorks model has been analyzed numerically using the Computational Fluid Dynamics (CFD) software StarCCM+. The Unsteady Reynolds-Averaged Navier-Stokes (URANS) approach with the k-epsilon turbulence model has been selected to study the physical properties of the flow, with emphasis on the performance of the turbine and the increase in air velocity at the throat. The analysis has been done using two ambient velocities, 12 m/s and 6 m/s. At a 12 m/s inlet velocity, the velocity of air at the turbine was recorded as 16 m/s and the power generated by the turbine is 61 kW. At an inlet velocity of 6 m/s, the velocity of air at the turbine increased to 10 m/s and the power generated by the turbine is 25 kW.
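
    The design rationale rests on the cubic dependence of wind power on velocity, P = 0.5·ρ·A·v³·Cp. A minimal sketch; the rotor diameter and power coefficient below are illustrative assumptions, not the thesis design values.

        from math import pi

        def turbine_power_w(v_m_s, rotor_diameter_m, cp=0.4, rho=1.225):
            """P = 0.5 * rho * A * v^3 * Cp (W)."""
            area = pi * (rotor_diameter_m / 2.0) ** 2
            return 0.5 * rho * area * v_m_s ** 3 * cp

        # The cubic law means the shroud's speed-up from 12 to 16 m/s more than
        # doubles the available power ((16/12)^3 ~ 2.37).
        for v in (12.0, 16.0):
            print(f"{v:4.1f} m/s -> {turbine_power_w(v, 8.0) / 1000.0:5.1f} kW")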

  12. Recent Developments in Complex Analysis and Computer Algebra

    CERN Document Server

    Kajiwara, Joji; Xu, Yongzhi

    1999-01-01

    This volume consists of papers presented in the special sessions on "Complex and Numerical Analysis", "Value Distribution Theory and Complex Domains", and "Use of Symbolic Computation in Mathematics Education" of the ISAAC'97 Congress held at the University of Delaware, during June 2-7, 1997. The ISAAC Congress coincided with a U.S.-Japan Seminar also held at the University of Delaware. The latter was supported by the National Science Foundation through Grant INT-9603029 and the Japan Society for the Promotion of Science through Grant MTCS-134. It was natural that the participants of both meetings should interact and consequently several persons attending the Congress also presented papers in the Seminar. The success of the ISAAC Congress and the U.S.-Japan Seminar has led to the ISAAC'99 Congress being held in Fukuoka, Japan during August 1999. Many of the same participants will return to this Seminar. Indeed, it appears that the spirit of the U.S.-Japan Seminar will be continued every second year as part of...

  13. Reliability and safety analysis of redundant vehicle management computer system

    Institute of Scientific and Technical Information of China (English)

    Shi Jian; Meng Yixuan; Wang Shaoping; Bian Mengmeng; Yan Dungong

    2013-01-01

    Redundant techniques are widely adopted in vehicle management computers (VMC) to ensure that the VMC has high reliability and safety. At the same time, they give the VMC special characteristics, e.g., failure correlation, event simultaneity, and failure self-recovery. Accordingly, reliability and safety analysis of a redundant VMC system (RVMCS) becomes more difficult. To address the difficulties in RVMCS reliability modeling, this paper adopts generalized stochastic Petri nets to establish the reliability and safety models of an RVMCS, and then analyzes RVMCS operating states and potential threats to the flight control system. It is verified by simulation that the reliability of a VMC is not the product of hardware reliability and software reliability, and that interactions between hardware and software faults can obviously reduce the real reliability of the VMC. Furthermore, failure-undetected states and false-alarming states inevitably exist in an RVMCS due to the influences of limited fault monitoring coverage and the false-alarming probability of fault monitoring devices (FMD). An RVMCS operating in some failure-undetected states produces fatal threats to the safety of the flight control system, and an RVMCS operating in some false-alarming states obviously reduces the utility of the RVMCS. The results obtained in this paper can guide reliable VMC and efficient FMD designs. The methods adopted in this paper can also be used to analyze the reliability of other intelligent systems.
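
    A textbook closed form illustrates why limited fault-monitoring coverage pushes real reliability below the ideal redundant value; this simple duplex model stands in for, and is not, the paper's generalized stochastic Petri net model.

        def duplex_reliability(r, coverage):
            """Reliability of a two-unit redundant computer whose monitor detects
            a unit failure only with probability `coverage`:
            R_sys = r^2 + 2 * c * r * (1 - r)."""
            return r * r + 2.0 * coverage * r * (1.0 - r)

        r = 0.95  # hypothetical single-unit reliability over the mission time
        for c in (1.0, 0.9, 0.7):
            print(f"coverage {c:.1f}: system reliability {duplex_reliability(r, c):.4f}")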

  14. Computational analysis on plug-in hybrid electric motorcycle chassis

    Science.gov (United States)

    Teoh, S. J.; Bakar, R. A.; Gan, L. M.

    2013-12-01

    A plug-in hybrid electric motorcycle (PHEM) is an alternative that promotes sustainability through lower emissions. However, the overall PHEM system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis applying the concept of a Chopper is analysed for use in a PHEM. The three-dimensional (3D) chassis model is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. Besides that, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement and factor of safety (FOS). The data are used to identify the critical point, indicating whether the chassis design is adequate or needs to be redesigned/modified to meet the required strength. The critical point is the point of highest stress, which might cause the chassis to fail; for a motorcycle chassis it occurs at the joints at the triple tree and the rear-absorber bracket. In conclusion, the computational analysis predicts the stress distribution and provides a guideline for developing a safe prototype chassis.
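
    The FOS screening described above is simply yield strength divided by the working stress at each element, with the minimum flagging the critical point. A minimal sketch with hypothetical stresses and material strength:

        import numpy as np

        def factor_of_safety(von_mises_pa, yield_strength_pa):
            """Element-wise FOS = yield strength / working stress; the minimum
            FOS marks the critical point of the structure."""
            fos = yield_strength_pa / np.asarray(von_mises_pa, dtype=float)
            return fos, int(np.argmin(fos))

        # Hypothetical element stresses (Pa) from a chassis simulation.
        stresses = np.array([120e6, 310e6, 85e6, 240e6])
        fos, critical = factor_of_safety(stresses, yield_strength_pa=350e6)
        print(np.round(fos, 2), "critical element index:", critical)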

  15. VARIABLE AND EXTREME IRRADIATION CONDITIONS IN THE EARLY SOLAR SYSTEM INFERRED FROM THE INITIAL ABUNDANCE OF ¹⁰Be IN ISHEYEVO CAIs

    Energy Technology Data Exchange (ETDEWEB)

    Gounelle, Matthieu [Laboratoire de Mineralogie et de Cosmochimie du Museum, CNRS and Museum National d' Histoire Naturelle, UMR 7202, CP52, 57 rue Cuvier, F-75005 Paris (France); Chaussidon, Marc; Rollion-Bard, Claire, E-mail: gounelle@mnhn.fr [Centre de Recherches Petrographiques et Geochimiques, CRPG-CNRS, BP 20, F-54501 Vandoeuvre-les-Nancy Cedex (France)

    2013-02-01

    A search for short-lived ¹⁰Be in 21 calcium-aluminum-rich inclusions (CAIs) from Isheyevo, a rare CB/CH chondrite, showed that only 5 CAIs had ¹⁰B/¹¹B ratios higher than chondritic correlating with the elemental ratio ⁹Be/¹¹B, suggestive of in situ decay of this key short-lived radionuclide. The initial (¹⁰Be/⁹Be)₀ ratios vary between ~10⁻³ and ~10⁻² for CAI 411. The initial ratio of CAI 411 is one order of magnitude higher than the highest ratio found in CV3 CAIs, suggesting that the more likely origin of CAI 411 ¹⁰Be is early solar system irradiation. The low (²⁶Al/²⁷Al)₀ [≤ 8.9 × 10⁻⁷] with which CAI 411 formed indicates that it was exposed to gradual flares with a proton fluence of a few 10¹⁹ protons cm⁻², during the earliest phases of the solar system, possibly the infrared class 0. The irradiation conditions for other CAIs are less well constrained, with calculated fluences ranging between a few 10¹⁹ and 10²⁰ protons cm⁻². The variable and extreme value of the initial ¹⁰Be/⁹Be ratios in carbonaceous chondrite CAIs is the reflection of the variable and extreme magnetic activity in young stars observed in the X-ray domain.

  16. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read

  17. Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences

    Directory of Open Access Journals (Sweden)

    Gregor Wiedemann

    2013-05-01

    Full Text Available Two developments in computational text analysis may change the way qualitative data analysis in the social sciences is performed: 1. the amount of digital text worth investigating is growing rapidly, and 2. the improvement of algorithmic information extraction approaches, also called text mining, allows for further bridging of the gap between qualitative and quantitative text analysis. The key factor here is the inclusion of context in computational linguistic models, which extends conventional computational content analysis towards the extraction of meaning. To clarify the methodological differences between various computer-assisted text analysis approaches, the article suggests a typology from the perspective of a qualitative researcher. This typology shows compatibilities between manual qualitative data analysis methods and computational, rather quantitative, approaches for large-scale mixed-method text analysis designs. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1302231
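
    A keyword-in-context (KWIC) concordance is one of the simplest places where "context" enters computer-assisted text analysis. A minimal sketch in pure Python; the sample text is illustrative only.

        import re

        def kwic(text, keyword, width=3):
            """Keyword-in-context concordance: every occurrence of `keyword`
            with `width` words of context on either side."""
            words = re.findall(r"\w+", text.lower())
            hits = [i for i, w in enumerate(words) if w == keyword]
            return [" ".join(words[max(0, i - width):i + width + 1]) for i in hits]

        sample = ("Qualitative data analysis reads text closely; quantitative "
                  "content analysis counts terms; mixed designs combine both.")
        print(kwic(sample, "analysis"))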

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  19. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache - CEA, France)

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  20. Integrating aerodynamic surface modeling for computational fluid dynamics with computer aided structural analysis, design, and manufacturing

    Science.gov (United States)

    Thorp, Scott A.

    1992-01-01

    This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.

  1. Computational modeling and impact analysis of textile composite structures

    Science.gov (United States)

    Hur, Hae-Kyu

    This study is devoted to the development of an integrated numerical model enabling one to investigate the static and dynamic behaviors and failures of 2-D textile composite and 3-D orthogonal woven composite structures weakened by cracks and subjected to static-, impact- and ballistic-type loads. As more complicated modeling of textile composite structures is introduced, homogenization schemes, geometrical modeling and crack propagation become more difficult problems to solve. To overcome these problems, this study presents effective mesh-generation schemes, homogenization modeling based on a repeating unit cell and sinusoidal functions, and a cohesive element to study micro-crack shapes. The proposed research has two parts: (1) studying the behavior of textile composites under static loads, and (2) studying the dynamic responses of these textile composite structures subjected to transient/ballistic loading. In the first part, efficient homogenization schemes are suggested to show the influence of textile architectures on mechanical characteristics, considering the micro modeling of the repeating unit cell. Furthermore, the structures of multi-layered or multi-phase composites combined with different laminae, such as a sub-laminate, are considered to find the mechanical characteristics. A simple progressive failure mechanism for the textile composites is also presented. In the second part, this study focuses on three main phenomena to solve the dynamic problems: micro-crack shapes, textile architectures and textile effective moduli. To obtain good solutions of the dynamic problems, this research uses four approaches: (I) determination of governing equations via a three-level hierarchy: micro-mechanical unit cell analysis, layer-wise analysis accounting for transverse strains and stresses, and structural analysis based on anisotropic plate layers, (II) development of an efficient computational approach enabling one to perform transient

  2. Computer Models for IRIS Control System Transient Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gary D. Storrick; Bojan Petrovic; Luca Oriani

    2007-01-31

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled “Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor” focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design – such as the lack of a detailed secondary system or I&C system designs – makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I&C development process

  3. Computer Majors' Education as Moral Enterprise: A Durkheimian Analysis.

    Science.gov (United States)

    Rigoni, David P.; Lamagdeleine, Donald R.

    1998-01-01

    Building on Durkheim's (Emile) emphasis on the moral dimensions of social reality and using it to explore contemporary computer education, contends that many of his claims are justified. Argues that the college computer department has created a set of images, maxims, and operating assumptions that frames its curriculum, courses, and student…

  4. High throughput computing: a solution for scientific analysis

    Science.gov (United States)

    O'Donnell, M.

    2011-01-01

    Public land management agencies continually face resource management problems that are exacerbated by climate warming, land-use change, and other human activities. As the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) works with managers in U.S. Department of the Interior (DOI) agencies and other federal, state, and private entities, researchers are finding that the science needed to address these complex ecological questions across time and space produces substantial amounts of data. The additional data and the volume of computations needed to analyze it require expanded computing resources well beyond single- or even multiple-computer workstations. To meet this need for greater computational capacity, FORT investigated how to resolve the many computational shortfalls previously encountered when analyzing data for such projects. Our objectives included finding a solution that would:

  5. MMA, A Computer Code for Multi-Model Analysis

    Science.gov (United States)

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will

  6. Computational identification and analysis of novel sugarcane microRNAs

    Directory of Open Access Journals (Sweden)

    Thiebaut Flávia

    2012-07-01

    Full Text Available Background: MicroRNA regulation of gene expression plays a key role in development and in the response to biotic and abiotic stresses. Deep sequencing analyses accelerate the process of small RNA discovery in many plants and expand our understanding of miRNA-regulated processes. We therefore undertook small RNA sequencing of sugarcane miRNAs in order to understand their complexity and to explore their role in sugarcane biology. Results: A bioinformatics search was carried out to discover novel miRNAs that may be regulated in sugarcane plants subjected to drought and salt stresses, and under pathogen infection. Using the presence of miRNA precursors in the related sorghum genome, we identified 623 candidate new mature miRNAs in sugarcane. Of these, 44 were classified as high-confidence miRNAs. The biological function of the new miRNA candidates was assessed by analyzing their putative targets. The set of bona fide sugarcane miRNAs includes those likely targeting serine/threonine kinases, Myb and zinc finger proteins. Additionally, a MADS-box transcription factor and an RPP2B protein, which act in development and disease resistance processes, could be regulated by cleavage (21-nt species) and DNA methylation (24-nt species), respectively. Conclusions: A large-scale investigation of sRNAs in sugarcane using a computational approach has identified a substantial number of new miRNAs and provides detailed genotype-tissue-culture miRNA expression profiles. Comparative analysis between monocots was valuable to clarify aspects of the conservation of miRNAs and their targets in a plant whose genome has not yet been sequenced. Our findings contribute to knowledge of miRNA roles in regulatory pathways in the complex, polyploid sugarcane genome.
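
    Putative-target screening of the kind used to assess the new miRNAs can be illustrated by a reverse-complement seed scan. Note that plant miRNA targeting typically requires near-perfect complementarity over the whole miRNA, so this is only a first-pass filter, and both sequences below are hypothetical.

        COMPLEMENT = str.maketrans("AUCG", "UAGC")

        def seed_matches(mirna, mrna):
            """Positions in an mRNA (5'->3') complementary to the miRNA seed
            (nucleotides 2-8)."""
            seed = mirna[1:8]
            site = seed.translate(COMPLEMENT)[::-1]  # reverse complement
            return [i for i in range(len(mrna) - len(site) + 1)
                    if mrna[i:i + len(site)] == site]

        mirna = "UGGAAGCUGUGAAGCAAUCAU"        # hypothetical 21-nt miRNA
        mrna = "AAGCUGAUUGCUUCACAGCUUCCAAAGG"  # toy transcript fragment
        print(seed_matches(mirna, mrna))       # -> [16]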

  7. Interface design of VSOP'94 computer code for safety analysis

    Science.gov (United States)

    Natsir, Khairina; Yazid, Putranto Ilham; Andiwijayakusuma, D.; Wahanani, Nursinta Adi

    2014-09-01

    Today, most software applications, also in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system for simulating the life history of a nuclear reactor, devoted to education and research. One advantage of the VSOP program is its ability to calculate neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integrals, reactor fuel cost estimation, and integrated thermal hydraulics. VSOP can also be used for comparative studies and reactor safety simulation. However, the existing VSOP is a conventional program developed in Fortran 65, and it has several usability problems: it runs only on DEC Alpha mainframe platforms and provides text-based output that is difficult to use, especially for data preparation and interpretation of results. We developed GUI-VSOP, an interface program that facilitates data preparation, runs the VSOP code and presents the results in a more user-friendly way, usable on a personal computer (PC). Modifications include interfaces for preprocessing, processing and postprocessing. The preprocessing interface provides a convenient way to prepare data; the processing interface helps configure input files and libraries and compile the VSOP code; and the postprocessing interface visualizes the VSOP output in tabular and graphical forms. GUI-VSOP is expected to simplify and speed up the processing and analysis of safety aspects.

  8. Synthesis, spectral, computational and thermal analysis studies of metallocefotaxime antibiotics.

    Science.gov (United States)

    Masoud, Mamdouh S; Ali, Alaa E; Elasala, Gehan S

    2015-01-01

    Cefotaxime metal complexes of Cr(III), Mn(II), Fe(III), Co(II), Ni(II), Cu(II), Zn(II), Cd(II) and Hg(II), and two mixed-metal complexes of (Fe,Cu) and (Fe,Ni), were synthesized and characterized by elemental analysis, IR, electronic spectra, magnetic susceptibility and ESR spectra. The studies proved that cefotaxime may act as a mono-, bi-, tri- or tetradentate ligand through the oxygen atoms of the lactam carbonyl, carboxylic or amide carbonyl groups and the nitrogen atom of the thiazole ring. From the magnetic measurements and electronic spectral data, octahedral structures were proposed for all complexes. Quantum chemical methods were applied to cefotaxime to calculate charges, bond lengths, bond angles, dihedral angles, electronegativity (χ), chemical potential (μ), global hardness (η), softness (σ) and the electrophilicity index (ω). The thermal decomposition of the prepared metal complexes was studied by TGA, DTA and DSC techniques. Thermogravimetric studies revealed the presence of lattice or coordinated water molecules in all the prepared complexes, and decomposition mechanisms were suggested. The thermal decomposition of the complexes ended with the formation of metal oxides and a carbon residue as the final product, except in the case of the Hg complex, where sublimation occurred in the temperature range 376.5-575.0 °C, so only a carbon residue was produced. The orders of the chemical reactions (n) were calculated via the peak symmetry method, and the activation parameters were computed from the thermal decomposition data. The geometries of the complexes may convert from Oh to Td during the thermal decomposition steps.
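
    The global reactivity descriptors listed above follow standard conceptual-DFT definitions from frontier orbital energies. A minimal sketch with hypothetical HOMO/LUMO values, not the computed cefotaxime results:

        def global_reactivity(e_homo_ev, e_lumo_ev):
            """Conceptual-DFT descriptors from frontier orbital energies (eV):
            mu = (E_HOMO + E_LUMO)/2, chi = -mu, eta = (E_LUMO - E_HOMO)/2,
            sigma = 1/eta, omega = mu**2 / (2*eta)."""
            mu = (e_homo_ev + e_lumo_ev) / 2.0
            eta = (e_lumo_ev - e_homo_ev) / 2.0
            return {"mu": mu, "chi": -mu, "eta": eta,
                    "sigma": 1.0 / eta, "omega": mu ** 2 / (2.0 * eta)}

        # Hypothetical orbital energies (eV), for illustration only.
        print(global_reactivity(e_homo_ev=-9.2, e_lumo_ev=-1.4))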

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  10. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  12. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy Ions The Heavy Ions group has been actively analysing data and preparing for conferences. Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  13. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of human error occurrence in both quantitative and qualitative manners, is gradually increasing because of the effects of human error on system safety. HRA needs task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which a human error occurs depends entirely on the HRA analyst. This problem makes the results of task analysis inconsistent and unreliable. To remedy this problem, KAERI developed structural information analysis (SIA), which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed for the purpose of supporting the performance of HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the CASIA database. The CASIA is expected to help HRA analysts perform their analyses more easily and consistently. As more analyses are performed and more data are accumulated in the CASIA database, HRA analysts will be able to share and spread their analysis experience freely, thereby improving the quality of HRA. 35 refs., 38 figs., 25 tabs. (Author)

  14. Olfactory cleft computed tomography analysis and olfaction in chronic rhinosinusitis

    Science.gov (United States)

    Kohli, Preeti; Schlosser, Rodney J.; Storck, Kristina

    2016-01-01

    Background: Volumetric analysis of the olfactory cleft by using computed tomography has been associated with olfaction in patients with chronic rhinosinusitis (CRS). However, existing studies have not comprehensively measured olfaction, and it thus remains unknown whether correlations differ across specific dimensions of odor perception. Objective: To use comprehensive measures of patient-reported and objective olfaction to evaluate the relationship between volumetric olfactory cleft opacification and olfaction. Methods: Olfaction in patients with CRS was evaluated by using “Sniffin' Sticks” tests and a modified version of the Questionnaire of Olfactory Disorders. Olfactory cleft opacification was quantified by using two- and three-dimensional, computerized volumetric analysis. Correlations between olfactory metrics and olfactory cleft opacification were then calculated. Results: The overall CRS cohort included 26 patients without nasal polyposis (CRSsNP) (68.4%) and 12 patients with nasal polyposis (CRSwNP) (31.6%). Across the entire cohort, total olfactory cleft opacification was 82.8%, with greater opacification in the CRSwNP subgroup compared with CRSsNP (92.3 versus 78.4%, p < 0.001). The percent total volume opacification correlated with the total Sniffin' Sticks score (r = −0.568, p < 0.001) as well as individual threshold, discrimination, and identification scores (p < 0.001 for all). Within the CRSwNP subgroup, threshold (r = −0.616, p = 0.033) and identification (r = −0.647, p = 0.023) remained highly correlated with total volume opacification. In patients with CRSsNP, the threshold correlated with total volume scores (r = −0.457, p = 0.019), with weaker and nonsignificant correlations for discrimination and identification. Correlations between total volume opacification and the Questionnaire of Olfactory Disorders were qualitatively similar to objective olfactory findings in both CRSwNP (r = −0.566, p = 0.070) and CRSsNP (r = −0.310, p

  15. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  16. Retrospective indexing (RI) - A computer-aided indexing technique

    Science.gov (United States)

    Buchan, Ronald L.

    1990-01-01

    An account is given of a database-updating method designated 'computer-aided indexing' (CAI), which has been implemented very efficiently at NASA's Scientific and Technical Information Facility by means of retrospective indexing. Novel terms added to the NASA Thesaurus therefore proceed directly into both the NASA-RECON aerospace information system and its portion of the ESA Information Retrieval Service, giving users full access to material thus indexed. If a given term appears in the title of a record, it is given special weight. An illustrative graphic representation of the CAI search strategy is presented.

  17. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    Science.gov (United States)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie; Atlas Collaboration

    2014-06-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. The new cloud technologies also come with new challenges, and one such is the contextualization of computing resources with regard to requirements of the user and his experiment. In particular on Google's new cloud platform Google Compute Engine (GCE) upload of user's virtual machine images is not possible. This precludes application of ready to use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  18. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis-computational, algorithmic, and implementation-have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis."
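
    The "simple recipe" can be illustrated with a toy sketch: an abstract architecture whose only resource is sampling, with hypothetical error and cost functions, where the resource-rational strategy is the sample count that maximizes utility net of computational cost. Everything below is invented for illustration, not taken from the paper.

        import math

        def expected_error(k):
            # Monte Carlo standard error shrinks as 1/sqrt(k)
            # (unit variance assumed; purely illustrative).
            return 1.0 / math.sqrt(k)

        def net_utility(k, cost_per_sample=0.01):
            # Decision utility falls with estimation error and pays a
            # time cost per sample drawn (both functions hypothetical).
            return -expected_error(k) - cost_per_sample * k

        # The resource-rational algorithm for this architecture is the k
        # that maximizes net utility - here only a handful of samples.
        best_k = max(range(1, 1000), key=net_utility)
        print(best_k)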

  19. Computer-based facial expression analysis for assessing user experience

    OpenAIRE

    Branco, Pedro

    2006-01-01

    Doctoral thesis in Information Technologies and Systems, specialization in Programming and Computer Systems Engineering. For the majority of users, computers are difficult and frustrating to use. The proliferation of computers in daily life, in all sorts of shapes and forms, becomes a significant factor potentially aggravating and thus degrading users' acceptance of the technology. Traditional user observation methods, aiming at improving human-computer i...

  20. Computer programs: Information retrieval and data analysis, a compilation

    Science.gov (United States)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  1. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that supports the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enable accurate climate projections. NCI makes available 10+ PB major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), a HPC class 3000 core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  2. 18th International Workshop on Advanced Computing and Analysis Techniques in Physics Research

    CERN Document Server

    2017-01-01

    The 18th edition of ACAT will bring together experts to explore and confront the boundaries of computing, automated data analysis, and theoretical calculation technologies, in particle and nuclear physics, astronomy and astrophysics, cosmology, accelerator science and beyond. ACAT provides a unique forum where these disciplines overlap with computer science, allowing for the exchange of ideas and the discussion of cutting-edge computing, data analysis and theoretical calculation technologies in fundamental physics research.

  3. 论蔡元培的传记写作%On Cai Yuanpei' s Biographical Writing

    Institute of Scientific and Technical Information of China (English)

    赖勤芳

    2012-01-01

    Cai Yuanpei was not famous as a biographer, yet he produced a great many biographical writings over his life. During his formative education and his period of free reading he accumulated the craft of biographical writing and gradually developed a sense of cultural identification grounded in ancient Chinese biography. Once he embarked on political revolution and educational innovation, he became more active in biographical writing, most distinctively choosing revolutionaries, relatives and friends as his subjects. In his later years, at Hu Shi's suggestion, he wrote his autobiography in the style of annals. Although Cai was not a dedicated biographical writer or researcher, and lacked a deeply developed biographical theory or a fully self-conscious sense of the modern biographical genre, he clearly helped to promote the modern transformation of Chinese ideas about biographical literature.

  4. Sensitivity Analysis and Error Control for Computational Aeroelasticity Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposal is the development of a next-generation computational aeroelasticity code, suitable for real-world complex geometries, and...

  5. Computational Interpretations of Analysis via Products of Selection Functions

    Science.gov (United States)

    Escardó, Martín; Oliva, Paulo

    We show that the computational interpretation of full comprehension via two well-known functional interpretations (dialectica and modified realizability) corresponds to two closely related infinite products of selection functions.
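
    The binary building block of such products of selection functions can be transcribed directly into Python; an illustrative sketch (names invented), with argmax over finite sets standing in for general selection functions:

        # A selection function takes a "valuation" p: X -> R and returns an
        # element of X that is optimal for p (here: argmax over a list).
        def argmax_over(xs):
            return lambda p: max(xs, key=p)

        def product(eps, delta):
            # Binary product of selection functions: given a valuation q on
            # pairs, pick (a, b(a)) where each coordinate is chosen optimally
            # assuming the other responds optimally.
            def combined(q):
                def b(x):
                    return delta(lambda y: q((x, y)))
                a = eps(lambda x: q((x, b(x))))
                return (a, b(a))
            return combined

        # Example: jointly maximize q over {0,1,2} x {0,1,2}.
        eps, delta = argmax_over([0, 1, 2]), argmax_over([0, 1, 2])
        print(product(eps, delta)(lambda xy: -(xy[0] - 1)**2 - (xy[1] - 2)**2))
        # -> (1, 2)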

  6. Parallelization of Finite Element Analysis Codes Using Heterogeneous Distributed Computing

    Science.gov (United States)

    Ozguner, Fusun

    1996-01-01

    Performance gains in computer design are quickly consumed as users seek to analyze larger problems to a higher degree of accuracy. Innovative computational methods, such as parallel and distributed computing, seek to multiply the power of existing hardware technology to satisfy the computational demands of large applications. In the early stages of this project, experiments were performed using two large, coarse-grained applications, CSTEM and METCAN. These applications were parallelized on an Intel iPSC/860 hypercube. It was found that the overall speedup was very low, due to large, inherently sequential code segments present in the applications. The overall execution time T_par of the application is dependent on these sequential segments. If these segments make up a significant fraction of the overall code, the application will have a poor speedup measure.

  7. An analysis of symbolic linguistic computing models in decision making

    Science.gov (United States)

    Rodríguez, Rosa M.; Martínez, Luis

    2013-01-01

    It is common that experts involved in complex real-world decision problems use natural language to express their knowledge in uncertain frameworks. Language is inherently vague, hence probabilistic decision models are not very suitable in such cases. Therefore, other tools such as fuzzy logic and fuzzy linguistic approaches have been successfully used to model and manage such vagueness. The use of linguistic information implies operating with that type of information, i.e., processes of computing with words (CWW). Different schemes have been proposed to deal with those processes, and diverse symbolic linguistic computing models have been introduced to accomplish the linguistic computations. In this paper, we overview the relationship between decision making and CWW, and focus on symbolic linguistic computing models that have been widely used in linguistic decision making, analysing whether all of them can be considered within the CWW paradigm.
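
    One of the symbolic models commonly analysed in this literature is the 2-tuple linguistic representation, which avoids rounding loss during aggregation; a minimal sketch (the term set is invented for illustration):

        # 2-tuple linguistic model: a value beta in [0, g] (g+1 terms) is
        # represented as (s_i, alpha) with i = round(beta) and symbolic
        # translation alpha = beta - i in [-0.5, 0.5).
        TERMS = ["none", "very_low", "low", "medium", "high", "very_high", "perfect"]

        def to_two_tuple(beta):
            i = int(round(beta))
            return TERMS[i], beta - i

        def from_two_tuple(term, alpha):
            return TERMS.index(term) + alpha

        # Aggregate three linguistic assessments by arithmetic mean, then
        # translate back to a symbolic 2-tuple without rounding loss.
        betas = [from_two_tuple("low", 0.0), from_two_tuple("high", -0.2),
                 from_two_tuple("medium", 0.4)]
        print(to_two_tuple(sum(betas) / len(betas)))  # ('medium', 0.0666...)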

  8. Finite Element Analysis in Concurrent Processing: Computational Issues

    Science.gov (United States)

    Sobieszczanski-Sobieski, Jaroslaw; Watson, Brian; Vanderplaats, Garrett

    2004-01-01

    The purpose of this research is to investigate the potential application of new methods for solving large-scale static structural problems on concurrent computers. It is well known that traditional single-processor computational speed will be limited by inherent physical limits. The only path to achieve higher computational speeds lies through concurrent processing. Traditional factorization solution methods for sparse matrices are ill suited for concurrent processing because the null entries get filled, leading to high communication and memory requirements. The research reported herein investigates alternatives to factorization that promise a greater potential to achieve high concurrent computing efficiency. Two methods, and their variants, based on direct energy minimization are studied: a) minimization of the strain energy using the displacement method formulation; b) constrained minimization of the complementary strain energy using the force method formulation. Initial results indicated that in the context of the direct energy minimization the displacement formulation experienced convergence and accuracy difficulties while the force formulation showed promising potential.
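
    Variant (a), minimizing the strain energy E(u) = 1/2 uᵀKu - fᵀu in the displacement formulation, amounts to solving Ku = f iteratively without factorization, which is what makes it attractive for concurrent processing. A serial conjugate-gradient sketch on a toy stiffness matrix (not the project's code):

        import numpy as np

        def minimize_strain_energy(K, f, tol=1e-10, max_iter=1000):
            # Conjugate gradients on E(u) = 0.5 u^T K u - f^T u; the gradient
            # is K u - f, so the minimizer solves K u = f with no fill-in.
            u = np.zeros_like(f)
            r = f - K @ u                      # residual = negative gradient
            d = r.copy()
            for _ in range(max_iter):
                Kd = K @ d
                alpha = (r @ r) / (d @ Kd)
                u += alpha * d
                r_new = r - alpha * Kd
                if np.linalg.norm(r_new) < tol:
                    break
                d = r_new + ((r_new @ r_new) / (r @ r)) * d
                r = r_new
            return u

        # Toy 1D bar stiffness matrix (tridiagonal, SPD) and load vector.
        n = 5
        K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        f = np.ones(n)
        print(minimize_strain_energy(K, f))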

  9. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

    I employed bibliometric and historical methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  10. Optimizing Computer Assisted Instruction By Applying Principles of Learning Theory.

    Science.gov (United States)

    Edwards, Thomas O.

    The development of learning theory and its application to computer-assisted instruction (CAI) are described. Among the early theoretical constructs thought to be important are E. L. Thorndike's connectionism, Neal Miller's theory of motivation, and B. F. Skinner's theory of operant conditioning. Early devices incorporating those…

  11. Computer Assisted Instruction. A Report to the Board.

    Science.gov (United States)

    Stoneberg, Bert, Jr.

    This report presents the findings of an evaluation conducted in the Greater Albany Public School System 8J (Oregon) to determine the effects of computer assisted instruction (CAI) in mathematics as delivered by the WICAT System 300 at the Periwinkle Elementary School. Evaluation activities were designed and conducted to determine whether the…

  12. The Effectiveness of a Computer-Assisted Math Learning Program

    Science.gov (United States)

    De Witte, K.; Haelermans, C.; Rogge, N.

    2015-01-01

    Computer-assisted instruction (CAI) programs are considered as a way to improve learning outcomes of students. However, little is known on the schools who implement such programs as well as on the effectiveness of similar information and communication technology programs. We provide a literature review that pays special attention to the existing…

  13. Computer-Assisted Law Instruction: Clinical Education's Bionic Sibling

    Science.gov (United States)

    Henn, Harry G.; Platt, Robert C.

    1977-01-01

    Computer-assisted instruction (CAI), like clinical education, has considerable potential for legal training. As an initial Cornell Law School experiment, a lesson in applying different corporate statutory dividend formulations, with a cross-section of balance sheets and other financial data, was used to supplement regular class assignments.…

  14. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    Science.gov (United States)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  15. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  16. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science. Final semiannual report, 1 April-30 September 1986

    Energy Technology Data Exchange (ETDEWEB)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.

  17. 基于Delphi的制图课件设计%Engineering Graphics CAI Design Based on Delphi

    Institute of Scientific and Technical Information of China (English)

    蒋先刚; 钟化兰; 涂晓斌

    2001-01-01

    Introduces programming technologies and methods for Engineering Graphics CAI courseware design based on Delphi, focusing on the construction of a descriptive geometry and engineering graphics CAI courseware system and its software implementation. It also presents methods and techniques for using the TTreeView component in Delphi to construct and manage the database of the CAI system.

  18. Cluster Computing For Real Time Seismic Array Analysis.

    Science.gov (United States)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years arrays have been widely used in different fields of seismological research. In particular they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying volcanic microtremor and long-period events, which are critical for getting information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the quite time-consuming processing techniques have limited their potential for this application. In order to favour a direct application of array techniques to continuous volcano monitoring we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by a local seismic source. The cluster is composed of 8 Intel Pentium-III bi-processor PCs working at 550 MHz, and has 4 Gigabytes of RAM memory. It runs under the Linux operating system. The analysis software package developed is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The software system includes modules devoted to receiving data over the Internet and graphical applications for continuously displaying the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and the southeast flanks of this volcano. A real-time continuous acquisition system has been simulated by
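
    As an illustration of the underlying analysis (not the cluster code itself, which is Fortran/MPI), a narrowband MUSIC pseudospectrum over a slowness grid can be sketched in Python; the array geometry, frequency, and noise level below are invented:

        import numpy as np

        rng = np.random.default_rng(0)
        freq = 2.0                                  # narrowband frequency, Hz
        sensors = rng.uniform(-500, 500, (8, 2))    # sensor x,y positions, m
        s_true = np.array([1.0e-3, 0.5e-3])         # true slowness vector, s/m

        # Synthetic snapshots of a plane wave: exp(-2i*pi*f*(r.s)) with a
        # random source phase per snapshot, plus additive complex noise.
        delays = sensors @ s_true
        phases = np.exp(2j * np.pi * rng.uniform(0, 1, (1, 200)))
        snaps = np.exp(-2j * np.pi * freq * delays)[:, None] * phases
        snaps += 0.1 * (rng.standard_normal(snaps.shape)
                        + 1j * rng.standard_normal(snaps.shape))

        # MUSIC: noise subspace = eigenvectors of the sample covariance
        # orthogonal to the (here single) signal eigenvector.
        R = snaps @ snaps.conj().T / snaps.shape[1]
        w, V = np.linalg.eigh(R)
        En = V[:, :-1]

        def pseudospectrum(s):
            a = np.exp(-2j * np.pi * freq * (sensors @ s))  # steering vector
            proj = En.conj().T @ a
            return 1.0 / np.real(proj.conj() @ proj)

        grid = np.linspace(-2e-3, 2e-3, 81)
        P = [[pseudospectrum(np.array([sx, sy])) for sx in grid] for sy in grid]
        iy, ix = divmod(int(np.argmax(P)), len(grid))
        print(grid[ix], grid[iy])                   # peaks near s_true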

  19. Nondestructive analysis of urinary calculi using micro computed tomography

    Directory of Open Access Journals (Sweden)

    Lingeman James E

    2004-12-01

    Background: Micro computed tomography (micro CT) has been shown to provide exceptionally high quality imaging of the fine structural detail within urinary calculi. We tested the idea that micro CT might also be used to identify the mineral composition of urinary stones non-destructively. Methods: Micro CT x-ray attenuation values were measured for mineral that was positively identified by infrared microspectroscopy (FT-IR). To do this, human urinary stones were sectioned with a diamond wire saw. The cut surface was explored by FT-IR, and regions of pure mineral were evaluated by micro CT to correlate x-ray attenuation values with mineral content. Additionally, intact stones were imaged with micro CT to visualize internal morphology and map the distribution of specific mineral components in 3-D. Results: Micro CT images taken just beneath the cut surface of urinary stones showed excellent resolution of structural detail that could be correlated with structure visible in the optical image mode of FT-IR. Regions of pure mineral were not difficult to find by FT-IR for most stones, and such regions could be localized on micro CT images of the cut surface. This was not true, however, for two brushite stones tested; in these, brushite was closely intermixed with calcium oxalate. Micro CT x-ray attenuation values were collected for six minerals that could be found in regions that appeared to be pure, including uric acid (3515–4995 micro CT attenuation units, AU), struvite (7242–7969 AU), cystine (8619–9921 AU), calcium oxalate dihydrate (13815–15797 AU), calcium oxalate monohydrate (16297–18449 AU), and hydroxyapatite (21144–23121 AU). These AU values did not overlap. Analysis of intact stones showed excellent resolution of structural detail and could discriminate multiple mineral types within heterogeneous stones. Conclusions: Micro CT gives excellent structural detail of urinary stones, and these results demonstrate the feasibility
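
    The non-overlapping attenuation windows reported above translate directly into a lookup rule; a minimal sketch using the published AU ranges (returning None for values between windows, e.g. mixtures, is a simplification):

        # Micro CT attenuation windows (AU) for pure minerals, as reported above.
        AU_RANGES = {
            "uric acid": (3515, 4995),
            "struvite": (7242, 7969),
            "cystine": (8619, 9921),
            "calcium oxalate dihydrate": (13815, 15797),
            "calcium oxalate monohydrate": (16297, 18449),
            "hydroxyapatite": (21144, 23121),
        }

        def classify_voxel(au):
            # Return the mineral whose window contains the measured value,
            # or None when the value falls between windows.
            for mineral, (lo, hi) in AU_RANGES.items():
                if lo <= au <= hi:
                    return mineral
            return None

        print(classify_voxel(16850))  # calcium oxalate monohydrate
        print(classify_voxel(6000))   # None - between windows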

  20. 被历史错位的蔡襄书法%Cai Xiang's Calligraphy Dislocated by the History

    Institute of Scientific and Technical Information of China (English)

    黄志强

    2012-01-01

    The "Shang Yi" (valuing personal expression) trend in Song-dynasty calligraphy took shape under the guidance of Ouyang Xiu, Cai Xiang, Su Shi and others. Cai Xiang's calligraphy and theory, as one part of traditional Chinese calligraphic culture and art, played an important role in linking the past and the future of Chinese calligraphy. However, for a long historical period there were opposing voices in academic circles studying Cai Xiang's poetry and calligraphy, some even holding that the "Cai" among the "four great calligraphers of the Song dynasty" was Cai Jing rather than Cai Xiang, which is a fallacy from today's perspective. Drawing on historical records, this paper gives an overall review and evaluation of Cai Xiang's calligraphy from a modern standpoint.

  1. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    CERN Document Server

    Öhman, H; The ATLAS collaboration; Hendrix, V

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. With the new cloud technologies also come new challenges, and one such is the contextualization of cloud resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform Google Compute Engine (GCE) upload of users' virtual machine images is not possible, which precludes application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration, dynamic resource scaling, and a high degree of scalability.

  2. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
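
    A probabilistic neural network in this sense is essentially a Gaussian kernel density classifier, and its calibration cost is what the report trades against goodness-of-fit; a small self-contained sketch (simulated data, grid calibration of sigma):

        import numpy as np

        def pnn_predict(X_train, y_train, X_test, sigma):
            # Parzen-window class densities: every training point contributes
            # a Gaussian bump; predict the class with highest summed density.
            classes = np.unique(y_train)
            preds = []
            for x in X_test:
                d2 = np.sum((X_train - x) ** 2, axis=1)
                k = np.exp(-d2 / (2.0 * sigma ** 2))
                preds.append(classes[int(np.argmax(
                    [k[y_train == c].sum() for c in classes]))])
            return np.array(preds)

        # Calibration = pick sigma minimizing held-out error; the cost grows
        # with both the sigma grid and the number of training points, which
        # is what stratified subsampling is meant to reduce.
        rng = np.random.default_rng(1)
        X = rng.standard_normal((200, 2))
        y = (X[:, 0] + X[:, 1] > 0).astype(int)
        Xtr, ytr, Xva, yva = X[:150], y[:150], X[150:], y[150:]
        sigmas = [0.05, 0.1, 0.2, 0.5, 1.0]
        errs = [np.mean(pnn_predict(Xtr, ytr, Xva, s) != yva) for s in sigmas]
        print(sigmas[int(np.argmin(errs))], min(errs))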

  3. I-Xe measurements of CAIs and chondrules from the CV3 chondrites Mokoia and Vigarano

    Science.gov (United States)

    Whitby, J. A.; Russell, S. S.; Turner, G.; Gilmour, J. D.

    2004-08-01

    I-Xe analyses were carried out for chondrules and refractory inclusions from the two CV3 carbonaceous chondrites Mokoia and Vigarano (representing the oxidized and reduced subgroups, respectively). Although some degree of disturbance to the I-Xe system is evident in all of the samples, evidence is preserved of aqueous alteration of CAIs in Mokoia 1 Myr later than the I-Xe age of the Shallowater standard and of the alteration of a chondrule (V3) from Vigarano ~0.7 Myr later than Shallowater. Other chondrules in Mokoia and Vigarano experienced disturbance of the I-Xe system millions of years later and, in the case of one Vigarano chondrule (VS1), complete resetting of the I-Xe system after decay of essentially all 129I, corresponding to an age more than 80 Myr after Shallowater. Our interpretation is that accretion and processing to form the Mokoia and Vigarano parent bodies must have continued for at least 4 Myr and 80 Myr, respectively. The late age of a chondrule that shows no evidence for any aqueous alteration or significant thermal processing after its formation leads us to postulate the existence of an energetic chondrule-forming mechanism at a time when nebular processes are not expected to be important.

  4. An Information Theoretic Analysis of Decision in Computer Chess

    CERN Document Server

    Godescu, Alexandru

    2011-01-01

    The basis of the method proposed in this article is the idea that information is one of the most important factors in strategic decisions, including decisions in computer chess and other strategy games. The model proposed in this article and the algorithm described are based on the idea of an information-theoretic basis of decision in strategy games. The model generalizes and provides a mathematical justification for one of the most popular search methods used in leading computer chess programs, the fractional-ply scheme. However, despite its success in leading computer chess applications, little has been published about this method until now. The article grounds the method in the axioms of information theory, then derives the principles used in programming the search and describes the form of the coefficients mathematically. One of the most important parameters of the fractional-ply search is derived from fundamental principles. Until now this coefficient has usually been handcrafted...
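
    A hedged sketch of the fractional-ply idea: instead of decrementing the remaining depth by a full ply, each move consumes a move-dependent fraction, so forcing moves are searched more deeply. The toy game and cost function below are hypothetical; the paper derives the actual coefficients from information-theoretic axioms.

        # Negamax alpha-beta with fractional ply: cost(move) < 1 for forcing
        # moves lets promising lines be searched deeper in the same budget.
        def search(state, depth, alpha, beta, game):
            if depth <= 0 or game.is_terminal(state):
                return game.evaluate(state)
            for move in game.legal_moves(state):
                child = game.apply(state, move)
                score = -search(child, depth - game.cost(move),
                                -beta, -alpha, game)
                if score >= beta:
                    return beta          # fail-hard beta cutoff
                alpha = max(alpha, score)
            return alpha

        class CountdownGame:
            # Toy game: players alternately take 1 or 2 tokens; taking the
            # last token wins. evaluate() is from the mover's perspective.
            def is_terminal(self, n): return n == 0
            def evaluate(self, n): return -1.0 if n == 0 else 0.0
            def legal_moves(self, n): return [m for m in (1, 2) if m <= n]
            def apply(self, n, m): return n - m
            def cost(self, m): return 0.5 if m == 2 else 1.0  # assumed costs

        print(search(7, depth=4.0, alpha=-10.0, beta=10.0, game=CountdownGame()))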

  5. WHIPICE. [Computer Program for Analysis of Aircraft Deicing

    Science.gov (United States)

    1992-01-01

    This video documents efforts by NASA Lewis Research Center researchers to improve ice protection for aircraft. A new system of deicing aircraft by allowing a thin sheet of ice to develop, then breaking it into particles, is being examined, particularly to determine the extent of shed ice ingestion by jet engines that results. The process is documented by a high speed imaging system that scans the breakup and flow of the ice particles at 1000 frames per second. This data is then digitized and analyzed using a computer program called WHIPICE, which analyzes grey scale images of the ice particles. Detailed description of the operation of this computer program is provided.

  6. Multi-scale analysis of lung computed tomography images

    CERN Document Server

    Gori, I; Fantacci, M E; Preite Martinez, A; Retico, A; De Mitri, I; Donadio, S; Fulcheri, C

    2007-01-01

    A computer-aided detection (CAD) system for the identification of lung internal nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 project. The three modules of our lung CAD system, a segmentation algorithm for lung internal region identification, a multi-scale dot-enhancement filter for nodule candidate selection and a multi-scale neural technique for false positive finding reduction, are described. The results obtained on a dataset of low-dose and thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.
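
    A multi-scale Hessian-based dot-enhancement step of the kind described can be sketched with Gaussian second derivatives; the eigenvalue combination below is a commonly used dot-likeness measure and is an assumption, not necessarily the MAGIC-5 filter:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def dot_enhancement(volume, scales=(1.0, 2.0, 4.0)):
            # At each scale, build the 3x3 Hessian per voxel from
            # scale-normalized Gaussian second derivatives, then score voxels
            # whose three eigenvalues are all negative (bright blobs).
            orders = {(0, 0): (2, 0, 0), (1, 1): (0, 2, 0), (2, 2): (0, 0, 2),
                      (0, 1): (1, 1, 0), (0, 2): (1, 0, 1), (1, 2): (0, 1, 1)}
            best = np.zeros(volume.shape)
            for s in scales:
                H = np.zeros(volume.shape + (3, 3))
                for (i, j), o in orders.items():
                    d = s**2 * gaussian_filter(volume, s, order=o)
                    H[..., i, j] = H[..., j, i] = d
                lam = np.linalg.eigvalsh(H)          # ascending eigenvalues
                blob = np.all(lam < 0, axis=-1)
                # Dot-likeness: smallest-magnitude eigenvalue squared over
                # largest magnitude (an assumed combination).
                score = lam[..., 2]**2 / (np.abs(lam[..., 0]) + 1e-12)
                best = np.maximum(best, np.where(blob, score, 0.0))
            return best

        # Tiny demo: one bright Gaussian blob in a noisy volume.
        rng = np.random.default_rng(0)
        vol = rng.normal(0, 0.05, (32, 32, 32))
        zz, yy, xx = np.indices(vol.shape)
        vol += np.exp(-((xx - 16)**2 + (yy - 16)**2 + (zz - 16)**2) / 8.0)
        print(np.unravel_index(np.argmax(dot_enhancement(vol)), vol.shape))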

  7. Impact of Computer-Based Instruction on Attitudes of Students and Instructors: A Review. Final Report.

    Science.gov (United States)

    King, Anne Truscott

    To determine whether contact with computer-assisted instruction (CAI) leads to feelings of "depersonalization" and "dehumanization," a review was conducted of investigations exploring attitudes toward various modes of computer-based instruction before, during, or after exposure. An evaluation was made of the pertinent factors which influenced attitudes…

  8. Using Computer-Assisted Instruction to Enhance Achievement of English Language Learners

    Science.gov (United States)

    Keengwe, Jared; Hussein, Farhan

    2014-01-01

    Computer-assisted instruction (CAI) in English-Language environments offer practice time, motivates students, enhance student learning, increase authentic materials that students can study, and has the potential to encourage teamwork between students. The findings from this particular study suggested that students who used computer assisted…

  9. Effect of Computer-Based Video Games on Children: An Experimental Study

    Science.gov (United States)

    Chuang, Tsung-Yen; Chen, Wei-Fan

    2009-01-01

    This experimental study investigated whether computer-based video games facilitate children's cognitive learning. In comparison to traditional computer-assisted instruction (CAI), this study explored the impact of the varied types of instructional delivery strategies on children's learning achievement. One major research null hypothesis was…

  10. A Naturalistic Method for Assessing the Learning of Arithmetic from Computer-Aided Practice.

    Science.gov (United States)

    Hativa, Nira

    1986-01-01

    A study used the naturalistic method of inquiry to investigate the contribution of CAI (computer-assisted instruction) to students' performance in arithmetic and to identify possible problems. The study was aimed at understanding the holistic environment of students' individualized drill in arithmetic with the computer. (BS)

  11. Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences

    OpenAIRE

    Wiedemann, Gregor

    2013-01-01

    Two developments in computational text analysis may change the way qualitative data analysis in the social sciences is performed: 1. the availability of digital text worth investigating is growing rapidly, and 2. the improvement of algorithmic information extraction approaches, also called text mining, allows for further bridging the gap between qualitative and quantitative text analysis. The key factor hereby is the inclusion of context into computational linguistic models, which extends convent...
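
    As one concrete instance of the text-mining side of this bridge, a topic model can be fitted over a document collection in a few lines; a scikit-learn sketch (corpus and topic count purely illustrative):

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        docs = ["the parliament debated the budget law",
                "the court ruled on the new election law",
                "the team won the championship game",
                "the player scored late in the final game"]

        # Bag-of-words counts feeding a 2-topic LDA model; real studies use
        # large corpora and choose the topic count by model selection.
        vec = CountVectorizer(stop_words="english")
        X = vec.fit_transform(docs)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
        terms = vec.get_feature_names_out()
        for t, weights in enumerate(lda.components_):
            print(t, [terms[i] for i in weights.argsort()[-3:][::-1]])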

  12. Introduction to Numerical Computation - analysis and Matlab illustrations

    DEFF Research Database (Denmark)

    Elden, Lars; Wittmeyer-Koch, Linde; Nielsen, Hans Bruun

    In a modern programming environment like, e.g., MATLAB it is possible by simple commands to perform advanced calculations on a personal computer. In order to use such a powerful tool efficiently it is necessary to have an overview of available numerical methods and algorithms and to know about... are illustrated by examples in MATLAB.

  13. Sensitivity analysis of airport noise using computer simulation

    Directory of Open Access Journals (Sweden)

    Flavio Maldonado Bentes

    2011-09-01

    This paper presents a method to analyze the sensitivity of airport noise using computer simulation with the aid of the Integrated Noise Model 7.0. The technique supports the selection of alternatives for better control of aircraft noise, since it helps identify which areas of the noise curves experienced greater variation from changes in aircraft movements at a particular airport.

  14. Comparative Analysis of Palm and Wearable Computers for Participatory Simulations

    Science.gov (United States)

    Klopfer, Eric; Yoon, Susan; Rivas, Luz

    2004-01-01

    Recent educational computer-based technologies have offered promising lines of research that promote social constructivist learning goals, develop skills required to operate in a knowledge-based economy (Roschelle et al. 2000), and enable more authentic science-like problem-solving. In our research programme, we have been interested in combining…

  15. A Knowledge-Based Analysis of Global Function Computation

    CERN Document Server

    Halpern, Joseph Y

    2007-01-01

    Consider a distributed system N in which each agent has an input value and each communication link has a weight. Given a global function, that is, a function f whose value depends on the whole network, the goal is for every agent to eventually compute the value f(N). We call this problem global function computation. Various solutions for instances of this problem, such as Boolean function computation, leader election, (minimum) spanning tree construction, and network determination, have been proposed, each under particular assumptions about what processors know about the system and how this knowledge can be acquired. We give a necessary and sufficient condition for the problem to be solvable that generalizes a number of well-known results. We then provide a knowledge-based (kb) program (like those of Fagin, Halpern, Moses, and Vardi) that solves global function computation whenever possible. Finally, we improve the message overhead inherent in our initial kb program by giving a counterfactual belief-based pro...

  16. Computer-aided analysis of grain growth in metals

    DEFF Research Database (Denmark)

    Klimanek, P.; May, C.; Richter, H.

    1993-01-01

    Isothermal grain growth in aluminium, copper and alpha-iron was investigated experimentally at elevated temperatures and quantitatively interpreted by computer simulation on the base of a statistical model described in [4,5,6]. As it is demonstrated for the grain growth kinetics, the experimental...

  17. DIF Analysis for Pretest Items in Computer-Adaptive Testing.

    Science.gov (United States)

    Zwick, Rebecca; And Others

    A simulation study of methods of assessing differential item functioning (DIF) in computer-adaptive tests (CATs) was conducted by Zwick, Thayer, and Wingersky (in press, 1993). Results showed that modified versions of the Mantel-Haenszel and standardization methods work well with CAT data. DIF methods were also investigated for nonadaptive…
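
    For orientation, the Mantel-Haenszel procedure referenced here accumulates a common odds ratio over matched score strata; a minimal Python sketch (data layout assumed, with the conventional ETS delta rescaling for reporting):

        import numpy as np

        def mantel_haenszel_dif(correct, group, score):
            # correct: 0/1 item responses; group: 0=reference, 1=focal;
            # score: matching criterion (e.g. total test score).
            num, den = 0.0, 0.0
            for s in np.unique(score):
                m = score == s
                A = np.sum(m & (group == 0) & (correct == 1))  # ref right
                B = np.sum(m & (group == 0) & (correct == 0))  # ref wrong
                C = np.sum(m & (group == 1) & (correct == 1))  # focal right
                D = np.sum(m & (group == 1) & (correct == 0))  # focal wrong
                T = A + B + C + D
                if T == 0:
                    continue
                num += A * D / T
                den += B * C / T
            or_mh = num / den if den else float("nan")
            # ETS delta scale: negative values flag DIF against the focal group.
            return -2.35 * np.log(or_mh)

        # Example: one item, two groups matched on a two-level score.
        correct = np.array([1, 0, 1, 1, 0, 1, 0, 0])
        group = np.array([0, 0, 0, 0, 1, 1, 1, 1])
        score = np.array([1, 1, 2, 2, 1, 1, 2, 2])
        print(mantel_haenszel_dif(correct, group, score))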

  18. An Analysis of Attitudes toward Computer Networks and Internet Addiction.

    Science.gov (United States)

    Tsai, Chin-Chung; Lin, Sunny S. J.

    The purpose of this study was to explore the interplay between young people's attitudes toward computer networks and Internet addiction. After analyzing questionnaire responses of an initial sample of 615 Taiwanese high school students, 78 subjects, viewed as possible Internet addicts, were selected for further explorations. It was found that…

  19. CAI Course Ware and the Basic Acounting Instruction Innovations%CAI课件与基础会计学教学创新

    Institute of Scientific and Technical Information of China (English)

    苏淑欢

    2002-01-01

    With the rapid development of computer technology, CAI (Computer Assisted Instruction) has become an important means of distance education and an emerging research topic. The ever-increasing power of computer hardware provides a high-speed, high-capacity, low-failure hardware environment with convenient communications for designing instructional courseware, and powerful supporting software platforms of various kinds provide an ideal design environment for high-quality CAI courseware. With the help of computer technology, practical operations that are hard to convey in the classroom with only a blackboard and a piece of chalk can be reproduced through simulation. CAI courseware is a brand-new mode of accounting instruction. This paper mainly describes the design concept and principles of the Basic Accounting CAI courseware developed by Guangzhou Radio and Television University, and several insights gained from its development.

  20. Ultrastructural Analysis of Urinary Stones by Microfocus Computed Tomography and Comparison with Chemical Analysis

    Directory of Open Access Journals (Sweden)

    Tolga Karakan

    2016-06-01

    Objective: To investigate the ultrastructure of urinary system stones using micro-focus computed tomography (MCT), which performs non-destructive analysis, and to compare the results with wet chemical analysis. Methods: This study was carried out at the Ankara Training and Research Hospital. Renal stones removed from 30 patients during percutaneous nephrolithotomy (PNL) surgery were included in the study. The stones were evaluated blindly by the specialists with MCT and chemical analysis. Results: The comparison of the stone components between chemical analysis and MCT showed that the rate of agreement was very low (p < 0.05). It was also seen that there was no significant relation to the 3D structure being heterogeneous or homogeneous. Conclusion: Stone analysis with MCT is a time-consuming and costly method. The method is useful for understanding the mechanisms of stone formation and is an important guide for developing future treatment modalities.

  1. Intramural optical mapping of V(m) and Ca(i)2+ during long-duration ventricular fibrillation in canine hearts.

    Science.gov (United States)

    Kong, Wei; Ideker, Raymond E; Fast, Vladimir G

    2012-03-15

    Intramural gradients of intracellular Ca(2+) (Ca(i)(2+)), Ca(i)(2+) handling, Ca(i)(2+) oscillations, and Ca(i)(2+) transient (CaT) alternans may be important in long-duration ventricular fibrillation (LDVF). However, previous studies of Ca(i)(2+) handling have been limited to recordings from the heart surface during short-duration ventricular fibrillation. To examine whether abnormalities of intramural Ca(i)(2+) handling contribute to LDVF, we measured membrane voltage (V(m)) and Ca(i)(2+) during pacing and LDVF in six perfused canine hearts using five eight-fiber optrodes. Measurements were grouped into epicardial, midwall, and endocardial layers. We found that during pacing at 350-ms cycle length, CaT duration was slightly longer (by ≃10%) in endocardial layers than in epicardial layers, whereas action potential duration (APD) exhibited no difference. Rapid pacing at 150-ms cycle length caused alternans in both APD (APD-ALT) and CaT amplitude (CaA-ALT) without significant transmural differences. For 93% of optrode recordings, CaA-ALT was transmurally concordant, whereas APD-ALT was either concordant (36%) or discordant (54%), suggesting that APD-ALT was not caused by CaA-ALT. During LDVF, V(m) and Ca(i)(2+) progressively desynchronized when not every action potential was followed by a CaT. Such desynchronization developed faster in the epicardium than in the other layers. In addition, CaT duration strongly increased (by ∼240% at 5 min of LDVF), whereas APD shortened (by ∼17%). CaT rises always followed V(m) upstrokes during pacing and LDVF. In conclusion, the fact that V(m) upstrokes always preceded CaTs indicates that spontaneous Ca(i)(2+) oscillations in the working myocardium were not likely the reason for LDVF maintenance. Strong V(m)-Ca(i)(2+) desynchronization and the occurrence of long CaTs during LDVF indicate severely impaired Ca(i)(2+) handling and may potentially contribute to LDVF maintenance.

  2. Computational solutions to large-scale data management and analysis.

    Science.gov (United States)

    Schadt, Eric E; Linderman, Michael D; Sorenson, Jon; Lee, Lawrence; Nolan, Garry P

    2010-09-01

    Today we can generate hundreds of gigabases of DNA and RNA sequencing data in a week for less than US$5,000. The astonishing rate of data generation by these low-cost, high-throughput technologies in genomics is being matched by that of other technologies, such as real-time imaging and mass spectrometry-based flow cytometry. Success in the life sciences will depend on our ability to properly interpret the large-scale, high-dimensional data sets that are generated by these technologies, which in turn requires us to adopt advances in informatics. Here we discuss how we can master the different types of computational environments that exist - such as cloud and heterogeneous computing - to successfully tackle our big data problems.

  3. Computational design and analysis of flatback airfoil wind tunnel experiment.

    Energy Technology Data Exchange (ETDEWEB)

    Mayda, Edward A. (University of California, Davis, CA); van Dam, C.P. (University of California, Davis, CA); Chao, David D. (University of California, Davis, CA); Berg, Dale E.

    2008-03-01

    A computational fluid dynamics study of thick wind turbine section shapes in the test section of the UC Davis wind tunnel at a chord Reynolds number of one million is presented. The goals of this study are to validate standard wind tunnel wall corrections for high solid blockage conditions and to reaffirm the favorable effect of a blunt trailing edge or flatback on the performance characteristics of a representative thick airfoil shape prior to building the wind tunnel models and conducting the experiment. The numerical simulations prove the standard wind tunnel corrections to be largely valid for the proposed test of 40% maximum thickness to chord ratio airfoils at a solid blockage ratio of 10%. Comparison of the computed lift characteristics of a sharp trailing edge baseline airfoil and derived flatback airfoils reaffirms the earlier observed trend of reduced sensitivity to surface contamination with increasing trailing edge thickness.

  4. Indications for quantum computation requirements from comparative brain analysis

    Science.gov (United States)

    Bernroider, Gustav; Baer, Wolfgang

    2010-04-01

    Whether or not neuronal signal properties can engage 'non-trivial', i.e. functionally significant, quantum properties is the subject of an ongoing debate. Here we provide evidence that quantum coherence dynamics can play a functional role in ion conduction mechanisms, with consequences for the shape and associative character of classical membrane signals. In particular, these new perspectives predict that a specific neuronal topology (e.g. the connectivity pattern of cortical columns in the primate brain) is less important, and not strictly required, to explain abilities in perception and sensory-motor integration. Instead, the evidence suggests a decisive role for the number and functional segregation of ion channel proteins that can be engaged in a particular neuronal constellation. We provide evidence from comparative brain studies and estimates of the computational capacity behind visual flight functions that suggest a possible role of quantum computation in biological systems.

  5. QUANTUM FOG CLOUD MODEL IN INTERNET OF THINGS WITH ANALYSIS OF GREEN COMPUTING

    OpenAIRE

    Sayantan Gupta

    2017-01-01

    The technology of Quantum Green Computing is discussed in this paper, together with the need for various implementation techniques and approaches in relation to Fog-Cloud Computing. Moreover, we would like to introduce the latest algorithms, such as the Stack Algorithm and the Address Algorithm, among many others, which will help in the analysis of Green-Quantum Computing Technology in modern society and could create a technological revolution. With the Internet of Things rising in the modern wo...

  6. Formal Specification and Analysis of Cloud Computing Management

    Science.gov (United States)

    2012-01-24

    Master's thesis (supervised by Prof. Dr. Alexander Knapp and Prof. Dr. José Meseguer; submitted 24 January 2012). Only fragments of the text are preserved in this record, among them: "Deduction in rewriting logic consists of the concurrent application of the rewriting rules in R modulo the equations in E ∪ A."

  7. Analysis of Multilayered Printed Circuit Boards using Computed Tomography

    Science.gov (United States)

    2014-05-01

    Only figure captions and front matter are preserved in this record: thin-film resistors are harder to identify because carbon sits lower in the periodic table than aluminium oxide; one sample is encapsulated in a steel housing; Sample 2 shows a colour bar (Appendix D.2). Abbreviations: Al2O3, aluminium oxide; BGA, ball grid array; coronal, XZ plane; COTS, commercial off the shelf; C, carbon; CT, computed tomography; DC, direct current.

  8. Computational Modeling and Analysis of Mechanically Painful Stimulations

    DEFF Research Database (Denmark)

    Manafi Khanian, Bahram

    ...to expand the current knowledge on the mechanical influences of cuff algometry on deep-tissue nociceptors. Additionally, this is one of the pioneering projects utilizing finite element simulation as a computationally reliable modelling method in the pain research field. The present findings are highly relevant to biomechanical studies for defining a valid methodology to appropriately activate deep-tissue nociceptors and hence to develop biomedical devices used for pain sensitivity assessment.

  9. Analysis of Computer Experiments with Multiple Noise Sources

    DEFF Research Database (Denmark)

    Dehlendorff, Christian; Kulahci, Murat; Andersen, Klaus Kaae

    2010-01-01

    In this paper we present a modeling framework for analyzing computer models with two types of variations. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled effectively with linear mixed effects models and generalized additive models. Copyright (C) 2009 John Wiley & Sons, Ltd.
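
    The two-variance structure (controllable vs. uncontrollable factors) maps naturally onto a linear mixed model; a minimal statsmodels sketch with simulated data, in which a random intercept per simulation run carries the uncontrollable variation (variable names invented):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n_settings, n_reps = 20, 10
        x = np.repeat(rng.uniform(0, 1, n_settings), n_reps)  # controllable
        run = np.repeat(np.arange(n_settings), n_reps)        # run id
        run_effect = np.repeat(rng.normal(0, 0.5, n_settings), n_reps)
        y = 2.0 + 3.0 * x + run_effect + rng.normal(0, 1.0, x.size)
        df = pd.DataFrame({"y": y, "x": x, "run": run})

        # Random intercept per run captures the uncontrollable (between-run)
        # variation; the residual captures within-run noise.
        model = smf.mixedlm("y ~ x", df, groups=df["run"]).fit()
        print(model.summary())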

  10. Computer Simulation of Technetium Scrubbing Section of Purex Ⅰ: Computer Simulation and Technical Parameter Analysis

    Institute of Scientific and Technical Information of China (English)

    CHEN; Yan-xin; HE; Hui; ZHANG; Chun-long; CHANG; Li; LI; Rui-xue; TANG; Hong-bin; YU; Ting

    2012-01-01

    A computer program was developed to simulate the technetium scrubbing section (TcS) of Purex based on the theory of cascade extraction. The program can simulate the steady-state behavior of HNO3, U, Pu and Tc in the TcS. The reliability of the program was verified by a cascade extraction experiment; the relative error between calculated and experimental values is about 10%, except at a few points. The comparison between experimental and calculated results is illustrated in Fig. 1. The technical parameters of the TcS were analyzed with this program; it was found that the decontamination factor (DFTc/U) in the TcS is markedly affected by the overall consumption (molarity multiplied by volume flux) of HNO3. DFTc/U is
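
    The cascade-extraction theory such a program rests on can be sketched as counter-current stagewise mass balances iterated to steady state; the constant distribution ratio and all numbers below are hypothetical placeholders, not Purex chemistry:

        def cascade_steady_state(n_stages, D, feed_aq, scrub_org,
                                 aq_flow=1.0, org_flow=1.0, n_iter=2000):
            # Counter-current cascade: aqueous flows stage 1 -> n, organic
            # flows stage n -> 1. D = organic/aqueous ratio at equilibrium.
            aq = [0.0] * n_stages
            org = [0.0] * n_stages
            for _ in range(n_iter):
                for i in range(n_stages):
                    aq_in = feed_aq if i == 0 else aq[i - 1]
                    org_in = scrub_org if i == n_stages - 1 else org[i + 1]
                    # Total solute entering stage i, split by equilibrium
                    # (org = D * aq) under the stage mass balance.
                    total = aq_flow * aq_in + org_flow * org_in
                    aq[i] = total / (aq_flow + org_flow * D)
                    org[i] = D * aq[i]
            return aq, org

        # Toy run: 8 stages, solute strongly favoring the organic phase.
        aq, org = cascade_steady_state(8, D=5.0, feed_aq=1.0, scrub_org=0.0)
        print(["%.3g" % c for c in aq])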

  11. Computer-simulated experiments and computer games: a method of design analysis

    Directory of Open Access Journals (Sweden)

    Jerome J. Leary

    1995-12-01

    Full Text Available Through the new modularization of the undergraduate science degree at the University of Brighton, larger numbers of students are choosing to take some science modules which include an amount of laboratory practical work. Indeed, within energy studies, the fuels and combustion module, for which the computer simulations were written, has seen a fourfold increase in student numbers from twelve to around fifty. Fitting out additional laboratories with new equipment to accommodate this increase presented problems: the laboratory space did not exist; fitting out the laboratories with new equipment would involve a relatively large capital spend per student for equipment that would be used infrequently; and, because some of the experiments use inflammable liquids and gases, additional staff would be needed for laboratory supervision.

  12. The Effect of Prior Experience with Computers, Statistical Self-Efficacy, and Computer Anxiety on Students' Achievement in an Introductory Statistics Course: A Partial Least Squares Path Analysis

    Science.gov (United States)

    Abd-El-Fattah, Sabry M.

    2005-01-01

    A Partial Least Squares Path Analysis technique was used to test the effect of students' prior experience with computers, statistical self-efficacy, and computer anxiety on their achievement in an introductory statistics course. Computer Anxiety Rating Scale and Current Statistics Self-Efficacy Scale were administered to a sample of 64 first-year…

  13. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications, offering a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  14. Structural Analysis: Shape Information via Points-To Computation

    CERN Document Server

    Marron, Mark

    2012-01-01

    This paper introduces a new hybrid memory analysis, Structural Analysis, which combines an expressive shape-analysis-style abstract domain with efficient and simple points-to-style transfer functions. Using data from empirical studies on runtime heap structures and the programmatic idioms used in modern object-oriented languages, we construct a heap analysis with the following characteristics: (1) it can express a rich set of structural, shape, and sharing properties which are not provided by a classic points-to analysis and which are useful for optimization and error-detection applications; (2) it uses efficient, weakly-updating, set-based transfer functions which enable the analysis to be more robust and scalable than a shape analysis; and (3) it can be used as the basis for a scalable interprocedural analysis that produces precise results in practice. The analysis has been implemented for .Net bytecode and using this implementation we evaluate both the runtime cost and the precision of the results on a num...

  15. The Use of Video Disks: Computer Based Analysis of Works of Art.

    Science.gov (United States)

    McWhinnie, Harold J.

    This paper presents research using a computer with a video disk player to do aesthetic analysis of the work of Vincent Van Gogh. A discussion of the video disk system, and of several software systems including: (1) Dr. Halo, (2) Handy, (3) PC-Paint, and (4) Pilot are outlined. Several possible uses of the computer with interactive video disks for…

  16. Fluid Centrality: A Social Network Analysis of Social-Technical Relations in Computer-Mediated Communication

    Science.gov (United States)

    Enriquez, Judith Guevarra

    2010-01-01

    In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…

  17. Using Computation Curriculum-Based Measurement Probes for Error Pattern Analysis

    Science.gov (United States)

    Dennis, Minyi Shih; Calhoon, Mary Beth; Olson, Christopher L.; Williams, Cara

    2014-01-01

    This article describes how "curriculum-based measurement--computation" (CBM-C) mathematics probes can be used in combination with "error pattern analysis" (EPA) to pinpoint difficulties in basic computation skills for students who struggle with learning mathematics. Both assessment procedures provide ongoing assessment data…

  18. Proceedings: Workshop on Advanced Mathematics and Computer Science for Power Systems Analysis

    Energy Technology Data Exchange (ETDEWEB)

    None

    1991-08-01

    EPRI's Office of Exploratory Research sponsors a series of workshops that explore how to apply recent advances in mathematics and computer science to the problems of the electric utility industry. In this workshop, participants identified research objectives that may significantly improve the mathematical methods and computer architecture currently used for power system analysis.

  19. A Pilot Meta-Analysis of Computer-Based Scaffolding in STEM Education

    Science.gov (United States)

    Belland, Brian R.; Walker, Andrew E.; Olsen, Megan Whitney; Leary, Heather

    2015-01-01

    This paper employs meta-analysis to determine the influence of computer-based scaffolding characteristics and study and test score quality on cognitive outcomes in science, technology, engineering, and mathematics education at the secondary, college, graduate, and adult levels. Results indicate that (a) computer-based scaffolding positively…

  20. A Meta-Analysis of Effectiveness Studies on Computer Technology-Supported Language Learning

    Science.gov (United States)

    Grgurovic, Maja; Chapelle, Carol A.; Shelley, Mack C.

    2013-01-01

    With the aim of summarizing years of research comparing pedagogies for second/foreign language teaching supported with computer technology and pedagogy not-supported by computer technology, a meta-analysis was conducted of empirical research investigating language outcomes. Thirty-seven studies yielding 52 effect sizes were included, following a…

  1. Computer Aided Mass Balance Analysis for AC Electric Arc Furnace Steelmaking

    Institute of Scientific and Technical Information of China (English)

    Ünal Camdali; Murat Tunc

    2005-01-01

    A mass balance analysis was undertaken for liquid steel production using a computer program specially developed for the AC electric arc furnace at an important alloy steel producer in Turkey. The data obtained by using the computer program were found to be very close to the actual production ones.
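
    The core of such an analysis is a closure check over the input and output streams of a heat. A minimal sketch follows; the stream names and tonnages are purely illustrative, not the plant data from the paper.

      # Mass balance closure for one (invented) electric arc furnace heat.
      inputs = {"scrap": 92.0, "ferroalloys": 1.5, "lime": 3.0,
                "carbon": 0.8, "oxygen": 4.2}                # tonnes
      outputs = {"liquid_steel": 88.5, "slag": 8.6,
                 "dust": 1.2, "offgas": 3.2}                 # tonnes

      mass_in = sum(inputs.values())
      mass_out = sum(outputs.values())
      closure = 100.0 * (mass_in - mass_out) / mass_in
      print(f"in = {mass_in:.1f} t, out = {mass_out:.1f} t, "
            f"closure error = {closure:+.2f}%")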

  2. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  3. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  4. CAR : A MATLAB Package to Compute Correspondence Analysis with Rotations

    NARCIS (Netherlands)

    Lorenzo-Seva, Urbano; van de Velden, Michel; Kiers, Henk A.L.

    2009-01-01

    Correspondence analysis (CA) is a popular method that can be used to analyse relationships between categorical variables. Like principal component analysis, CA solutions can be rotated both orthogonally and obliquely to simple structure without affecting the total amount of explained inertia. We des
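
    For context, a plain (unrotated) CA solution can be computed from the singular value decomposition of the matrix of standardized residuals; a package such as CAR then rotates coordinates like these. A minimal sketch with an invented contingency table:

      import numpy as np

      X = np.array([[20, 10,  5],
                    [ 5, 15, 10],
                    [10,  5, 20]], dtype=float)  # illustrative counts
      P = X / X.sum()
      r, c = P.sum(axis=1), P.sum(axis=0)        # row and column masses
      S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # std. residuals
      U, sv, Vt = np.linalg.svd(S, full_matrices=False)

      row_coords = (U * sv) / np.sqrt(r)[:, None]     # principal coordinates
      col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]
      inertia = sv**2
      print("explained inertia per dimension:", inertia / inertia.sum())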

  5. Method for Determining Language Objectives and Criteria. Volume II. Methodological Tools: Computer Analysis, Data Collection Instruments.

    Science.gov (United States)

    1979-05-25

    This volume presents (1) methods for computer and hand analysis of numerical language performance data (with examples), and (2) samples of the interview, observation, and survey instruments used in collecting language data. (Author)

  6. Computational Proteomics: High-throughput Analysis for Systems Biology

    Energy Technology Data Exchange (ETDEWEB)

    Cannon, William R.; Webb-Robertson, Bobbie-Jo M.

    2007-01-03

    High-throughput (HTP) proteomics is a rapidly developing field that offers the global profiling of proteins from a biological system. The HTP technological advances are fueling a revolution in biology, enabling analyses at the scales of entire systems (e.g., whole cells, tumors, or environmental communities). However, simply identifying the proteins in a cell is insufficient for understanding the underlying complexity and operating mechanisms of the overall system. Systems level investigations are relying more and more on computational analyses, especially in the field of proteomics generating large-scale global data.

  7. Computer-aided analysis of nonlinear problems in transport phenomena

    Science.gov (United States)

    Brown, R. A.; Scriven, L. E.; Silliman, W. J.

    1980-01-01

    The paper describes algorithms for equilibrium and steady-state problems with coefficients in the expansions derived by the Galerkin weighted residual method and calculated from the resulting sets of nonlinear algebraic equations by the Newton-Raphson method. Initial approximations are obtained from nearby solutions by continuation techniques as parameters are varied. The Newton-Raphson technique is preferred because the Jacobian of the solution is useful for continuation, for analyzing the stability of solutions, for detecting bifurcation of solution families, and for computing asymptotic estimates of the effects on any solution of small changes in parameters, boundary conditions, and boundary shape.
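
    The continuation idea is easy to show on a scalar model problem: solve f(u, λ) = 0 by Newton-Raphson for each value of the parameter λ, starting each solve from the previous converged solution. The residual below is illustrative, not one of the paper's transport problems.

      import numpy as np

      f = lambda u, lam: u**3 - u - lam      # illustrative nonlinear residual
      dfdu = lambda u, lam: 3 * u**2 - 1     # its Jacobian (scalar case)

      u = 1.2                                # known solution on one branch
      for lam in np.linspace(0.0, 1.0, 11):  # continuation in the parameter
          for _ in range(50):                # Newton-Raphson iteration
              step = f(u, lam) / dfdu(u, lam)
              u -= step
              if abs(step) < 1e-12:
                  break
          print(f"lam = {lam:.1f}  u = {u:.6f}")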

  8. Routing performance analysis and optimization within a massively parallel computer

    Science.gov (United States)

    Archer, Charles Jens; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen

    2013-04-16

    An apparatus, program product and method optimize the operation of a massively parallel computer system by, in part, receiving actual performance data concerning an application executed by the plurality of interconnected nodes, and analyzing the actual performance data to identify an actual performance pattern. A desired performance pattern may be determined for the application, and an algorithm may be selected from among a plurality of algorithms stored within a memory, the algorithm being configured to achieve the desired performance pattern based on the actual performance data.

  9. Computer science research in Malaysia: a bibliometric analysis

    OpenAIRE

    Bakri, A; Willett, P.

    2011-01-01

    Purpose – The purpose of this paper is to analyse the publications of, and the citations to, the current staff of 19 departments of computer science in Malaysian universities, and to compare these bibliometric data with expert peer reviews of Malaysian research performance. Design/methodology/approach – This paper searches the Scopus and Web of Science citation databases. Findings – Both publication and citation rates are low, although this is at least in part due to some M...

  10. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets....... It describes some of the requirements which these tools must fulfil, in order to support the user in a natural and effective way. Finally some references are given to papers which describe examples of existing Petri net tools....

  11. Analysis of Computer Science Communities Based on DBLP

    CERN Document Server

    Biryukov, Maria; 10.1007/978-3-642-15464-5_24

    2010-01-01

    It is popular nowadays to bring techniques from bibliometrics and scientometrics into the world of digital libraries to analyze collaboration patterns and explore mechanisms which underlie community development. In this paper we use the DBLP data to investigate authors' scientific careers and provide an in-depth exploration of some of the computer science communities. We compare them in terms of productivity, population stability and collaboration trends. Besides, we use these features to compare the sets of top-ranked conferences with their lower-ranked counterparts.

  12. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  13. A Review on Liquid Spray Models for Diesel Engine Computational Analysis

    Science.gov (United States)

    2014-05-01

    ...used in order to establish hollow-cone sprays, as is typical for gasoline direct-injection engines; these kinds of sprays are characterized by high... (A Review on Liquid Spray Models for Diesel Engine Computational Analysis, Luis Bravo and Chol-Bum Kweon, Vehicle Technology Directorate, ARL, report ARL-TR-6932, May 2014)

  14. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    Science.gov (United States)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  15. A handheld computer-aided diagnosis system and simulated analysis

    Science.gov (United States)

    Su, Mingjian; Zhang, Xuejun; Liu, Brent; Su, Kening; Louie, Ryan

    2016-03-01

    This paper describes a Computer Aided Diagnosis (CAD) system based on a cellphone and a distributed cluster. One of the bottlenecks in building a CAD system for clinical practice is the storage and processing of mass pathology samples freely among different devices, and normal pattern-matching algorithms on large-scale image sets are very time consuming. Distributed computation on a cluster has demonstrated the ability to relieve this bottleneck. We develop a system enabling the user to compare a mass image to a dataset with a feature table by sending datasets to a Generic Data Handler Module in Hadoop, where the pattern recognition is undertaken for the detection of skin diseases. A single and a combined retrieval algorithm applied to the data pipeline, based on the MapReduce framework, are used in our system in order to make an optimal choice between recognition accuracy and system cost. The profile of the lesion area is drawn manually by doctors on the screen, who then upload this pattern to the server. In our evaluation experiment, a diagnosis hit rate of 75% was obtained by testing 100 patients with skin illness. Our system has the potential to help in building a novel medical image dataset by collecting large amounts of gold-standard samples during medical diagnosis. Once the project is online, participants are free to join, and an abundant sample dataset will soon be gathered, sufficient for learning. These results demonstrate that our technology is very promising and is expected to be used in clinical practice.

  16. Quantitative analysis of cholesteatoma using high resolution computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Shigeru; Yamasoba, Tatsuya (Kameda General Hospital, Chiba (Japan)); Iinuma, Toshitaka

    1992-05-01

    Seventy-three cases of adult cholesteatoma, including 52 cases of pars flaccida type cholesteatoma and 21 of pars tensa type cholesteatoma, were examined using high resolution computed tomography, in both axial (lateral semicircular canal plane) and coronal sections (cochlear, vestibular and antral plane). These cases were classified into two subtypes according to the presence of extension of cholesteatoma into the antrum. Sixty cases of chronic otitis media with central perforation (COM) were also examined as controls. Various locations of the middle ear cavity were measured in terms of size in comparison with pars flaccida type cholesteatoma, pars tensa type cholesteatoma and COM. The width of the attic was significantly larger in both pars flaccida type and pars tensa type cholesteatoma than in COM. With pars flaccida type cholesteatoma there was a significantly larger distance between the malleus and lateral wall of the attic than with COM. In contrast, the distance between the malleus and medial wall of the attic was significantly larger with pars tensa type cholesteatoma than with COM. With cholesteatoma extending into the antrum, regardless of the type of cholesteatoma, there were significantly larger distances than with COM at the following sites: the width and height of the aditus ad antrum, and the width, height and anterior-posterior diameter of the antrum. However, these distances were not significantly different between cholesteatoma without extension into the antrum and COM. The hitherto demonstrated qualitative impressions of bone destruction in cholesteatoma were quantitatively verified in detail using high resolution computed tomography. (author).

  17. An Exploratory Analysis of Computer Mediated Communications on Cyberstalking Severity

    Directory of Open Access Journals (Sweden)

    Stephen D. Barnes

    2007-09-01

    Full Text Available The interaction between disjunctive interpersonal relationships, those where the parties to the relationship disagree on the goals of the relationship, and the use of computer-mediated communications channels is a relatively unexplored domain. Bargh (2002) suggests that CMC channels can amplify the development of interpersonal relationships, and notes that the effect is not constant across communications activities. This proposal suggests a line of research that explores the interaction between computer-mediated communications (CMC) and stalking, which is a common form of disjunctive relationship. Field data from cyberstalking cases will be used to look at the effects of CMC channels on stalking case severity and to explore the relative impacts of CMC channel characteristics on such cases. To accomplish this, a ratio-scaled measure of stalking case severity is proposed for use in exploring the relationship between case severity and CMC media characteristics, anonymity, and the prior relationship between the stalker and the victim. Expected results are identified, and follow-up research is proposed.

  18. Computational aspects of sensitivity calculations in transient structural analysis

    Science.gov (United States)

    Greene, William H.; Haftka, Raphael T.

    1988-01-01

    A key step in the application of formal automated design techniques to structures under transient loading is the calculation of sensitivities of response quantities to the design parameters. This paper considers structures with general forms of damping acted on by general transient loading and addresses issues of computational errors and computational efficiency. The equations of motion are reduced using the traditional basis of vibration modes and then integrated using a highly accurate, explicit integration technique. A critical point constraint formulation is used to place constraints on the magnitude of each response quantity as a function of time. Three different techniques for calculating sensitivities of the critical point constraints are presented. The first two are based on the straightforward application of the forward and central difference operators, respectively. The third is based on explicit differentiation of the equations of motion. Condition errors, finite difference truncation errors, and modal convergence errors for the three techniques are compared by applying them to a simple five-span-beam problem. Sensitivity results are presented for two different transient loading conditions and for both damped and undamped cases.
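
    The first two differencing techniques can be contrasted on a toy response function (a stand-in for a critical-point response; the function, parameter and step size are invented):

      import numpy as np

      def peak_response(k):                  # stand-in response quantity
          return np.exp(-0.05 * k) * np.sin(k)

      k0, h = 2.0, 1e-5
      exact = np.exp(-0.05 * k0) * (np.cos(k0) - 0.05 * np.sin(k0))

      fwd = (peak_response(k0 + h) - peak_response(k0)) / h
      ctr = (peak_response(k0 + h) - peak_response(k0 - h)) / (2 * h)
      print(f"forward-difference error: {abs(fwd - exact):.2e}")  # O(h)
      print(f"central-difference error: {abs(ctr - exact):.2e}")  # O(h^2)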

  19. Inferring Group Processes from Computer-Mediated Affective Text Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, Jack C [ORNL; Begoli, Edmon [ORNL; Jose, Ajith [Missouri University of Science and Technology; Griffin, Christopher [Pennsylvania State University

    2011-02-01

    Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.

  20. Propulsion Test Support Analysis with GPU Computing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The design, development and staging of tests to certify liquid rocket engines usually require high-fidelity structural, fluid and thermal support analysis. These...

  1. Flexible Launch Vehicle Stability Analysis Using Steady and Unsteady Computational Fluid Dynamics

    Science.gov (United States)

    Bartels, Robert E.

    2012-01-01

    Launch vehicles frequently experience a reduced stability margin through the transonic Mach number range. This reduced stability margin can be caused by the aerodynamic undamping of one of the lower-frequency flexible or rigid-body modes. Analysis of the behavior of a flexible vehicle is routinely performed with quasi-steady aerodynamic line loads derived from steady rigid aerodynamics. However, a quasi-steady aeroelastic stability analysis can be unconservative at the critical Mach numbers, where experiment or unsteady computational aeroelastic analysis shows a reduced or even negative aerodynamic damping. A method of enhancing the quasi-steady aeroelastic stability analysis of a launch vehicle with unsteady aerodynamics is developed that uses unsteady computational fluid dynamics to compute the response of selected lower-frequency modes. The response is contained in a time history of the vehicle line loads. A proper orthogonal decomposition of the unsteady aerodynamic line-load response is used to reduce the scale of the data volume, and system identification is used to derive the aerodynamic stiffness, damping, and mass matrices. The results are compared with the damping and frequency computed from unsteady computational aeroelasticity and from a quasi-steady analysis. The results show that incorporating unsteady aerodynamics in this way brings the enhanced quasi-steady aeroelastic stability analysis into close agreement with the unsteady computational aeroelastic results.
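
    The proper orthogonal decomposition step amounts to a singular value decomposition of the mean-subtracted snapshot matrix of line loads. A minimal sketch on synthetic data (the load field and mode count are invented):

      import numpy as np

      rng = np.random.default_rng(1)
      x = np.linspace(0.0, 1.0, 200)         # stations along the vehicle
      t = np.linspace(0.0, 10.0, 500)        # time samples
      loads = (np.outer(np.sin(2 * np.pi * x), np.sin(5 * t))
               + 0.3 * np.outer(np.sin(4 * np.pi * x), np.cos(9 * t))
               + 0.01 * rng.standard_normal((200, 500)))

      fluct = loads - loads.mean(axis=1, keepdims=True)
      U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
      energy = np.cumsum(s**2) / np.sum(s**2)
      n_modes = int(np.searchsorted(energy, 0.99)) + 1
      print(f"{n_modes} POD modes capture 99% of the fluctuation energy")
      basis = U[:, :n_modes]                 # spatial modes for system ID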

  2. Computational Intelligence Techniques for Electro-Physiological Data Analysis

    OpenAIRE

    Riera Sardà, Alexandre

    2012-01-01

    This work contains the efforts I have made in recent years in the field of electrophysiological data analysis. Most of the work has been done at Starlab Barcelona S.L. and part of it at the Neurodynamics Laboratory of the Department of Psychiatry and Clinical Psychobiology of the University of Barcelona. The main work deals with the analysis of electroencephalography (EEG) signals, although other signals, such as electrocardiography (ECG), electrooculography (EOG) and electromyography (EMG) ...

  3. Linear static structural and vibration analysis on high-performance computers

    Science.gov (United States)

    Baddourah, M. A.; Storaasli, O. O.; Bostic, S. W.

    1993-01-01

    Parallel computers offer the opportunity to significantly reduce the computation time necessary to analyze large-scale aerospace structures. This paper presents algorithms developed for and implemented on massively parallel computers, hereafter referred to as Scalable High-Performance Computers (SHPC), for the most computationally intensive tasks involved in structural analysis, namely, generation and assembly of system matrices, solution of systems of equations, and calculation of the eigenvalues and eigenvectors. Results on SHPC are presented for large-scale structural problems (i.e., models for the High-Speed Civil Transport). The goal of this research is to develop a new, efficient technique which extends structural analysis to SHPC and makes large-scale structural analyses tractable.

  4. Computational analysis of the flow field downstream of flow conditioners

    Energy Technology Data Exchange (ETDEWEB)

    Erdal, Asbjoern

    1997-12-31

    Technological innovations are essential for maintaining the competitiveness of gas companies, and metering technology is one important area. This thesis shows that computational fluid dynamics techniques can be a valuable tool for examination of several parameters that may affect the performance of a flow conditioner (FC). Previous design methods, such as screen theory, could not provide a fundamental understanding of how an FC works. The thesis shows, among other things, that the flow pattern through a complex geometry, like a 19-hole plate FC, can be simulated with good accuracy by a k-{epsilon} turbulence model. The calculations illuminate how variations in pressure drop, overall porosity, grading of porosity across the cross-section and the number of holes affect the performance of FCs. These questions have been studied experimentally by researchers for a long time. Now an understanding of the important mechanisms behind efficient FCs emerges from the predictions. 179 ref., 110 figs., 8 tabs.

  5. Analysis of Craniofacial Images using Computational Atlases and Deformation Fields

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur

    2008-01-01

    the craniofacial morphology and asymmetry of Crouzon mice. Moreover, a method to plan and evaluate treatment of children with deformational plagiocephaly, based on asymmetry assessment, is established. Finally, asymmetry in children with unicoronal synostosis is automatically assessed, confirming previous results...... purposes. The basis for most of the applications is non-rigid image registration. This approach brings one image into the coordinate system of another resulting in a deformation field describing the anatomical correspondence between the two images. A computational atlas representing the average anatomy...... of a group may be constructed and brought into correspondence with a set of images of interest. Having established such a correspondence, various analyses may be carried out. This thesis discusses two types of such analyses, i.e. statistical deformation models and novel approaches for the quantification...

  6. Computer analysis of sodium cold trap design and performance. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    McPheeters, C.C.; Raue, D.J.

    1983-11-01

    Normal steam-side corrosion of steam-generator tubes in Liquid Metal Fast Breeder Reactors (LMFBRs) results in liberation of hydrogen, and most of this hydrogen diffuses through the tubes into the heat-transfer sodium and must be removed by the purification system. Cold traps are normally used to purify sodium, and they operate by cooling the sodium to temperatures near the melting point, where soluble impurities including hydrogen and oxygen precipitate as NaH and Na2O, respectively. A computer model was developed to simulate the processes that occur in sodium cold traps. The Model for Analyzing Sodium Cold Traps (MASCOT) simulates any desired configuration of mesh arrangements and dimensions and calculates pressure drops and flow distributions, temperature profiles, impurity concentration profiles, and impurity mass distributions.
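
    The precipitation mechanism such a model rests on can be sketched in a few lines: carry an impurity concentration through a falling temperature profile and deposit whatever exceeds the local saturation solubility. The solubility law and every coefficient below are illustrative placeholders, not the correlations used in MASCOT.

      import numpy as np

      def h_solubility_ppm(T_K):
          # Arrhenius-type law log10(S) = A - B/T; A and B are invented.
          return 10.0 ** (6.0 - 3000.0 / T_K)

      T_profile = np.linspace(550.0, 400.0, 50)  # K, trap inlet to cold end
      c = 0.5                                    # ppm hydrogen entering trap
      c_in = c
      deposited = np.zeros_like(T_profile)
      for i, T in enumerate(T_profile):
          sat = h_solubility_ppm(T)
          if c > sat:                # supersaturated: precipitates as NaH
              deposited[i] = c - sat
              c = sat
      print(f"outlet: {c:.3f} ppm, removed: {1 - c / c_in:.1%}")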

  7. A computational method for recording and analysis of mandibular movements

    Directory of Open Access Journals (Sweden)

    Alan Petrônio Pinheiro

    2008-10-01

    Full Text Available This study proposed the development of a new clinical tool capable of quantifying the movements of opening-closing, protrusion and laterotrusion of the mandible. These movements are important for the clinical evaluation of temporomandibular function and the muscles involved in mastication. Unlike current commercial systems, the proposed system employs a low-cost video camera and a computer program that reconstructs the trajectory of a reflective marker fixed on the mandible. In order to illustrate the clinical application of this tool, a clinical experiment consisting of the evaluation of the mandibular movements of 12 subjects was conducted. The results of this study were compatible with those found in the literature, with the advantage of using a low-cost, simple, non-invasive, and flexible tool customized for the needs of the practical clinic.
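
    The marker-tracking core of such a system can be approximated with OpenCV: threshold each frame for the bright reflective marker and record the centroid of the largest blob. The file name is a placeholder and the threshold value is an assumption.

      import cv2

      cap = cv2.VideoCapture("mandible.avi")     # hypothetical recording
      trajectory = []
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          _, mask = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)
          contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                         cv2.CHAIN_APPROX_SIMPLE)
          if contours:
              m = cv2.moments(max(contours, key=cv2.contourArea))
              if m["m00"] > 0:                   # centroid of the marker blob
                  trajectory.append((m["m10"] / m["m00"],
                                     m["m01"] / m["m00"]))
      cap.release()
      print(f"tracked marker in {len(trajectory)} frames")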

  8. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  9. New numerical analysis method in computational mechanics: composite element method

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A new type of FEM, called CEM (composite element method), is proposed to solve the static and dynamic problems of engineering structures with high accuracy and efficiency. The core of this method is to define two sets of coordinate systems for the description of DOFs after discretizing the structure, i.e. the nodal coordinate system UFEM(ξ) for employing the conventional FEM, and the field coordinate system UCT(ξ) for utilizing classical theory. Coupling these two sets of functional expressions then yields the composite displacement field U(ξ) of the CEM. The computation of the stiffness and mass matrices can follow the conventional procedure of FEM. Since the CEM inherits some good properties of both the conventional FEM and the classical analytical method, it offers powerful versatility for various complex geometric shapes and excellent approximation. Many examples are presented to demonstrate the ability of the CEM.

  10. Bayesian Analysis of Multiple Populations I: Statistical and Computational Methods

    CERN Document Server

    Stenning, D C; Robinson, E; van Dyk, D A; von Hippel, T; Sarajedini, A; Stein, N

    2016-01-01

    We develop a Bayesian model for globular clusters composed of multiple stellar populations, extending earlier statistical models for open clusters composed of simple (single) stellar populations (vanDyk et al. 2009, Stein et al. 2013). Specifically, we model globular clusters with two populations that differ in helium abundance. Our model assumes a hierarchical structuring of the parameters in which physical properties---age, metallicity, helium abundance, distance, absorption, and initial mass---are common to (i) the cluster as a whole or to (ii) individual populations within a cluster, or are unique to (iii) individual stars. An adaptive Markov chain Monte Carlo (MCMC) algorithm is devised for model fitting that greatly improves convergence relative to its precursor non-adaptive MCMC algorithm. Our model and computational tools are incorporated into an open-source software suite known as BASE-9. We use numerical studies to demonstrate that our method can recover parameters of two-population clusters, and al...
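
    The adaptive-MCMC ingredient can be illustrated with a Haario-style adaptive Metropolis sampler, which periodically rescales its Gaussian proposal using the running covariance of the chain. The two-dimensional target below is a stand-in, not the BASE-9 posterior.

      import numpy as np

      def log_post(theta):                   # illustrative correlated target
          a, b = theta
          return -0.5 * (a**2 + 4 * b**2 + 2.4 * a * b)

      rng = np.random.default_rng(2)
      theta = np.zeros(2)
      chain = [theta]
      cov = 0.1 * np.eye(2)                  # initial proposal covariance
      for i in range(1, 20000):
          if i > 1000 and i % 500 == 0:      # adapt: rescale to chain cov
              cov = (2.38**2 / 2) * np.cov(np.array(chain).T) \
                    + 1e-9 * np.eye(2)
          prop = rng.multivariate_normal(theta, cov)
          if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
              theta = prop                   # Metropolis accept
          chain.append(theta)
      print("posterior mean:", np.array(chain)[5000:].mean(axis=0))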

  11. Computer Modeling of Violent Intent: A Content Analysis Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, that have proved to serve as effective channels in furthering sociopolitical change.

  12. New numerical analysis method in computational mechanics: composite element method

    Institute of Scientific and Technical Information of China (English)

    曾攀

    2000-01-01

    A new type of FEM, called CEM (composite element method), is proposed to solve the static and dynamic problems of engineering structures with high accuracy and efficiency. The core of this method is to define two sets of coordinate systems for the description of DOFs after discretizing the structure, i.e. the nodal coordinate system UFEM(ζ) for employing the conventional FEM, and the field coordinate system UCT(ζ) for utilizing classical theory. Coupling these two sets of functional expressions then yields the composite displacement field U(ζ) of the CEM. The computation of the stiffness and mass matrices can follow the conventional procedure of FEM. Since the CEM inherits some good properties of both the conventional FEM and the classical analytical method, it offers powerful versatility for various complex geometric shapes and excellent approximation. Many examples are presented to demonstrate the ability of the CEM.

  13. COMPUTATIONAL ANALYSIS OF PARTICULATE FLOW IN EXPANSION CHANNEL

    Directory of Open Access Journals (Sweden)

    Nor Azwadi Che Sidik

    2013-01-01

    Full Text Available This paper presents a computational prediction of fluid-solid particle interaction in a horizontal expansion channel over a wide range of Reynolds numbers, using a Lagrangian-Lagrangian numerical technique to predict the movement of the solid particles. The method is based on the mesoscale lattice Boltzmann scheme for the prediction of the fluid dynamics and Newton's second law for the dynamics of the solid particles. The flow behaviour downstream of the expansion channel is critically dependent on the Reynolds number of the flow, and the removal percentage of the contaminant is critically dependent on the flow structure downstream of the expansion channel. The strength of the recirculation region plays a significant role due to the step in the cavity.

  14. Multiscale and multimodality computed tomography for cortical bone analysis

    Science.gov (United States)

    Ostertag, A.; Peyrin, F.; Gouttenoire, P. J.; Laredo, J. D.; DeVernejoul, M. C.; Cohen Solal, M.; Chappard, C.

    2016-12-01

    In clinical studies, high resolution peripheral quantitative computed tomography (HR-pQCT) is used to separately evaluate cortical bone and trabecular bone with an isotropic voxel of 82 µm, and typical cortical parameters are cortical density (D.comp), thickness (Ct.Th), and porosity (Ct.Po). In vitro, micro-computed tomography (micro-CT) is used to explore the internal cortical bone micro-structure with isotropic voxels, and high resolution synchrotron radiation (SR) micro-CT is considered the 'gold standard'. In 16 tibias and 8 femurs, HR-pQCT measurements were compared to conventional micro-CT measurements. To test modality effects, conventional micro-CT measurements were compared to SR micro-CT measurements at 7.5 µm; SR micro-CT measurements were also tested at different voxel sizes for the femurs, specifically, 7.5 µm versus 2.8 µm. D.comp was strongly correlated with porosity (r = -0.88), and HR-pQCT images provided consistent results compared to those obtained using conventional micro-CT at the distal tibia. D.comp was highly correlated to Po.V/TV because it considers both the micro-porosity (Haversian systems) and macro-porosity (resorption lacunae) of cortical bone. The complexity of canal organization (including shape, connectivity, and surface) is not fully captured by conventional micro-CT, in relation to beam hardening and cone-beam reconstruction artifacts. With the exception of Po.V/TV measurements, morphological and topological measurements depend on the characteristics of the x-ray beam and, to a lesser extent, on image resolution.

  15. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method that researchers can use to obtain valuable information for decision makers who need to solve a company's marketing problems. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis, and the features of Principal Component Analysis (PCA) allow a large number of variables that may be correlated with each other to be reduced to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when a large number of naturally collinear variables are used.
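
    A minimal sketch of the workflow the paper advocates: standardize a set of collinear survey items, extract principal components, and keep the uncorrelated components that summarize most of the variance. The data and the retention rule shown are illustrative.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      latent = rng.normal(size=(300, 2))        # two underlying drivers
      loadings = rng.normal(size=(2, 8))        # eight collinear items
      X = latent @ loadings + 0.3 * rng.normal(size=(300, 8))

      Z = StandardScaler().fit_transform(X)     # standardize the items
      pca = PCA().fit(Z)
      scores = pca.transform(Z)                 # uncorrelated components
      print("explained variance ratio:",
            pca.explained_variance_ratio_.round(3))
      keep = int((pca.explained_variance_ > 1).sum())   # Kaiser criterion
      print(f"components retained: {keep}")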

  16. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013) which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source, knowledge sharing and scientific collaboration stimulated us to think over these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for the enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  17. Computational Analysis of Coriant and PNNL Radioxenon Data Viewers

    Energy Technology Data Exchange (ETDEWEB)

    McIntyre, Justin I.; Carman, April J.

    2005-10-05

    The analysis by Coriant of the beta-gamma coincidence data coming from the ARSA systems shows a systematic bias towards lower concentrations for all isotopes and a systematic increase in the minimum detectable concentrations. These variations can be directly traced to the method of analysis used by the Coriant software, compared to the methods that have been developed by the International Noble Gas Experiment collaboration. This report details the differences and suggests solutions where appropriate. The report writers recommend that the algorithm changes be made to the Coriant software to bring it up to the international standards.

  18. Probabilistic structural analysis algorithm development for computational efficiency

    Science.gov (United States)

    Wu, Y.-T.

    1991-01-01

    The PSAM (Probabilistic Structural Analysis Methods) program is developing a probabilistic structural risk assessment capability for the SSME components. An advanced probabilistic structural analysis software system, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), is being developed as part of the PSAM effort to accurately simulate stochastic structures operating under severe random loading conditions. One of the challenges in developing the NESSUS system is the development of the probabilistic algorithms that provide both efficiency and accuracy. The main probability algorithms developed and implemented in the NESSUS system are efficient, but approximate in nature. In the last six years, the algorithms have improved very significantly.
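
    The efficiency/accuracy trade-off at stake can be seen in the simplest probabilistic method of all, crude Monte Carlo on a limit-state function: the estimate converges, but only as fast as the sample size allows, which is what motivates faster approximate algorithms. The limit state and distributions below are invented, not NESSUS internals.

      import numpy as np
      from scipy.stats import norm

      def g(R, S):                      # limit state: failure when g < 0
          return R - S

      rng = np.random.default_rng(5)
      for n in (10**3, 10**4, 10**5, 10**6):
          R = rng.normal(10.0, 1.5, n)  # resistance
          S = rng.normal(6.0, 2.0, n)   # load
          print(f"n = {n:>7}: Pf ~ {np.mean(g(R, S) < 0):.4f}")

      # Analytic reference: R - S ~ N(4.0, sqrt(1.5**2 + 2.0**2))
      print(f"exact Pf = {norm.cdf(0.0, loc=4.0, scale=np.hypot(1.5, 2.0)):.4f}")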

  19. Computation and Asymptotic Analysis in the Impact Problem

    Institute of Scientific and Technical Information of China (English)

    Lei Hou; Lin Qiu

    2009-01-01

    A non-linear numerical method is applied to solve the visco-elastic-plastic material impact problem. The finite element simulation agrees with the celebrated European crash safety analysis. The complex material stress distribution in the large deformation when the impact happens has been obtained. Also, the posterior-estimate solver and asymptotic analysis have been used for the sensitive pre-stage deformation before the impact happens. This part of the simulation is very interesting for passive safety in automotive protection devices, and it is an important part of the mathematical modelling.

  20. Biology Teacher and Expert Opinions about Computer Assisted Biology Instruction Materials: A Software Entitled Nucleic Acids and Protein Synthesis

    Science.gov (United States)

    Hasenekoglu, Ismet; Timucin, Melih

    2007-01-01

    The aim of this study is to collect and evaluate the opinions of CAI experts and biology teachers about a high-school-level Computer Assisted Biology Instruction Material presenting computer-made modelling and simulations. It is a case study. A material covering the "Nucleic Acids and Protein Synthesis" topic was developed as the…

  1. Distributed Parallel Computing in Data Analysis of Osteoporosis.

    Science.gov (United States)

    Waleska Simões, Priscyla; Venson, Ramon; Comunello, Eros; Casagrande, Rogério Antônio; Bigaton, Everson; da Silva Carlessi, Lucas; da Rosa, Maria Inês; Martins, Paulo João

    2015-01-01

    This research aimed to compare the performance of two load-balancing models (the Proportional and Autotuned algorithms) of the JPPF platform in processing data mining tasks on a database of osteoporosis and osteopenia cases. Analysis of the execution times showed that the Proportional algorithm performed better in all cases.

  2. Introduction to Statistics and Data Analysis With Computer Applications I.

    Science.gov (United States)

    Morris, Carl; Rolph, John

    This document consists of unrevised lecture notes for the first half of a 20-week in-house graduate course at Rand Corporation. The chapter headings are: (1) Histograms and descriptive statistics; (2) Measures of dispersion, distance and goodness of fit; (3) Using JOSS for data analysis; (4) Binomial distribution and normal approximation; (5)…

  3. Computational Fluid Dynamics Analysis of Flexible Duct Junction Box Design

    Energy Technology Data Exchange (ETDEWEB)

    Beach, R.; Prahl, D.; Lange, R.

    2013-12-01

    IBACOS explored the relationships between pressure and physical configurations of flexible duct junction boxes by using computational fluid dynamics (CFD) simulations to predict individual box parameters and total system pressure, thereby ensuring improved HVAC performance. Current Air Conditioning Contractors of America (ACCA) guidance (Group 11, Appendix 3, ACCA Manual D, Rutkowski 2009) allows for unconstrained variation in the number of takeoffs, box sizes, and takeoff locations. The only variables currently used in selecting an equivalent length (EL) are velocity of air in the duct and friction rate, given the first takeoff is located at least twice its diameter away from the inlet. This condition does not account for other factors impacting pressure loss across these types of fittings. For each simulation, the IBACOS team converted pressure loss within a box to an EL to compare variation in ACCA Manual D guidance to the simulated variation. IBACOS chose cases to represent flows reasonably correlating to flows typically encountered in the field and analyzed differences in total pressure due to increases in number and location of takeoffs, box dimensions, and velocity of air, and whether an entrance fitting is included. The team also calculated additional balancing losses for all cases due to discrepancies between intended outlet flows and natural flow splits created by the fitting. In certain asymmetrical cases, the balancing losses were significantly higher than symmetrical cases where the natural splits were close to the targets. Thus, IBACOS has shown additional design constraints that can ensure better system performance.
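
    The equivalent-length (EL) conversion used to compare the cases follows directly from its definition: the length of straight duct that would produce the same pressure loss at the design friction rate (Manual D friction rates are quoted per 100 ft). The numbers below are illustrative.

      def equivalent_length_ft(dp_in_wc, friction_rate_per_100ft):
          # Straight-duct length with the same loss at the design rate.
          return 100.0 * dp_in_wc / friction_rate_per_100ft

      dp_box = 0.024   # simulated pressure loss across the junction box
      fr = 0.08        # design friction rate, in. w.c. per 100 ft
      print(f"EL = {equivalent_length_ft(dp_box, fr):.0f} ft")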

  4. [Recurrent nerve paralysis: computed tomographic analysis of intrathoracic findings].

    Science.gov (United States)

    Delorme, S; Knopp, M V; Kauczor, H U; Zuna, I; Trost, U; Haberkorn, U; van Kaick, G

    1992-09-01

    The long and singular course of the inferior (recurrent) laryngeal nerve makes it very vulnerable to infiltration by tumors of various locations. In particular, mediastinal and pulmonary lesions must be considered in the case of left vocal chord palsy. Recurrent nerve paralysis caused by a tumor indicates advanced disease. We retrospectively reviewed the computed tomography (CT) findings in 29 patients with bronchogenic carcinoma or mediastinal tumors and recurrent nerve paralysis with respect to the site, size and extent of the tumor and the lymph node status. The review revealed a marked predominance of left upper lobe tumors with extensive lymph node metastases to the anterior mediastinum and the aortopulmonary window. The extent of mediastinal involvement exceeded the average involvement in a control group of 30 randomly selected patients with bronchogenic carcinoma at the time of presentation. In all patients CT demonstrated tumor tissue which could have caused the paralysis at one or more sites along the anatomical course of the recurrent nerve. In most cases the tumor was located at the aortic arch. The left paratracheal region, right paratracheal region and right pulmonary apex were affected in one case each. We conclude that in patients with cancer, CT is a suitable method for localizing a recurrent nerve lesion.

  5. Computational Fluid Dynamic Analysis of a Vibrating Turbine Blade

    Directory of Open Access Journals (Sweden)

    Osama N. Alshroof

    2012-01-01

    Full Text Available This study presents the numerical fluid-structure interaction (FSI) modelling of a vibrating turbine blade using the commercial software ANSYS-12.1. The study has two major aims: (i) discussion of the current state of the art of modelling FSI in gas turbine engines and (ii) development of a “tuned” one-way FSI model of a vibrating turbine blade to investigate the correlation between the pressure at the turbine casing surface and the vibrating blade motion. Firstly, the feasibility of complete two-way, three-dimensional FSI modelling of a turbine blade undergoing vibration using current commercial software is discussed. Various modelling simplifications, which reduce the full coupling between the fluid and structural domains, are then presented. The one-way FSI model of the vibrating turbine blade is introduced, which has the computational efficiency of a moving-boundary CFD model. This one-way FSI model includes the corrected motion of the vibrating turbine blade under given engine flow conditions. This one-way FSI model is used to interrogate the pressure around a vibrating gas turbine blade. The results obtained show that the pressure distribution at the casing surface does not differ significantly, in its general form, from the pressure at the vibrating rotor blade tip.

  6. Analysis of secondary coxarthrosis by three dimensional computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Hemmi, Osamu [Keio Univ., Tokyo (Japan). School of Medicine

    1997-11-01

    The majority of coxarthrosis in Japan is due to congenital dislocation of the hip and acetabular dysplasia. Until now coxarthrosis has been chiefly analyzed on the basis of anterior-posterior radiographs. By using three-dimensional (3D) CT, it was possible to analyze the morphological features of secondary coxarthrosis more accurately, and by using new computer graphics software, it was possible to display the contact area in the hip joint and observe changes associated with progression of the stages of the disease. There were 34 subjects (68 joints), all of whom were women. The CT data were read into a workstation, and 3D reconstruction was achieved with hip surgery simulation software (SurgiPlan). Pelvic inclination, acetabular anteversion, seven parameters indicating the investment of the femoral head and two indicating the position of the hip joint in the pelvis were measured. The results showed that secondary coxarthrosis is characterized not only by lateral malposition of the hip joint according to the pelvic coordinates, but by anterior malposition as well. Many other measurements provided 3D information on the acetabular dysplasia. Many of them were correlated with the CE angle on plain radiographs. Furthermore, a strong correlation was not found between anterior and posterior acetabular coverage of the femoral head. In addition, SurgiPlan's distance mapping function enabled 3D observation of the pattern of progression of arthrosis based on the pattern of progression of joint space narrowing. (author)

  7. Learning to Translate: A Statistical and Computational Analysis

    Directory of Open Access Journals (Sweden)

    Marco Turchi

    2012-01-01

    Full Text Available We present an extensive experimental study of Phrase-based Statistical Machine Translation, from the point of view of its learning capabilities. Very accurate Learning Curves are obtained, using high-performance computing, and extrapolations of the projected performance of the system under different conditions are provided. Our experiments confirm existing and mostly unpublished beliefs about the learning capabilities of statistical machine translation systems. We also provide insight into the way statistical machine translation learns from data, including the respective influence of translation and language models, the impact of phrase length on performance, and various unlearning and perturbation analyses. Our results support and illustrate the fact that performance improves by a constant amount for each doubling of the data, across different language pairs, and different systems. This fundamental limitation seems to be a direct consequence of Zipf law governing textual data. Although the rate of improvement may depend on both the data and the estimation method, it is unlikely that the general shape of the learning curve will change without major changes in the modeling and inference phases. Possible research directions that address this issue include the integration of linguistic rules or the development of active learning procedures.
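
    The "constant gain per doubling" observation corresponds to a learning curve that is linear in the logarithm of the training-set size, which is easy to fit and extrapolate. The scores below are synthetic, not the paper's measurements.

      import numpy as np

      sizes = np.array([10_000, 20_000, 40_000, 80_000, 160_000, 320_000])
      bleu = np.array([18.1, 19.9, 22.2, 24.0, 26.1, 27.9])  # invented

      slope, intercept = np.polyfit(np.log2(sizes), bleu, 1)
      print(f"~{slope:.2f} BLEU points per doubling of the data")
      print(f"extrapolation to 640k pairs: "
            f"{intercept + slope * np.log2(640_000):.1f}")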

  8. Emergence of Anisotropy in Flock Simulations and Its Computational Analysis

    Science.gov (United States)

    Makiguchi, Motohiro; Inoue, Jun-Ichi

    2010-03-01

    Very recent extensive field studies revealed that, in real flocks, the angular density of nearest neighbours shows a strongly anisotropic structure [Ballerini et al., Proceedings of the National Academy of Sciences USA 105, pp. 1232-1237 (2008)]. In this paper, we show that this anisotropic structure also emerges in an artificial flock simulation, namely the Boids simulation of Reynolds [C. W. Reynolds, Flocks, Herds, and Schools: A Distributed Behavioral Model, Computer Graphics, 21, pp. 25-34 (1987)]. To quantify the anisotropy, we evaluate a useful statistic, the so-called γ-value, defined as the inner product between the unit vector in the direction of the lowest angular density of neighbours and the unit vector in the direction in which the flock is moving. Our results on the emergence of anisotropy through the γ-value might enable us to judge whether a flock simulation is realistic.
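
    As a concrete reading of that definition, the following minimal 2-D sketch (our own construction with an invented flock; the bin count and flock parameters are arbitrary choices, not the papers' setup) estimates the direction of lowest angular density of nearest neighbours and takes its inner product with the flock's heading.

```python
# Estimate a gamma-like anisotropy statistic for a synthetic 2-D flock.
import numpy as np

rng = np.random.default_rng(0)
positions = rng.uniform(0, 50, size=(200, 2))   # hypothetical bird positions
velocities = np.tile([1.0, 0.2], (200, 1))      # roughly common heading

# Unit vector in the direction the flock is moving.
heading = velocities.mean(axis=0)
heading /= np.linalg.norm(heading)

# Bearing of each bird's nearest neighbour.
bearings = []
for i, p in enumerate(positions):
    d = np.linalg.norm(positions - p, axis=1)
    d[i] = np.inf                               # exclude self
    v = positions[np.argmin(d)] - p             # vector to nearest neighbour
    bearings.append(np.arctan2(v[1], v[0]))

# Angular density over 36 bins; the emptiest bin marks the lowest density.
counts, edges = np.histogram(bearings, bins=36, range=(-np.pi, np.pi))
theta = 0.5 * (edges[:-1] + edges[1:])[np.argmin(counts)]
lowest_density_dir = np.array([np.cos(theta), np.sin(theta)])

gamma = float(np.dot(lowest_density_dir, heading))
print(f"gamma = {gamma:+.2f}  (values near +/-1 indicate anisotropy along the motion)")
```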

  9. Computational Fluid Dynamics Analysis of Canadian Supercritical Water Reactor (SCWR)

    Science.gov (United States)

    Movassat, Mohammad; Bailey, Joanne; Yetisir, Metin

    2015-11-01

    A Computational Fluid Dynamics (CFD) simulation was performed for the proposed design of the Canadian SuperCritical Water Reactor (SCWR). The proposed Canadian SCWR is a 1200 MW(e) supercritical light-water-cooled nuclear reactor with pressurized fuel channels. The reactor concept uses an inlet plenum, to which all fuel channels are attached, and an outlet header nested inside the inlet plenum. The coolant enters the inlet plenum at 350 °C and exits the outlet header at 625 °C; the operating pressure is approximately 26 MPa. The high-pressure, high-temperature outlet conditions result in a higher electric conversion efficiency than in existing light water reactors. In this work, CFD simulations were performed to model fluid flow and heat transfer in the inlet plenum, the outlet header, and various parts of the fuel assembly. The ANSYS Fluent solver was used for the simulations. Results showed that the mass flow rate distribution in the fuel channels varies radially, and the inner channels reach higher outlet temperatures. At the outlet header, zones of rotational flow formed as the fluid from the 336 fuel channels merged. The results also suggest that insulation of the outlet header should be considered to reduce the thermal stresses caused by the large temperature gradients.

  10. A Grounded Theory Analysis of Introductory Computer Science Pedagogy

    Directory of Open Access Journals (Sweden)

    Jonathan Wellons

    2011-12-01

    Planning is a critical early step on the path to successful program writing and a skill that is often lacking in novice programmers. As practitioners we are continually searching for, or creating, interventions to help our students, particularly those who struggle in the early stages of their computer science education. In this paper we report on our ongoing research into novice programming skills, which uses the qualitative research method of grounded theory to develop theories and inform the construction of these interventions. We describe how grounded theory, a popular research method in the social sciences since the 1960s, can lend formality and structure to the common practice of simply asking students what they did and why they did it. Further, we aim to inform the reader not only about our emerging theories on interventions for planning, but also about how they might collect and analyze their own data in this and other areas that trouble novice programmers. In this way, those who lecture and design CS1 interventions can do so from a more informed perspective.

  11. A computational model for dynamic analysis of the human gait.

    Science.gov (United States)

    Vimieiro, Claysson; Andrada, Emanuel; Witte, Hartmut; Pinotti, Marcos

    2015-01-01

    Biomechanical models are important tools in the study of human motion. This work proposes a computational model to analyse the dynamics of lower-limb motion using a kinematic chain to represent the body segments, with rotational joints linked by viscoelastic elements. The model uses anthropometric parameters, ground reaction forces, and joint Cardan angles from subjects to analyse lower-limb motion during gait, and it allows these data to be evaluated in each body plane. Six healthy subjects walked on a treadmill while kinematic and kinetic data were recorded, and anthropometric parameters were measured to construct the model. The viscoelastic parameter values were then fitted for the model joints (hip, knee and ankle). The proposed model demonstrated that manipulating the viscoelastic parameters between the body segments can fit the amplitudes and frequencies of motion. The viscoelastic parameter values fitted in this work follow a normal distribution, indicating that they are directly related to the gait pattern. To validate the model, the joint angles it produced were compared with previously published data; the results show the same pattern and range of values found in the literature for human gait.
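
    As an illustration of the kind of viscoelastic joint element described, the following minimal sketch models a single joint as a rotational spring-damper (Voigt) element. The stiffness, damping, and inertia values are hypothetical, not the fitted values from the study.

```python
# Single viscoelastic (spring-damper) joint: torque = -K*(theta - theta_ref) - C*omega.
import numpy as np
from scipy.integrate import solve_ivp

K = 150.0                      # rotational stiffness (N*m/rad), hypothetical
C = 2.5                        # rotational damping (N*m*s/rad), hypothetical
I = 0.35                       # segment moment of inertia about the joint (kg*m^2)
THETA_REF = np.deg2rad(10.0)   # neutral joint angle

def joint_dynamics(t, y):
    """State y = [theta, omega]; the viscoelastic torque drives theta to THETA_REF."""
    theta, omega = y
    torque = -K * (theta - THETA_REF) - C * omega
    return [omega, torque / I]

# Release the joint from 40 degrees and integrate for two seconds.
sol = solve_ivp(joint_dynamics, (0.0, 2.0), [np.deg2rad(40.0), 0.0])
print(f"final angle: {np.rad2deg(sol.y[0, -1]):.1f} deg")
```

    Raising C damps the oscillation while raising K increases its frequency, which is the sense in which tuning the viscoelastic parameters can fit the amplitudes and frequencies of motion.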

  12. Causal Analysis for Performance Modeling of Computer Programs

    Directory of Open Access Journals (Sweden)

    Jan Lemeire

    2007-01-01

    Causal modelling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and for the automation of performance modelling. We enlarged the scope of existing causal structure learning algorithms by using the form-free, information-theoretic concept of mutual information and by introducing a complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of the experimental data is estimated by kernel density estimation. We then reported on the benefits of a dependency analysis and on the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports results for an LU decomposition algorithm and for a study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Finally, the analysis was used to search for generic performance characteristics of the applications.
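
    To make the form-free dependency measure concrete, the sketch below (our own construction, not the paper's implementation) estimates the mutual information between a hypothetical performance factor and a runtime measurement, with all densities obtained by Gaussian kernel density estimation.

```python
# KDE-based estimate of mutual information MI = E[ log p(x, y) / (p(x) p(y)) ].
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
problem_size = rng.uniform(100, 1000, size=500)               # hypothetical factor
runtime = 0.002 * problem_size**1.5 + rng.normal(0, 3, 500)   # hypothetical measurement

# Kernel density estimates of the joint and marginal distributions.
kde_joint = gaussian_kde(np.vstack([problem_size, runtime]))
kde_x = gaussian_kde(problem_size)
kde_y = gaussian_kde(runtime)

# Sample-average estimate of the mutual information, in nats.
pts = np.vstack([problem_size, runtime])
mi = np.mean(np.log(kde_joint(pts)) - np.log(kde_x(problem_size)) - np.log(kde_y(runtime)))
print(f"estimated mutual information: {mi:.3f} nats")  # ~0 would suggest independence
```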

  13. Computational Analysis of Perfect-Information Position Auctions

    OpenAIRE

    Thompson, David R. M; Leyton-Brown, Kevin

    2014-01-01

    After experimentation with other designs, the major search engines converged on the weighted, generalized second-price auction (wGSP) for selling keyword advertisements. Notably, this convergence occurred before position auctions were well understood (or, indeed, widely studied) theoretically. While much progress has been made since, theoretical analysis is still not able to settle the question of why search engines found wGSP preferable to other position auctions. We approach this question i...
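
    For readers unfamiliar with the mechanism, the following minimal sketch implements the textbook wGSP rule the abstract refers to: rank bidders by weight times bid, and charge each winner the smallest bid that would preserve its position. The bidders, weights, and bids are invented, and a zero reserve price is assumed.

```python
# Weighted generalized second-price (wGSP) allocation and pricing, textbook form.
def wgsp(bids, weights, num_slots):
    """bids/weights: dicts bidder -> value; returns list of (winner, price) per slot."""
    ranked = sorted(bids, key=lambda i: weights[i] * bids[i], reverse=True)
    outcome = []
    for k in range(min(num_slots, len(ranked))):
        winner = ranked[k]
        if k + 1 < len(ranked):
            nxt = ranked[k + 1]
            # Smallest bid keeping the position: next-ranked score / winner's weight.
            price = weights[nxt] * bids[nxt] / weights[winner]
        else:
            price = 0.0  # no competitor below; zero reserve price assumed
        outcome.append((winner, round(price, 2)))
    return outcome

# Hypothetical example: three bidders competing for two slots.
print(wgsp(bids={"a": 4.0, "b": 3.0, "c": 2.0},
           weights={"a": 0.6, "b": 1.0, "c": 0.9}, num_slots=2))
# -> [('b', 2.4), ('a', 3.0)]: b outranks a because its weighted score is higher.
```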

  14. COMPARATIVE ANALYSIS OF COMPUTER SOFTWARE AND BRAILLE LITERACY TO EDUCATE STUDENTS HAVING VISUAL IMPAIRMENT

    Directory of Open Access Journals (Sweden)

    Ismat Bano

    2011-10-01

    This research presents a comparative analysis of computer software and Braille literacy for educating students with visual impairment. The main objective was to compare the feasibility and usage of Braille literacy and computer software in educating children with visual impairment. Specific objectives were to identify the importance of Braille and computer literacy as perceived by male and female students with visual impairment, to identify the importance of Braille and computer literacy across different classes of students with visual impairment, and to identify differences in the perceived importance of Braille and computer literacy across different schools. Five special education institutions where students with visual impairment were studying were selected, and a convenience sample of 100 students was taken from these schools. A three-point rating scale was used as the research instrument, and the researchers personally collected data from the respondents. Data were analyzed with SPSS. Major findings showed that students were more interested in the Braille system than in computer software. The Braille system and the required materials were present in all the schools, while computer teachers with the required experience were not available in these institutions, and teachers were found to be more expert in Braille literacy than in computer software. It was recommended that teachers in special education institutions be given proper awareness of the most recent technologies, and that students as well as teachers be provided opportunities for hands-on practice to create interest in the use of computer software in special education.

  15. Policy Analysis: A Tool for Setting District Computer Use Policy. Paper and Report Series No. 97.

    Science.gov (United States)

    Gray, Peter J.

    This report explores the use of policy analysis as a tool for setting computer use policy in a school district by discussing the steps in the policy formation and implementation processes and outlining how policy analysis methods can contribute to the creation of effective policy. Factors related to the adoption and implementation of innovations…

  16. Climate Change Discourse in Mass Media: Application of Computer-Assisted Content Analysis

    Science.gov (United States)

    Kirilenko, Andrei P.; Stepchenkova, Svetlana O.

    2012-01-01

    Content analysis of mass media publications has become a major scientific method used to analyze public discourse on climate change. We propose a computer-assisted content analysis method to extract prevalent themes and analyze discourse changes over an extended period in an objective and quantifiable manner. The method includes the following: (1)…
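
    As a minimal illustration of what such computer-assisted theme extraction involves (the theme dictionaries and the toy corpus below are invented placeholders, not the authors' data or method), one can tally theme-indicator terms per year and track how their prevalence shifts.

```python
# Count hits of per-theme indicator terms in each year's articles.
from collections import Counter

THEMES = {  # hypothetical indicator terms, not the authors' dictionaries
    "science": {"warming", "emissions", "temperature"},
    "policy": {"treaty", "regulation", "kyoto"},
}

def theme_counts(articles):
    """articles: list of raw-text strings; returns a Counter of theme hits."""
    counts = Counter()
    for text in articles:
        words = set(text.lower().split())
        for theme, terms in THEMES.items():
            counts[theme] += len(words & terms)
    return counts

corpus_by_year = {  # toy stand-in for a mass-media corpus
    2000: ["rising temperature and emissions dominate the debate"],
    2010: ["the treaty talks stall as regulation is contested"],
}
for year, articles in corpus_by_year.items():
    print(year, dict(theme_counts(articles)))
```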

  17. State-variable analysis of non-linear circuits with a desk computer

    Science.gov (United States)

    Cohen, E.

    1981-01-01

    State-variable analysis was used to analyze the transient performance of non-linear circuits on a desktop computer. The non-linearities considered were not restricted to any particular circuit element. All that is required for the analysis is that the relationship defining each non-linearity be known in terms of points on a curve.
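
    A minimal modern sketch of this approach (the circuit, component values, and curve points are our own illustrative choices) integrates a single state equation in which the non-linear element is known only as measured points on a curve and is evaluated by table lookup.

```python
# State-variable transient analysis of a circuit with a table-defined non-linearity:
# a step source VIN feeds a series resistor R into a capacitor C, with a
# non-linear element (given only as i-v curve points) across the capacitor.
import numpy as np
from scipy.integrate import solve_ivp

# Non-linear element characteristic: current (A) versus voltage across it (V).
V_PTS = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
I_PTS = np.array([0.0, 0.01, 0.05, 0.20, 0.60])   # hypothetical diode-like curve

R, C, VIN = 100.0, 1e-3, 2.0   # ohms, farads, volts (illustrative values)

def state_eq(t, y):
    """Single state variable: the capacitor voltage v."""
    v = y[0]
    i_nl = np.interp(v, V_PTS, I_PTS)        # table-lookup non-linearity
    return [((VIN - v) / R - i_nl) / C]      # C dv/dt = source current - i_nl

sol = solve_ivp(state_eq, (0.0, 1.0), [0.0])
print(f"capacitor voltage after 1 s: {sol.y[0, -1]:.3f} V")
```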

  18. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

    for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis...

  19. Electronic Commerce and Developing Countries: a Computable General Equilibrium Analysis

    Directory of Open Access Journals (Sweden)

    Juan Pizarro Ríos

    2002-06-01

    It is widely recognized that electronic commerce reduces transaction costs, increases efficiency, and produces important changes in business administration and production processes. Likewise, at the macroeconomic level, a growing number of economists recognize that business-to-business electronic commerce can have a positive impact on the productivity and growth of developed countries. This article presents a quantitative analysis of the impact of electronic commerce on the global economy, both when developing economies fall behind technologically and when they catch up with the developed countries. The analysis focuses on cost reduction and assumes that electronic commerce can reduce the cost of services, particularly in wholesale and retail trade, transport, and the financial sector. The experiments are based on a computable general equilibrium model, the GTAP, with thirteen sectors and six regions. Cost reductions in the service sector are simulated as productivity growth. With the exception of water transport services, the results generally show that when developing countries fall behind technologically, the income gap between developing and developed countries widens: developing countries lose welfare, see their terms of trade deteriorate, and suffer reduced wages. The results also indicate that convergence in service-sector productivity offers developing countries the possibility of becoming more competitive and of increasing output, wages, and welfare.

  20. Material flow analysis of used personal computers in Japan.

    Science.gov (United States)

    Yoshida, Aya; Tasaki, Tomohiro; Terazono, Atsushi

    2009-05-01

    Most personal computers (PCs) are discarded by consumers after the data files have been moved to a new PC; a used-PC collection scheme should therefore not depend on the distribution route of new PCs. In Japan, manufacturers' voluntary take-back recycling schemes were established in 2001 (for business PCs) and 2003 (for household PCs). At the same time, the export of used PCs from Japan increased, affecting the domestic PC reuse market. These regulatory and economic conditions are likely to have changed the flow of used PCs. In this paper, we develop a method that minimizes the errors in estimating the material flow of used PCs. The method's features include the use of both input and output flow data and the elimination, as far as possible, of subjective estimation. Flow-rate data from existing surveys were used to estimate the flow of used PCs in Japan for fiscal years (FY) 2000, 2001, and 2004. The results show that 3.92 million and 4.88 million used PCs were discarded in FY 2000 and 2001, respectively. Approximately two-thirds of the discarded PCs were disposed of or recycled within the country, one-fourth were reused within the country, and 8% were exported. In FY 2004, 7.47 million used PCs were discarded; the share disposed of or recycled domestically decreased to 37%, whereas the domestic reuse and export shares increased to 37% and 26%, respectively. Flows from businesses to retailers in FY 2004 increased dramatically, which led to increased domestic reuse, and an increase in the flow of used PCs from lease and rental companies to secondhand shops led to increased exports. Interviews with PC reuse companies and trade statistics were used to verify our estimates of the domestic reuse and export of used PCs.
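
    A minimal sketch of the balancing idea (our own construction with invented flow numbers, not the authors' estimator): adjust independently observed input-side and output-side flows as little as possible, in the least-squares sense, so that the flow network balances.

```python
# Least-squares reconciliation of inconsistent flow observations:
# minimize ||x - x_obs||^2 subject to the mass-balance constraint A x = 0.
import numpy as np

# Hypothetical flows (million units): discarded -> {reuse, export, recycle},
# observed independently and therefore slightly inconsistent.
labels = ["discarded", "reuse", "export", "recycle"]
x_obs = np.array([7.5, 2.9, 1.8, 2.9])   # 2.9 + 1.8 + 2.9 != 7.5

# One balance constraint: discarded - reuse - export - recycle = 0.
A = np.array([[1.0, -1.0, -1.0, -1.0]])

# Closed-form orthogonal projection of x_obs onto the subspace A x = 0.
correction = A.T @ np.linalg.solve(A @ A.T, A @ x_obs)
x_hat = x_obs - correction

for name, v in zip(labels, x_hat):
    print(f"{name:9s} {v:5.2f}")   # reconciled flows now balance exactly
```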