WorldWideScience

Sample records for analysis cai computer

  1. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    Energy Technology Data Exchange (ETDEWEB)

    Handler, B.H. (Oak Ridge K-25 Site, TN (USA)); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. (Oak Ridge Associated Universities, Inc., TN (USA)); Hunnum, W.H. (North Carolina Univ., Chapel Hill, NC (USA)); Smith, D.L. (Memphis State Univ., TN (USA))

    1990-07-01

    Data Systems Engineering Organization (DSEO) personnel developed a prototype computer-aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model comprises related papers encompassing research on computer-aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevance to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  2. Generative Computer Assisted Instruction: An Application of Artificial Intelligence to CAI.

    Science.gov (United States)

    Koffman, Elliot B.

    Frame-oriented computer-assisted instruction (CAI) systems dominate the field, but these mechanized programed texts utilize the computational power of the computer to a minimal degree and are difficult to modify. Newer, generative CAI systems which are supplied with a knowledge of subject matter can generate their own problems and solutions, can…

  3. Effects of Computer Assisted Instruction (CAI) on Secondary School Students' Performance in Biology

    Science.gov (United States)

    Yusuf, Mudasiru Olalere; Afolabi, Adedeji Olufemi

    2010-01-01

    This study investigated the effects of computer assisted instruction (CAI) on secondary school students' performance in biology. Also, the influence of gender on the performance of students exposed to CAI in individualised or cooperative learning settings was examined. The research was a quasi-experimental study involving a 3 x 2 factorial…

  4. Design of Web-Based Computer Assisted Instruction (CAI) Software for Tajwid

    Directory of Open Access Journals (Sweden)

    Fenny Purwani

    2015-08-01

    Full Text Available The development of information technology and science undoubtedly calls for teaching-learning concepts and mechanisms based on information technology. This development requires qualified human resources and flexible changes to learning materials appropriate to advances in technology and science. Additionally, it combines religion-based education with technology (IMTAK and IPTEK). Internet technology can be used as a teaching tool, known as Computer Assisted Instruction (CAI). CAI software can serve as a medium or tool for learning Tajwid and can help people learn Tajwid more easily.

  5. Learning Principles Essential for Effective Computer Assisted Instruction (CAI)

    OpenAIRE

    Havlicek, Larry L.; Ghandour, Mahmoud M.

    1990-01-01

    Guidelines and recommendations for effective computer assisted instruction are presented based on a review of the current theories and research relating to cognitive conceptions of learning and instructional design which are documented by meta-analytic techniques. The main focus is on how meta-cognitive processes are conceptualized and integrated for the most effective development of any interactive technology for learning. These concepts are then integrated into sequencing and synthesizin...

  6. OE CAI: COMPUTER-ASSISTED INSTRUCTION OF OLD ENGLISH

    Directory of Open Access Journals (Sweden)

    Alejandro Alcaraz Sintes

    2002-06-01

    Full Text Available This article offers a general but thorough survey of Computer Assisted Instruction as applied to the Old English language, from the work of the late 80's pioneers to December 2001. It embraces all the different facets of the question: stand-alone and web-based applications, Internet sites, CD-ROMs, grammars, dictionaries, general courses, reading software, extralinguistic material, exercises, handouts, audio files... Each instruction item, whether it be a website, a Java exercise, an online course or an electronic book, is reviewed and URLs are provided in footnotes. These reviews are accompanied throughout by the pertinent theoretical background and practical advice.

  7. Skinner and CAI.

    Science.gov (United States)

    Chandler, Harry N.

    1984-01-01

    The author cites comments of B.F. Skinner supporting the benefits of carefully constructed computer assisted instruction (CAI) programs. Preliminary studies on military populations suggesting the value of CAI are discussed, as is the collection of information about software. (CL)

  8. Hypertext and three-dimensional computer graphics in an all digital PC-based CAI workstation.

    Science.gov (United States)

    Schwarz, D. L.; Wind, G. G.

    1991-01-01

    In the past several years there has been an enormous increase in the number of computer-assisted instructional (CAI) applications. Many medical educators and physicians have recognized the power and utility of hypertext. Some developers have incorporated simple diagrams, scanned monochrome graphics or still frame photographs from a laser disc or CD-ROM into their hypertext applications. These technologies have greatly increased the role of the microcomputer in education and training. There still remain numerous applications for these tools which are yet to be explored. One of these exciting areas involves the use of three-dimensional computer graphics. An all digital platform increases application portability. PMID:1807767

  9. Personality preference influences medical student use of specific computer-aided instruction (CAI)

    Directory of Open Access Journals (Sweden)

    Halsey Martha

    2006-02-01

    Full Text Available Abstract Background The objective of this study was to test the hypothesis that personality preference, which can be related to learning style, influences individual utilization of CAI applications developed specifically for the undergraduate medical curriculum. Methods Personality preferences of students were obtained using the Myers-Briggs Type Indicator (MBTI test. CAI utilization for individual students was collected from entry logs for two different web-based applications (a discussion forum and a tutorial used in the basic science course on human anatomy. Individual login data were sorted by personality preference and the data statistically analyzed by 2-way mixed ANOVA and correlation. Results There was a wide discrepancy in the level and pattern of student use of both CAI. Although individual use of both CAI was positively correlated irrespective of MBTI preference, students with a "Sensing" preference tended to use both CAI applications more than the "iNtuitives". Differences in the level of use of these CAI applications (i.e., higher use of discussion forum vs. a tutorial were also found for the "Perceiving/Judging" dimension. Conclusion We conclude that personality/learning preferences of individual students influence their use of CAI in the medical curriculum.

  10. Personality preference influences medical student use of specific computer-aided instruction (CAI).

    Science.gov (United States)

    McNulty, John A; Espiritu, Baltazar; Halsey, Martha; Mendez, Michelle

    2006-02-01

    The objective of this study was to test the hypothesis that personality preference, which can be related to learning style, influences individual utilization of CAI applications developed specifically for the undergraduate medical curriculum. Personality preferences of students were obtained using the Myers-Briggs Type Indicator (MBTI) test. CAI utilization for individual students was collected from entry logs for two different web-based applications (a discussion forum and a tutorial) used in the basic science course on human anatomy. Individual login data were sorted by personality preference and the data statistically analyzed by 2-way mixed ANOVA and correlation. There was a wide discrepancy in the level and pattern of student use of both CAI. Although individual use of both CAI was positively correlated irrespective of MBTI preference, students with a "Sensing" preference tended to use both CAI applications more than the "iNtuitives". Differences in the level of use of these CAI applications (i.e., higher use of discussion forum vs. a tutorial) were also found for the "Perceiving/Judging" dimension. We conclude that personality/learning preferences of individual students influence their use of CAI in the medical curriculum.

  11. Personality preference influences medical student use of specific computer-aided instruction (CAI)

    OpenAIRE

    Halsey Martha; Espiritu Baltazar; McNulty John A; Mendez Michelle

    2006-01-01

    Abstract Background The objective of this study was to test the hypothesis that personality preference, which can be related to learning style, influences individual utilization of CAI applications developed specifically for the undergraduate medical curriculum. Methods Personality preferences of students were obtained using the Myers-Briggs Type Indicator (MBTI) test. CAI utilization for individual students was collected from entry logs for two different web-based applications (a discussion ...

  12. Exploring Chondrule and CAI Rims Using Micro- and Nano-Scale Petrological and Compositional Analysis

    Science.gov (United States)

    Cartwright, J. A.; Perez-Huerta, A.; Leitner, J.; Vollmer, C.

    2017-12-01

    As the major components within chondrites, chondrules (mm-sized droplets of quenched silicate melt) and calcium-aluminum-rich inclusions (CAI, refractory) represent the most abundant and the earliest materials that solidified from the solar nebula. However, the exact formation mechanisms of these clasts, and whether these processes are related, remain unconstrained, despite extensive petrological and compositional study. By taking advantage of recent advances in nano-scale tomographical techniques, we have undertaken a combined micro- and nano-scale study of CAI and chondrule rim morphologies, to investigate their formation mechanisms. The target lithologies for this research are Wark-Lovering rims (WLR) and fine-grained rims (FGR) around CAIs and chondrules respectively, present within many chondrites. The FGRs, which are up to 100 µm thick, are of particular interest as recent studies have identified presolar grains within them. These grains predate the formation of our Solar System, suggesting FGR formation under nebular conditions. By contrast, WLRs are 10-20 µm thick, made of different compositional layers, and likely formed by flash-heating shortly after CAI formation, thus recording nebular conditions. A detailed multi-scale study of these respective rims will enable us to better understand their formation histories and determine the potential for commonality between these two phases, despite reports of an observed formation age difference of up to 2-3 Myr. We are using a combination of complementary techniques on our selected target areas: 1) Micro-scale characterization using standard microscopic and compositional techniques (SEM-EBSD, EMPA); 2) Nano-scale characterization of structures using transmission electron microscopy (TEM) and elemental, isotopic and tomographic analysis with NanoSIMS and atom probe tomography (APT). Preliminary nano-scale APT analysis of FGR morphologies within the Allende carbonaceous chondrite has successfully discerned

  13. Quantum computational universality of the Cai-Miyake-Duer-Briegel two-dimensional quantum state from Affleck-Kennedy-Lieb-Tasaki quasichains

    International Nuclear Information System (INIS)

    Wei, Tzu-Chieh; Raussendorf, Robert; Kwek, Leong Chuan

    2011-01-01

    Universal quantum computation can be achieved by simply performing single-qubit measurements on a highly entangled resource state, such as cluster states. Cai, Miyake, Duer, and Briegel recently constructed a ground state of a two-dimensional quantum magnet by combining multiple Affleck-Kennedy-Lieb-Tasaki quasichains of mixed spin-3/2 and spin-1/2 entities and by mapping pairs of neighboring spin-1/2 particles to individual spin-3/2 particles [Phys. Rev. A 82, 052309 (2010)]. They showed that this state enables universal quantum computation by single-spin measurements. Here, we give an alternative understanding of how this state gives rise to universal measurement-based quantum computation: by local operations, each quasichain can be converted to a one-dimensional cluster state and entangling gates between two neighboring logical qubits can be implemented by single-spin measurements. We further argue that a two-dimensional cluster state can be distilled from the Cai-Miyake-Duer-Briegel state.

  14. CAI多媒體教學軟體之開發模式 Using an Instructional Design Model for Developing a Multimedia CAI Courseware

    Directory of Open Access Journals (Sweden)

    Hsin-Yih Shyu

    1995-09-01

    Full Text Available This article outlines a systematic instructional design model for developing multimedia computer-aided instruction (CAI) courseware. The model illustrates roles and tasks as two dimensions necessary in a CAI production team. Four major components (Analysis, Design, Development, and Revision/Evaluation), comprising 25 steps in total, are provided. Eight roles, each with its required competencies, were identified. The model will be useful as a framework for developing multimedia CAI courseware for educators, instructional designers and CAI industry developers.

  15. CAI vs. Textbook for Grammar and Punctuation Skills.

    Science.gov (United States)

    Schramm, Robert M.; Rich, Grace E.

    1993-01-01

    Undergraduate control groups (n=45) completed textbook grammar exercises; experimental groups (n=53) used self-paced tutorial/drill-and-practice software. Although students using computer-assisted instruction (CAI) made significant improvement, they had reservations about the method. CAI combined with instructor interaction seems to be a feasible…

  16. 電腦輔助教學與個別教學結合: 電腦輔助教學課堂應用初探 Computer-Assisted Instruction Under the Management of Individualized Instruction: A Classroom Management Approach of CAI

    Directory of Open Access Journals (Sweden)

    Sunny S. J. Lin

    1988-03-01

    Full Text Available This article first reviews the development of Computer-Assisted Instruction (CAI) in Taiwan. The study describes the training of teachers from different levels of schools to design CAI courseware, and the planning of a CAI courseware bank containing 2,000 supplemental courseware items. A classroom application system for CAI should be carefully established to prevent the misuse of a CAI courseware item as a complete instructional plan. The study also suggests that CAI in elementary and secondary education could rely on mastery learning as the instructional plan; in this case, CAI must limit its role to formative testing and remedial material only. In higher education, Keller's Personalized System of Instruction could be an effective classroom management system, with CAI offering study guides and formative tests only. Using these two instructional systems may enhance students' achievement and speed up the learning rate at the same time. Combining individualized instruction with CAI will be one of the most workable approaches in the current classroom. The author sets up an experiment to verify their effectiveness and efficiency in the near future.

  17. CAI in Tijuana

    Directory of Open Access Journals (Sweden)

    Silvia López Estrada

    2007-01-01

    Full Text Available Because child care has received little attention in Mexico as a central public policy issue, the objective of this article is to analyze the child-care strategies carried out by families and the State in the city of Tijuana, taking as a case study the Casas de Atención Infantil (CAI) project, which formed part of the Programa Jefas de Familia. Within the context of the brief history of child-care policies in the country, the article analyzes the impacts of this project on the women who work as educator mothers as well as on the children who attend the CAI. To this end it considers the design of the project, a small survey carried out with the female heads of household, and interviews with some of them. According to the findings, the CAI are an income-generating mechanism for alleviating poverty that expresses no concern for women's rights, which calls into question their access to citizenship.

  18. Maxi CAI with a Micro.

    Science.gov (United States)

    Gerhold, George; And Others

    This paper describes an effective microprocessor-based CAI system which has been repeatedly tested by a large number of students and edited accordingly. Tasks not suitable for microprocessor based systems (authoring, testing, and debugging) were handled on larger multi-terminal systems. This approach requires that the CAI language used on the…

  19. Study on Teaching Strategies in Mathematics Education based on CAI

    Directory of Open Access Journals (Sweden)

    Wei Yan Feng

    2016-01-01

    Full Text Available With the development of information technology and the popularization of the internet, new media represented by the mobile phone is gradually influencing and changing people's study and life, and has become a centre of cultural information and social consensus. According to the China Internet Network Information Centre, young people are the main users of CAI (Computer Assisted Instruction) and its most active group of customers, so it is important to fully understand the impact of the new media environment on students and on the higher mathematics education of college students through CAI. In this paper, CAI is proposed for the mathematics education of college students.

  20. Coordinated Oxygen Isotopic and Petrologic Studies of CAIs Record Varying Composition of Protosolar

    Science.gov (United States)

    Simon, Justin I.; Matzel, J. E. P.; Simon, S. B.; Weber, P. K.; Grossman, L.; Ross, D. K.; Hutcheon, I. D.

    2012-01-01

    Ca-, Al-rich inclusions (CAIs) record the O-isotope composition of Solar nebular gas from which they grew [1]. High spatial resolution O-isotope measurements afforded by ion microprobe analysis across the rims and margin of CAIs reveal systematic variations in Δ17O and suggest formation from a diversity of nebular environments [2-4]. This heterogeneity has been explained by isotopic mixing between the O-16-rich Solar reservoir [6] and a second O-16-poor reservoir (probably nebular gas) with a "planetary-like" isotopic composition [e.g., 1, 6-7], but the mechanism and location(s) where these events occur within the protoplanetary disk remain uncertain. The orientation of large and systematic variations in Δ17O reported by [3] for a compact Type A CAI from the Efremovka reduced CV3 chondrite differs dramatically from reports by [4] of a similar CAI, A37 from the Allende oxidized CV3 chondrite. Both studies conclude that CAIs were exposed to distinct, nebular O-isotope reservoirs, implying the transfer of CAIs among different settings within the protoplanetary disk [4]. To test this hypothesis further and the extent of intra-CAI O-isotopic variation, a pristine compact Type A CAI, Ef-1 from Efremovka, and a Type B2 CAI, TS4 from Allende were studied. Our new results are equally intriguing because, collectively, O-isotopic zoning patterns in the CAIs indicate a progressive and cyclic record. The results imply that CAIs were commonly exposed to multiple environments of distinct gas during their formation. Numerical models help constrain conditions and duration of these events.
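
    The Δ17O notation used above is the standard measure of mass-independent oxygen isotope anomaly. The short LaTeX sketch below restates its definition and the simple two-reservoir mixing relation that such studies invoke; the end-member values in the comment are illustrative assumptions, not measurements from this abstract.

        % Mass-independent oxygen isotope anomaly (standard definition)
        \Delta^{17}\mathrm{O} = \delta^{17}\mathrm{O} - 0.52\,\delta^{18}\mathrm{O}

        % Linear two-reservoir mixing: material sampling a fraction f of 16O-rich
        % solar-like gas and (1 - f) of 16O-poor planetary-like gas records
        \Delta^{17}\mathrm{O}_{mix} = f\,\Delta^{17}\mathrm{O}_{solar} + (1 - f)\,\Delta^{17}\mathrm{O}_{planetary}

        % Illustrative end-members only: with Delta17O_solar = -24 per mil and
        % Delta17O_planetary = 0 per mil, f = 0.5 gives Delta17O_mix = -12 per mil.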

  1. Impact of CAI on Classroom Teachers

    Science.gov (United States)

    Hansen, Duncan N.; Harvey, William L.

    1970-01-01

    The authors "analyze some of the factors within CAI that may cause changes in teacher role," discuss "some of the obvious role functions that will undergo shifts as CAI becomes a more permanent activity," and "conclude with a brief statement on potential implications for teacher training." (Author/LS)

  2. A risk management approach to CAIS development

    Science.gov (United States)

    Hart, Hal; Kerner, Judy; Alden, Tony; Belz, Frank; Tadman, Frank

    1986-01-01

    The proposed DoD standard Common APSE Interface Set (CAIS) was developed as a framework set of interfaces that will support the transportability and interoperability of tools in the support environments of the future. While the current CAIS version is a promising start toward fulfilling those goals and current prototypes provide adequate testbeds for investigations in support of completing specifications for a full CAIS, there are many reasons why the proposed CAIS might fail to become a usable product and the foundation of next-generation (1990s) project support environments such as NASA's Space Station software support environment. The most critical threats to the viability and acceptance of the CAIS include performance issues (especially in piggybacked implementations), transportability, and security requirements. To make the situation worse, the solution to some of these threats appears to be in conflict with the solutions to others.

  3. Study on Teaching Strategies in Mathematics Education based on CAI

    OpenAIRE

    Wei Yan Feng

    2016-01-01

    With the development of information technology and the popularization of the internet, new media represented by the mobile phone is gradually influencing and changing people's study and life, and has become a centre of cultural information and social consensus. According to the China Internet Network Information Centre, young people are the main users of CAI (Computer Assisted Instruction) and its most active group of customers, so it is important to fully understand the impact of the new media environment on students and on higher mat...

  4. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  5. Generative CAI in Analytical Geometry.

    Science.gov (United States)

    Uttal, William R.; And Others

    A generative computer-assisted instruction system is being developed to tutor students in analytical geometry. The basis of this development is the thesis that a generative teaching system can be developed by establishing and then simulating a simplified, explicit model of the human tutor. The goal attempted is that of a computer environment…

  6. CaiT of Escherichia coli, a new transporter catalyzing L-carnitine/gamma-butyrobetaine exchange.

    Science.gov (United States)

    Jung, Heinrich; Buchholz, Marion; Clausen, Jurgen; Nietschke, Monika; Revermann, Anne; Schmid, Roland; Jung, Kirsten

    2002-10-18

    L-Carnitine is essential for beta-oxidation of fatty acids in mitochondria. Bacterial metabolic pathways are used for the production of this medically important compound. Here, we report the first detailed functional characterization of the caiT gene product, a putative transport protein whose function is required for L-carnitine conversion in Escherichia coli. The caiT gene was overexpressed in E. coli, and the gene product was purified by affinity chromatography and reconstituted into proteoliposomes. Functional analyses with intact cells and proteoliposomes demonstrated that CaiT is able to catalyze the exchange of L-carnitine for gamma-butyrobetaine, the excreted end product of L-carnitine conversion in E. coli, and related betaines. Electrochemical ion gradients did not significantly stimulate L-carnitine uptake. Analysis of L-carnitine counterflow yielded an apparent external K(m) of 105 microM and a turnover number of 5.5 s(-1). Contrary to related proteins, CaiT activity was not modulated by osmotic stress. L-Carnitine binding to CaiT increased the protein fluorescence and caused a red shift in the emission maximum, an observation explained by ligand-induced conformational alterations. The fluorescence effect was specific for betaine structures, for which the distance between trimethylammonium and carboxyl groups proved to be crucial for affinity. Taken together, the results suggest that CaiT functions as an exchanger (antiporter) for L-carnitine and gamma-butyrobetaine according to the substrate/product antiport principle.
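
    For readers unfamiliar with the kinetic terms, the reported K(m) and turnover number fit the standard Michaelis-Menten description of carrier-mediated transport. The relation below is the textbook form, with the half-saturation consequence following directly from the quoted numbers; nothing beyond those quoted values is implied about CaiT itself.

        % Michaelis-Menten rate for carrier-mediated substrate transport
        v = \frac{k_{cat}\,[S]}{K_m + [S]}, \qquad k_{cat} \approx 5.5\ \mathrm{s}^{-1},\quad K_m \approx 105\ \mu\mathrm{M}

        % At [S] = K_m = 105 microM the carrier runs at half its maximal rate:
        % v = k_cat / 2, roughly 2.75 turnovers per second per transporter.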

  7. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.

  8. Computational Analysis of Behavior.

    Science.gov (United States)

    Egnor, S E Roian; Branson, Kristin

    2016-07-08

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.

  9. COMPARISON OF GOSAT CAI AND SPOT VGT NDVI DATA WITH DIFFERENT SEASON AND LAND COVER IN EAST ASIA

    Directory of Open Access Journals (Sweden)

    Y. Liu

    2012-08-01

    Full Text Available The Normalized Difference Vegetation Index (NDVI) has become one of the most widely used indices in remote sensing applications in a variety of fields. Many studies have compared the NDVI values for different satellite sensors. The Greenhouse Gases Observing Satellite (GOSAT) was successfully launched on January 23, 2009. It is used to monitor greenhouse gases on the Earth's surface and also carries a sensor, the Cloud Aerosol Imager (CAI), that senses the red and near-infrared spectral regions, so it can also provide NDVI data. Therefore, we first compare GOSAT CAI and SPOT VGT NDVI data for different seasons and land-cover types in East Asia, to explore the relationship between the two types of datasets, and to discuss the possibility of extending SPOT VGT data using GOSAT CAI NDVI data for the same area. We used GOSAT CAI Level 3 data to derive 10-day composite NDVI values for the East Asia region for November 2009 and January, April and July 2010 using the maximum value composite (MVC) method. We compared these values with 10-day composite SPOT VGT NDVI data for the same period. The results show that the correlation coefficients of regression analysis generally revealed a strong correlation between NDVI from the two sensors in November 2009 and January, April and July 2010 (0.88, 0.85, 0.77 and 0.74, respectively). The areas of difference may be affected by cloud cover. From the combined analysis of seasonal changes and land cover, we found that the correlations between the SPOT VGT and the GOSAT CAI NDVI data are little affected by seasonal change and that the SPOT VGT data are more sensitive to high vegetation coverage than the GOSAT CAI data. In the future, through continued monitoring and processing by cloud removal technology, the accuracy of GOSAT CAI NDVI data will be further improved and thus be more widely used.
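
    As a rough illustration of the processing described above, the Python sketch below computes NDVI from red and near-infrared reflectance arrays and builds a maximum value composite (MVC) over a 10-day stack. The array layout, variable names and example values are assumptions for illustration, not the authors' actual pipeline or CAI data format.

        import numpy as np

        def ndvi(red, nir):
            """Normalized Difference Vegetation Index from reflectance arrays."""
            red = np.asarray(red, dtype=float)
            nir = np.asarray(nir, dtype=float)
            denom = nir + red
            return np.where(denom != 0.0, (nir - red) / denom, np.nan)

        def max_value_composite(daily_ndvi):
            """10-day MVC: per-pixel maximum over a (days, rows, cols) stack.

            Taking the maximum suppresses cloud-contaminated days, which bias NDVI low.
            """
            return np.nanmax(np.asarray(daily_ndvi), axis=0)

        if __name__ == "__main__":
            # Hypothetical 10 days of 2x2 reflectance scenes (values in 0..1).
            rng = np.random.default_rng(0)
            red_stack = rng.uniform(0.05, 0.2, size=(10, 2, 2))
            nir_stack = rng.uniform(0.2, 0.5, size=(10, 2, 2))
            daily = np.stack([ndvi(r, n) for r, n in zip(red_stack, nir_stack)])
            print(max_value_composite(daily))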

  10. Computer Series, 9: Bits and Pieces, 2.

    Science.gov (United States)

    Moore, John W., Ed.

    1980-01-01

    Described are two interactive computer-assisted-instruction (CAI) programs that were developed for use in quantitative analysis and in general chemistry. The purpose of these modules is to help students learn to calculate the points along a titration curve for the titration of an acid with a strong base. (Author/DS)
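
    To make concrete the kind of calculation these CAI modules drill, here is a minimal sketch of a strong acid-strong base titration curve (a monoprotic strong acid titrated with NaOH). The concentrations and volumes are made-up example values, and the simple charge-balance treatment (neglecting water autoionization near equivalence) is not taken from the modules themselves.

        import math

        def ph_strong_acid_strong_base(ca, va_ml, cb, vb_ml):
            """pH after adding vb_ml of strong base (conc. cb) to va_ml of strong acid (conc. ca)."""
            va, vb = va_ml / 1000.0, vb_ml / 1000.0
            moles_h, moles_oh = ca * va, cb * vb
            vtot = va + vb
            if moles_h > moles_oh:                    # before equivalence: excess H+
                return -math.log10((moles_h - moles_oh) / vtot)
            if moles_oh > moles_h:                    # after equivalence: excess OH-
                return 14.0 + math.log10((moles_oh - moles_h) / vtot)
            return 7.0                                # equivalence point at 25 C

        if __name__ == "__main__":
            # 50.0 mL of 0.100 M HCl titrated with 0.100 M NaOH (example values).
            for vb in (0.0, 25.0, 49.0, 50.0, 51.0, 75.0):
                print(f"{vb:5.1f} mL NaOH -> pH = {ph_strong_acid_strong_base(0.100, 50.0, 0.100, vb):.2f}")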

  11. Development of an intelligent CAI system for a distributed processing environment

    International Nuclear Information System (INIS)

    Fujii, M.; Sasaki, K.; Ohi, T.; Itoh, T.

    1993-01-01

    In order to operate a nuclear power plant optimally in both normal and abnormal situations, the operators are trained using an operator training simulator in addition to classroom instruction. Individual instruction using a CAI (Computer-Assisted Instruction) system has become popular as a method of learning plant information, such as plant dynamics, operational procedures, plant systems, plant facilities, etc. The paper outlines a proposed network-based intelligent CAI system (ICAI) incorporating multimedia PWR plant dynamics simulation, teaching aids and educational record management, using the following environment: existing standard workstations and graphic workstations with a live video processing function, the TCP/IP protocol of Unix over Ethernet, and the X Window System. (Z.S.) 3 figs., 2 refs

  12. A Model Driven Question-Answering System for a CAI Environment. Final Report (July 1970 to May 1972).

    Science.gov (United States)

    Brown, John S.; And Others

    A question answering system which permits a computer-assisted instruction (CAI) student greater initiative in the variety of questions he can ask is described. A method is presented to represent the dynamic processes of a subject matter area by augmented finite state automata, which permits efficient inferencing about dynamic processes and…

  13. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims to identify methods of syntax analysis that can be used for computer programming languages, while setting aside the computer hardware that influences the choice of programming language and of analysis and compilation methods. In a first part, the author proposes attempts at formalizing Chomsky grammar languages. In a second part, he studies analytical grammars, and then studies a compiler or analytic grammar for the Fortran language

  14. Danish veterinarian in Lao Cai

    DEFF Research Database (Denmark)

    Dalsgaard, Anders

    2009-01-01

    Louise Poulsen left Danish cows with mastitis behind to research infectious diseases among ethnic minorities and their livestock in the Lao Cai province of Vietnam.

  15. Common Airborne Instrumentation System: CAIS Bus Interface Standard, A00.00-C001. Revision A

    National Research Council Canada - National Science Library

    Jones, Sidney

    1997-01-01

    .... This interface control document (ICD) was written to provide a single document that designers of CAIS bus controllers and data acquisition units could reference to ensure interoperability on the CAIS bus. This ICD establishes the requirements for digital command/response, time division multiplexing techniques for a single CAIS bus. It encompasses the physical, electrical, and protocol aspects of the CAIS bus.

  16. Common Airborne Instrumentation System; CAIS Configuration ID List A00.00-C009. Revision A

    National Research Council Canada - National Science Library

    Jones, Sidney

    1997-01-01

    .... This interface control document (ICD) was written to provide a single document that designers of CAIS bus controllers and data acquisition units could reference to ensure interoperability on the CAIS bus. This ICD establishes the requirements for digital command/response, time division multiplexing techniques for a single CAIS bus. It encompasses the physical, electrical, and protocol aspects of the CAIS bus.

  17. Research on the Use of Computer-Assisted Instruction.

    Science.gov (United States)

    Craft, C. O.

    1982-01-01

    Reviews recent research studies related to computer assisted instruction (CAI). The studies concerned program effectiveness, teaching of psychomotor skills, tool availability, and factors affecting the adoption of CAI. (CT)

  18. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies.   ·         Provides techniques for modeling and analysis of network software and switching equipment; ·         Discusses design options used to build efficient switching equipment; ·         Includes many worked examples of the application of discrete-time Markov chains to communication systems; ·         Covers the mathematical theory and techniques necessary for ana...
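
    As a small worked instance of the queuing analysis the book covers, the standard M/M/1 results below relate arrival rate, service rate and mean occupancy; the numerical values are illustrative and not drawn from the book.

        % M/M/1 queue: Poisson arrivals at rate \lambda, exponential service at rate \mu
        \rho = \frac{\lambda}{\mu} \ (\text{utilization, stable for } \rho < 1), \qquad
        L = \frac{\rho}{1-\rho} \ (\text{mean number in system}), \qquad
        W = \frac{L}{\lambda} \ (\text{mean sojourn time, by Little's law})

        % Example: lambda = 8 packets/s and mu = 10 packets/s give rho = 0.8,
        % L = 4 packets in the system and W = 0.5 s mean delay.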

  19. Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing

    Science.gov (United States)

    2010-03-01

    The report addresses how the massive parallelism of DNA hybridization reactions can be exploited to construct a DNA-based associative memory. Single

  20. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news

  1. Compupoem: CAI for Writing and Studying Poetry.

    Science.gov (United States)

    Marcus, Stephen

    1982-01-01

    Describes a computer program that prompts the user for different parts of speech and formats the words in a haiku-like poetic structure. (Available from "The Computing Teacher," Department of Computer and Information Science, University of Oregon, Eugene, OR 97403.) (AEA)

  2. Computer based training for nuclear operations personnel: From concept to reality

    International Nuclear Information System (INIS)

    Widen, W.C.; Klemm, R.W.

    1986-01-01

    Computer Based Training (CBT) can be subdivided into two categories: Computer Aided Instruction (CAI), or the actual presentation of learning material; and Computer Managed Instruction (CMI), the tracking, recording, and documenting of instruction and student progress. Both CAI and CMI can be attractive to the student and to the training department. A brief overview of CAI and CMI benefits is given in this paper

  3. CAI and training system for the emergency operation procedure in the advanced thermal reactor, FUGEN

    International Nuclear Information System (INIS)

    Kozaki, T.; Imanaga, K.; Nakamura, S.; Maeda, K.; Sakurai, N.; Miyamoto, M.

    2003-01-01

    In the Advanced Thermal Reactor (ATR) of the JNC, 'FUGEN', a symptom-based Emergency Operating Procedure (EOP) was introduced in order to operate Fugen more safely, and it became necessary for the plant operators to master the EOP. However, it took a lot of time for the instructor to teach the EOP to operators and to train them. Thus, we have developed a Computer Aided Instruction (CAI) and Training System for the EOP, by which the operators can learn the EOP and can be trained. This system has two major functions, i.e., CAI and training. In the CAI function, there are three learning courses, namely, the EOP procedure, the simulation with guidance and Q and A, and the free simulation. In the training function, all of the necessary control instruments (indicators, switches, annunciators and so forth) and physics models for the EOP training are simulated so that the trainees can be trained for all of the EOPs. In addition, 50 kinds of malfunction models are installed in order to perform appropriate accident scenarios for the EOP. The training of the EOP covers the range from AOO (Anticipated Operational Occurrence) to Over-DBAs (Design Based Accidents). This system runs on three personal computers connected by a computer network. One of the computers is intended for the instructor and the other two are for the trainees. The EOP is composed of eight guidelines, such as 'Reactor Control' and 'Depression and Cooling', and operation screens corresponding to the guidelines are provided. According to the trial, we have estimated that the efficiency of the learning and the training would be improved by about 30% for the trainee and about 75% for the instructor in actual learning and training. (author)

  4. Computer vision in microstructural analysis

    Science.gov (United States)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced high school and beginning college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by a description of the system developed at Texas A&M University. This in turn will be followed by a description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.

  5. Adaptation of an aerosol retrieval algorithm using multi-wavelength and multi-pixel information of satellites (MWPM) to GOSAT/TANSO-CAI

    Science.gov (United States)

    Hashimoto, M.; Takenaka, H.; Higurashi, A.; Nakajima, T.

    2017-12-01

    Aerosol in the atmosphere is an important constituent for determining the earth's radiation budget, so accurate aerosol retrievals from satellites are useful. We have developed a satellite remote sensing algorithm to retrieve aerosol optical properties using multi-wavelength and multi-pixel information of satellite imagers (MWPM). The method simultaneously derives aerosol optical properties, such as aerosol optical thickness (AOT), single scattering albedo (SSA) and aerosol size information, by using spatial differences of wavelengths (multi-wavelength) and surface reflectances (multi-pixel). The method is useful for aerosol retrieval over spatially heterogeneous surfaces such as urban regions. In this algorithm, the inversion method is a combination of an optimal method and a smoothing constraint for the state vector. Furthermore, the method has been combined with direct radiative transfer calculation (RTM), numerically solved at each iteration step of the non-linear inverse problem with several constraints, without using a look-up table (LUT). However, this takes too much computation time. To accelerate the calculation, we replaced the RTM with an accelerated RTM solver learned by a neural network-based method, EXAM (Takenaka et al., 2011), using Rster code. The calculation time was thereby shortened to about one-thousandth. We applied MWPM combined with EXAM to GOSAT/TANSO-CAI (Cloud and Aerosol Imager). CAI is a supplementary sensor of TANSO-FTS, dedicated to measuring cloud and aerosol properties. CAI has four bands at 380, 674, 870 and 1600 nm, and observes at 500 m resolution for bands 1, 2 and 3, and 1.5 km for band 4. Retrieved parameters are aerosol optical properties, such as the AOT of fine and coarse mode particles at a wavelength of 500 nm, the volume soot fraction in fine mode particles, and the ground surface albedo at each observed wavelength, obtained by combining a minimum reflectance method with Fukuda et al. (2013). We will show
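
    The "optimal method with a smoothing constraint" mentioned above is, in general form, a regularized least-squares fit. The cost function below is a generic sketch of that idea, not the specific MWPM formulation; the symbols are generic (y observations, x state vector of aerosol and surface parameters, F the radiative transfer operator) and the weighting matrices are assumed.

        % Generic maximum a posteriori cost with an added smoothness penalty
        J(x) = [y - F(x)]^{T} S_{\epsilon}^{-1} [y - F(x)]
             + (x - x_{a})^{T} S_{a}^{-1} (x - x_{a})
             + \gamma \, \| D x \|^{2}

        % S_epsilon: observation error covariance; x_a, S_a: a priori state and covariance;
        % D: spatial difference operator across neighboring pixels; gamma: smoothness weight.
        % Replacing F with a neural-network emulator speeds up each evaluation of J and its gradient.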

  6. Sexual life and sexual wellness in individuals with complete androgen insensitivity syndrome (CAIS) and Mayer-Rokitansky-Küster-Hauser Syndrome (MRKHS).

    Science.gov (United States)

    Fliegner, Maike; Krupp, Kerstin; Brunner, Franziska; Rall, Katharina; Brucker, Sara Y; Briken, Peer; Richter-Appelt, Hertha

    2014-03-01

    Sexual wellness depends on a person's physical and psychological constitution. Complete Androgen Insensitivity Syndrome (CAIS) and Mayer-Rokitansky-Küster-Hauser Syndrome (MRKHS) can compromise sexual well-being. To compare sexual well-being in CAIS and MRKHS using multiple measures: To assess sexual problems and perceived distress. To gain insight into participants' feelings of inadequacy in social and sexual situations, level of self-esteem and depression. To determine how these psychological factors relate to sexual (dys)function. To uncover what participants see as the source of their sexual problems. Data were collected using a paper-and-pencil questionnaire. Eleven individuals with CAIS and 49 with MRKHS with/without neovagina treatment were included. Rates of sexual dysfunctions, overall sexual function, feelings of inadequacy in social and sexual situations, self-esteem and depression scores were calculated. Categorizations were used to identify critical cases. Correlations between psychological variables and sexual function were computed. Sexually active subjects were compared with sexually not active participants. A qualitative content analysis was carried out to explore causes of sexual problems. An extended list of sexual problems based on the Diagnostic and Statistical Manual of Mental Disorders, 4th ed., text revision, by the American Psychiatric Association and related distress. Female Sexual Function Index (FSFI), German Questionnaire on Feelings of Inadequacy in Social and Sexual Situations (FUSS social scale, FUSS sexual scale), Rosenberg Self-Esteem Scale (RSE), Brief Symptom Inventory (BSI) subscale depression. Open question on alleged causes of sexual problems. The results point to a far-reaching lack of sexual confidence and sexual satisfaction in CAIS. In MRKHS apprehension in sexual situations is a source of distress, but sexual problems seem to be more focused on issues of vaginal functioning. MRKHS women report being satisfied with their

  7. Power Computations for Intervention Analysis

    Science.gov (United States)

    McLeod, A. I.; Vingilis, E. R.

    2009-01-01

    In many intervention analysis applications, time series data may be expensive or otherwise difficult to collect. In this case the power function is helpful, because it can be used to determine the probability that a proposed intervention analysis application will detect a meaningful change. Assuming that an underlying autoregressive integrated moving average (ARIMA) or fractional ARIMA model is known or can be estimated from the preintervention time series, the methodology for computing the required power function is developed for pulse, step, and ramp interventions with ARIMA and fractional ARIMA errors. Convenient formulas for computing the power function for important special cases are given. Illustrative applications in traffic safety and environmental impact assessment are discussed. PMID:19629193
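
    The power function discussed above can also be approximated by simulation. The sketch below estimates, under an assumed AR(1) error model and a step intervention of known size, the probability that a simple two-sample t-test on pre/post means detects the change. This is a simplified stand-in for the paper's ARIMA-based analytical formulas, and all parameter values are invented for illustration.

        import numpy as np
        from scipy import stats

        def simulate_ar1(n, phi, sigma, rng):
            """Generate n observations of a zero-mean AR(1) process."""
            x = np.zeros(n)
            for t in range(1, n):
                x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
            return x

        def step_intervention_power(n_pre=50, n_post=50, delta=1.0, phi=0.5,
                                    sigma=1.0, alpha=0.05, n_sim=2000, seed=1):
            """Monte Carlo power of detecting a mean shift of size delta after the intervention."""
            rng = np.random.default_rng(seed)
            hits = 0
            for _ in range(n_sim):
                series = simulate_ar1(n_pre + n_post, phi, sigma, rng)
                series[n_pre:] += delta                      # step intervention
                _, p = stats.ttest_ind(series[:n_pre], series[n_pre:], equal_var=False)
                hits += p < alpha
            # Note: autocorrelation inflates the t-test's nominal size, which is exactly
            # why ARIMA-based power calculations such as those in the paper are needed.
            return hits / n_sim

        if __name__ == "__main__":
            print("estimated power:", step_intervention_power())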

  8. Developing a CAI Graphics Simulation Model: Guidelines.

    Science.gov (United States)

    Strickland, R. Mack; Poe, Stephen E.

    1989-01-01

    Discusses producing effective instructional software using a balance of course content and technological capabilities. Describes six phases of an instructional development model: discovery, design, development, coding, documentation, and delivery. Notes that good instructional design should have learner/computer interaction, sequencing of…

  9. Computer aided analysis of disturbances

    International Nuclear Information System (INIS)

    Baldeweg, F.; Lindner, A.

    1986-01-01

    Computer-aided analysis of disturbances and the prevention of failures (diagnosis and therapy control) in technological plants are among the most important tasks of process control. Research in this field is very intensive due to increasing requirements on the security and economy of process control and due to a remarkable increase in the efficiency of digital electronics. This publication deals with the analysis of disturbances in complex technological plants, especially in so-called high-risk processes. The presentation emphasizes the theoretical concepts of diagnosis and therapy control, the modelling of the disturbance behaviour of the technological process, and man-machine communication integrating artificial intelligence methods, e.g., the expert system approach. Application is given for nuclear power plants. (author)

  10. Oxygen Isotope Measurements of a Rare Murchison Type A CAI and Its Rim

    Science.gov (United States)

    Matzel, J. E. P.; Simon, J. I.; Hutcheon, I. D.; Jacobsen, B.; Simon, S. B.; Grossman, L.

    2013-01-01

    Ca-, Al-rich inclusions (CAIs) from CV chondrites commonly show oxygen isotope heterogeneity among different mineral phases within individual inclusions reflecting the complex history of CAIs in both the solar nebula and/or parent bodies. The degree of isotopic exchange is typically mineral-specific, yielding O-16-rich spinel, hibonite and pyroxene and O-16-depleted melilite and anorthite. Recent work demonstrated large and systematic variations in oxygen isotope composition within the margin and Wark-Lovering rim of an Allende Type A CAI. These variations suggest that some CV CAIs formed from several oxygen reservoirs and may reflect transport between distinct regions of the solar nebula or varying gas composition near the proto-Sun. Oxygen isotope compositions of CAIs from other, less-altered chondrites show less intra-CAI variability and 16O-rich compositions. The record of intra-CAI oxygen isotope variability in CM chondrites, which commonly show evidence for low-temperature aqueous alteration, is less clear, in part because the most common CAIs found in CM chondrites are mineralogically simple (hibonite +/- spinel or spinel +/- pyroxene) and are composed of minerals less susceptible to O-isotopic exchange. No measurements of the oxygen isotope compositions of rims on CAIs in CM chondrites have been reported. Here, we present oxygen isotope data from a rare, Type A CAI from the Murchison meteorite, MUM-1. The data were collected from melilite, hibonite, perovskite and spinel in a traverse into the interior of the CAI and from pyroxene, melilite, anorthite, and spinel in the Wark-Lovering rim. Our objectives were to (1) document any evidence for intra-CAI oxygen isotope variability; (2) determine the isotopic composition of the rim minerals and compare their composition(s) to the CAI interior; and (3) compare the MUM-1 data to oxygen isotope zoning profiles measured from CAIs in other chondrites.

  11. Relationship between Pre-Service Music Teachers' Personality and Motivation for Computer-Assisted Instruction

    Science.gov (United States)

    Perkmen, Serkan; Cevik, Beste

    2010-01-01

    The main purpose of this study was to examine the relationship between pre-service music teachers' personalities and their motivation for computer-assisted music instruction (CAI). The "Big Five" Model of Personality served as the framework. Participants were 83 pre-service music teachers in Turkey. Correlation analysis revealed that three…

  12. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
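
    The solution-vector scheme described above (components ordered by inlet dependency, each executing its outlet update every time step) can be sketched roughly as follows. The class names, the single update_outlet method, the placeholder physics and the ordering helper are all invented for illustration and do not reflect PCTAP's actual C++ interfaces.

        from dataclasses import dataclass, field

        @dataclass
        class Component:
            """A generic thermal-fluid component (tube, cold plate, heat exchanger, ...)."""
            name: str
            upstream: list = field(default_factory=list)   # names of components feeding this one
            outlet_temp: float = 293.0                     # K, carried state

            def update_outlet(self, inlet_temp, dt):
                # Placeholder physics: relax the outlet toward the inlet over the time step.
                self.outlet_temp += (inlet_temp - self.outlet_temp) * min(1.0, dt / 10.0)

        def build_solution_vector(components):
            """Order components so every component appears after the ones feeding it."""
            ordered, placed = [], set()
            while len(ordered) < len(components):
                progressed = False
                for c in components:
                    if c.name not in placed and all(u in placed for u in c.upstream):
                        ordered.append(c)
                        placed.add(c.name)
                        progressed = True
                if not progressed:
                    raise ValueError("cyclic inlet dependencies")
            return ordered

        def step(solution_vector, by_name, supply_temp, dt):
            """One time step: walk the solution vector, passing each outlet to the next inlet."""
            for c in solution_vector:
                inlet = (sum(by_name[u].outlet_temp for u in c.upstream) / len(c.upstream)
                         if c.upstream else supply_temp)
                c.update_outlet(inlet, dt)

        if __name__ == "__main__":
            comps = [Component("hx", ["tube1"]), Component("tube1", []), Component("tube2", ["hx"])]
            order = build_solution_vector(comps)
            by_name = {c.name: c for c in comps}
            for _ in range(5):
                step(order, by_name, supply_temp=320.0, dt=1.0)
            print([(c.name, round(c.outlet_temp, 1)) for c in order])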

  13. Silicon Isotopic Fractionation of CAI-like Vacuum Evaporation Residues

    Energy Technology Data Exchange (ETDEWEB)

    Knight, K; Kita, N; Mendybaev, R; Richter, F; Davis, A; Valley, J

    2009-06-18

    Calcium-, aluminum-rich inclusions (CAIs) are often enriched in the heavy isotopes of magnesium and silicon relative to bulk solar system materials. It is likely that these isotopic enrichments resulted from evaporative mass loss of magnesium and silicon from early solar system condensates while they were molten during one or more high-temperature reheating events. Quantitative interpretation of these enrichments requires laboratory determinations of the evaporation kinetics and associated isotopic fractionation effects for these elements. The experimental data for the kinetics of evaporation of magnesium and silicon and the evaporative isotopic fractionation of magnesium is reasonably complete for Type B CAI liquids (Richter et al., 2002, 2007a). However, the isotopic fractionation factor for silicon evaporating from such liquids has not been as extensively studied. Here we report new ion microprobe silicon isotopic measurements of residual glass from partial evaporation of Type B CAI liquids into vacuum. The silicon isotopic fractionation is reported as a kinetic fractionation factor, α(Si), corresponding to the ratio of the silicon isotopic composition of the evaporation flux to that of the residual silicate liquid. For CAI-like melts, we find that α(Si) = 0.98985 ± 0.00044 (2σ) for 29Si/28Si with no resolvable variation with temperature over the temperature range of the experiments, 1600-1900 C. This value is different from what has been reported for evaporation of liquid Mg2SiO4 (Davis et al., 1990) and of a melt with CI chondritic proportions of the major elements (Wang et al., 2001). There appears to be some compositional control on α(Si), whereas no compositional effects have been reported for α(Mg). We use the values of α(Si) and α(Mg), to calculate the chemical compositions of the unevaporated precursors of a number of isotopically fractionated CAIs from CV chondrites whose
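
    A kinetic fractionation factor of this kind is usually applied through the Rayleigh relation for an evaporating residue. The equations below show that standard relation and a worked number using the quoted α(Si); the 40% evaporated fraction is chosen purely as an example and is not from the abstract.

        % Rayleigh fractionation of the residue during evaporation
        \frac{R}{R_{0}} = f^{\,\alpha - 1}
        \qquad\Longrightarrow\qquad
        \delta^{29}\mathrm{Si} \approx 1000\,\bigl(f^{\,\alpha_{Si} - 1} - 1\bigr)\ \text{per mil}

        % R/R_0: 29Si/28Si of the residue relative to the starting liquid,
        % f: fraction of silicon remaining.
        % With alpha_Si = 0.98985 and f = 0.60 (i.e. 40% of the Si evaporated):
        % delta29Si = 1000 * (0.60^(-0.01015) - 1), about 5.2 per mil enrichment.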

  14. Numerical analysis mathematics of scientific computing

    CERN Document Server

    Kincaid, David

    2009-01-01

    This book introduces students with diverse backgrounds to various types of mathematical analysis that are commonly needed in scientific computing. The subject of numerical analysis is treated from a mathematical point of view, offering a complete analysis of methods for scientific computing with appropriate motivations and careful proofs. In an engaging and informal style, the authors demonstrate that many computational procedures and intriguing questions of computer science arise from theorems and proofs. Algorithms are presented in pseudocode, so that students can immediately write computer

  15. Recursive harmonic analysis for computing Hansen coefficients

    Science.gov (United States)

    Adel Sharaf, Mohamed; Hassan Selim, Hadia

    2010-12-01

    We report on a simple, purely numerical method developed for computing Hansen coefficients by using a recursive harmonic analysis technique. The precision criteria of the computations are very satisfactory and provide material for computing Hansen's and Hansen-like expansions, and also for checking the accuracy of some existing algorithms.
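
    A recursive-harmonic-analysis implementation is not reproduced here, but the sketch below shows the plainest numerical route to the same quantities: Hansen coefficients X_k^{n,m}(e) obtained by quadrature of their defining Fourier integral over the mean anomaly. The function names and the simple Newton solver for Kepler's equation are illustrative choices, not the authors' algorithm.

        import numpy as np

        def solve_kepler(mean_anomaly, e, tol=1e-12):
            """Eccentric anomaly E from Kepler's equation M = E - e sin E (Newton iteration)."""
            E = mean_anomaly.copy() if e < 0.8 else np.full_like(mean_anomaly, np.pi)
            for _ in range(50):
                dE = (E - e * np.sin(E) - mean_anomaly) / (1.0 - e * np.cos(E))
                E -= dE
                if np.max(np.abs(dE)) < tol:
                    break
            return E

        def hansen_coefficient(n, m, k, e, samples=4096):
            """X_k^{n,m}(e): average of (r/a)^n * cos(m*f - k*M) over the mean anomaly M."""
            M = 2.0 * np.pi * np.arange(samples) / samples
            E = solve_kepler(M, e)
            r_over_a = 1.0 - e * np.cos(E)
            f = 2.0 * np.arctan2(np.sqrt(1.0 + e) * np.sin(E / 2.0),
                                 np.sqrt(1.0 - e) * np.cos(E / 2.0))
            integrand = r_over_a ** n * np.cos(m * f - k * M)
            return integrand.mean()   # periodic rectangle rule approximates the Fourier integral

        if __name__ == "__main__":
            # Sanity check: X_0^{1,0}(e) equals 1 + e^2/2 (a classical closed form).
            e = 0.1
            print(hansen_coefficient(1, 0, 0, e), 1.0 + e * e / 2.0)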

  16. Computer Assisted Instruction

    Science.gov (United States)

    Higgins, Paul

    1976-01-01

    Methodology for developing a computer assisted instruction (CAI) lesson (scripting, programing, and testing) is reviewed. A project done by Informatics Education Ltd. (IEL) for the Department of National Defense (DND) is used as an example. (JT)

  17. CAI combustion with methanol and ethanol in an air-assisted direct injection SI engine

    OpenAIRE

    Li, Y; Zhao, H; Brouzos, NP

    2008-01-01

    CAI combustion has the potential to be the most clean combustion technology in internal combustion engines and is being intensively researched. Following the previous research on CAI combustion of gasoline fuel, systematic investigation is being carried out on the application of bio-fuels in CAI combustion. As p...

  18. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  19. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  20. Computationally efficient analysis procedure for frames with ...

    African Journals Online (AJOL)

    A computationally efficient analytical procedure that provides high-quality analysis results for two-dimensional skeletal structures with segmented (stepped) and linearly tapered non-prismatic flexural members has been developed based on the stiffness method of structural analysis. A computer program coded in FORTRAN ...

  1. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available The basic statistical methods used in conducting genetic analysis of human traits are described: segregation analysis, linkage analysis, and allelic association studies. Software supporting the implementation of these methods has been developed.

  2. Numerical Investigation Into Effect of Fuel Injection Timing on CAI/HCCI Combustion in a Four-Stroke GDI Engine

    Science.gov (United States)

    Cao, Li; Zhao, Hua; Jiang, Xi; Kalian, Navin

    2006-02-01

    The Controlled Auto-Ignition (CAI) combustion, also known as Homogeneous Charge Compression Ignition (HCCI), was achieved by trapping residuals with early exhaust valve closure in conjunction with direct injection. Multi-cycle 3D engine simulations have been carried out for parametric study on four different injection timings in order to better understand the effects of injection timings on in-cylinder mixing and CAI combustion. The full engine cycle simulation including complete gas exchange and combustion processes was carried out over several cycles in order to obtain the stable cycle for analysis. The combustion models used in the present study are the Shell auto-ignition model and the characteristic-time combustion model, which were modified to take the high level of EGR into consideration. A liquid sheet breakup spray model was used for the droplet breakup processes. The analyses show that the injection timing plays an important role in affecting the in-cylinder air/fuel mixing and mixture temperature, which in turn affects the CAI combustion and engine performance.

  3. RASCAL: A Rudimentary Adaptive System for Computer-Aided Learning.

    Science.gov (United States)

    Stewart, John Christopher

    Both the background of computer-assisted instruction (CAI) systems in general and the requirements of a computer-aided learning system which would be a reasonable assistant to a teacher are discussed. RASCAL (Rudimentary Adaptive System for Computer-Aided Learning) is a first attempt at defining a CAI system which would individualize the learning…

  4. Impact analysis on a massively parallel computer

    International Nuclear Information System (INIS)

    Zacharia, T.; Aramayo, G.A.

    1994-01-01

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  5. The Effects of Computer-Assisted Instruction Based on Top-Level Structure Method in English Reading and Writing Abilities of Thai EFL Students

    Science.gov (United States)

    Jinajai, Nattapong; Rattanavich, Saowalak

    2015-01-01

    This research aims to study the development of ninth grade students' reading and writing abilities and interests in learning English taught through computer-assisted instruction (CAI) based on the top-level structure (TLS) method. An experimental group time series design was used, and the data was analyzed by multivariate analysis of variance…

  6. The Effectiveness of Computer-Assisted Instruction to Teach Physical Examination to Students and Trainees in the Health Sciences Professions: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott

    2017-01-01

    To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach. A systematic literature review and meta-analysis of studies published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses included a DerSimonian-Laird random-effects model. In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: -2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: -5.30 to 6.01). The evidence does not suggest a strong consistent preference for either CAI or F2F instruction to teach students/trainees PE. Further research is needed to identify the conditions under which knowledge and skill acquisition outcomes favor one mode of instruction over the other.

  7. Computable Analysis with Applications to Dynamic Systems

    NARCIS (Netherlands)

    P.J. Collins (Pieter)

    2010-01-01

    In this article we develop a theory of computation for continuous mathematics. The theory is based on earlier developments of computable analysis, especially that of the school of Weihrauch, and is presented as a model of intuitionistic type theory. Every effort has been made to keep the

  8. COMPUTER PROGRAMME FOR THE DYNAMIC ANALYSIS

    African Journals Online (AJOL)

    ES OBE

    ABSTRACT. The traditional method of dynamic analysis of tall rigid frames assumes the shear frame model. Models that allow joint rotations with/without the inclusion of the column axial loads give improved results but pose much more computational difficulty. In this work a computer program Natfrequency that determines ...

  9. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  10. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  11. [Study on the subcloning, expression and immunoprotection with Schistosoma japonicum CAI gene].

    Science.gov (United States)

    Luo, Xiu-ju; Yuan, Shi-shan; Yi, Xin-yuan; Zeng, Xian-fang; Zhang, Shun-ke; Tang, Lian-fei; Cai, Chun; Zhang, Jie; McReynolds, Larry

    2003-01-01

    To subclone and express the new gene of Schistosoma japonicum (Sj) CAI and evaluate the immunoprotective effect of the recombinant molecule. The cDNA of SjCAI gene was subcloned into expression vector pGEX-5X-3 to form recombinants which were then used to transform to E. coli strain ER 2566. Expression was induced by IPTG. The mice were vaccinated with the expressed protein and the immunoprotective effect was tested. Fusion protein of SjGST-CAI was highly expressed in E. coli as inclusion bodies. The worm reduction rate and the liver egg reduction rate in vaccination group of SjGST-CAI were 29.87% and 63.71%, respectively. SjCAI gene can be highly expressed in E. coli after subcloning into pGEX-5X-3 vector and the expressed fusion protein can induce immunoprotective effect against Sj in mice.

  12. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  13. A computational description of simple mediation analysis

    Directory of Open Access Journals (Sweden)

    Caron, Pier-Olivier

    2018-04-01

    Full Text Available Simple mediation analysis is an increasingly popular statistical analysis in psychology and in other social sciences. However, there are very few detailed accounts of the computations within the model. Articles more often focus on explaining mediation analysis conceptually rather than mathematically. Thus, the purpose of the current paper is to introduce the computational modelling within simple mediation analysis, accompanied by examples in R. Firstly, mediation analysis is described. Then, the method to simulate data in R (with standardized coefficients) is presented. Finally, the bootstrap method, the Sobel test and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), are developed. The R code to implement the computations presented is offered, as well as a script to carry out a power analysis and a complete example.
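
    As a minimal sketch of the computation the record describes (written in Python rather than the R used in the article; all data and names below are illustrative), the indirect effect is the product of the X-to-M slope and the M-to-Y slope controlling for X, with a percentile bootstrap for its confidence interval:

      # Minimal simple-mediation sketch (X -> M -> Y); synthetic data, not the paper's example.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      x = rng.normal(size=n)
      m = 0.5 * x + rng.normal(size=n)               # mediator
      y = 0.4 * m + 0.1 * x + rng.normal(size=n)     # outcome

      def indirect_effect(x, m, y):
          a = np.polyfit(x, m, 1)[0]                 # X -> M slope
          design = np.column_stack([m, x, np.ones_like(x)])
          b = np.linalg.lstsq(design, y, rcond=None)[0][0]   # M -> Y slope, controlling for X
          return a * b

      boots = []
      for _ in range(2000):                          # percentile bootstrap of a*b
          idx = rng.integers(0, n, n)
          boots.append(indirect_effect(x[idx], m[idx], y[idx]))
      ci = np.percentile(boots, [2.5, 97.5])
      print("indirect effect:", indirect_effect(x, m, y), "95% CI:", ci)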

  14. Computer-aided structure analysis

    International Nuclear Information System (INIS)

    Szalontai, G.; Simon, Z.; Csapo, Z.; Farkas, M.; Pfeifer, Gy.

    1980-01-01

    The results obtained from the computer-aided interpretation of 13C NMR and IR spectra using the artificial intelligence approach are presented. In its present state the output of the system is a list of functional groups which are reasonable candidates for the final structural isomers. The input requires the empirical formula, 13C NMR data (off-resonance data also) and IR spectral data. The confirmation of the presence of a functional group is based on comparison of the experimental data with the spectral properties of functional groups stored in a property matrix. If the molecular weight of the compounds studied is less than or equal to 500, the output usually contains 1.5-2.5 times more groups than are really present, in most cases without the loss of the real ones. (author)

  15. The Influence of Students' Orientation Toward College and College Major Upon Students' Attitudes and Performance in a Computer-Based Education Course.

    Science.gov (United States)

    Kevin, Richard C.; Liberty, Paul G.

    Research related students' orientation toward college and college majors to their attitudes toward computer-assisted instruction (CAI) and their performance in an organic chemistry course using CAI. Major findings were that: 1) students majoring in applied fields manifested more favorable attitudes toward both organic chemistry and CAI than did…

  16. Present status of structural analysis computer programs

    International Nuclear Information System (INIS)

    Ikushima, Takeshi; Sanokawa, Konomo; Takeda, Hiroshi.

    1981-01-01

    Computer programs for structural analysis by the finite element method have been used widely, and the authors have already carried out a benchmark test on such programs. As a result, they pointed out a number of problems concerning the use of finite element computer programs. In this paper, the details of their development, their analytical functions and examples of calculation are described, centering on the versatile computer programs used in the previous study. As versatile computer programs for the finite element method, ANSYS developed by Swanson Analysis System Co., USA, ASKA developed by ISD, West Germany, MARC developed by MARC Analysis Research Institute, NASTRAN developed by NASA, USA, SAP-4 developed by the University of California, ADINA developed by MIT, NEPSAP developed by Lockheed Missile Space Co., BERSAFE developed by CEGB, Great Britain, EPACA developed by Franklin Research Institute, USA, and CREEP-PLAST developed by GE are briefly introduced. As examples of calculation, the thermal elastoplastic creep analysis of a cylinder by ANSYS, the elastoplastic analysis of a pressure vessel by ASKA, the analysis of a plate with double cracks by MARC, the analysis of the buckling of a shallow arch by MSC-NASTRAN, and the elastoplastic analysis of primary cooling pipes by ADINA are explained. (Kako, I.)

  17. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  18. Computer assisted functional analysis. Computer gestuetzte funktionelle Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, H.A.E.; Roesler, H.

    1982-01-01

    The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.

  19. CAI in Writing at the University: Some Recommendations.

    Science.gov (United States)

    Bump, Jerome

    1987-01-01

    Reviews the use of computers in the teaching and practice of writing in university courses. Highlights include hardware considerations; software options for grammar, punctuation, stylistic analysis, and prewriting (or invention) skills; artificial intelligence, composition, and literature; collaborative learning; and faculty development. (LRW)

  20. DFT computational analysis of piracetam

    Science.gov (United States)

    Rajesh, P.; Gunasekaran, S.; Seshadri, S.; Gnanasambandan, T.

    2014-11-01

    Density functional theory calculations with B3LYP using the 6-31G(d,p) and 6-31++G(d,p) basis sets have been used to determine ground state molecular geometries. The first order hyperpolarizability (β0) and related properties (β, α0 and Δα) of piracetam are calculated using the B3LYP/6-31G(d,p) method with the finite-field approach. The stability of the molecule has been analyzed by using NBO/NLMO analysis. The calculation of the first hyperpolarizability shows that the molecule is an attractive candidate for future applications in non-linear optics. The molecular electrostatic potential (MEP) at a point in the space around a molecule gives an indication of the net electrostatic effect produced at that point by the total charge distribution of the molecule. The calculated HOMO and LUMO energies show that charge transfer occurs within these molecules. Mulliken population analysis of atomic charges is also reported. On the basis of the vibrational analysis, the thermodynamic properties of the title compound at different temperatures have been calculated. Finally, the UV-Vis spectra and electronic absorption properties are explained and illustrated from the frontier molecular orbitals.

  1. Superimposed Code Theoretic Analysis of Deoxyribonucleic Acid (DNA) Codes and DNA Computing

    Science.gov (United States)

    2010-01-01

    … The massive parallelism of DNA hybridization reactions can be exploited to construct a DNA based associative memory. Single strands of DNA are

  2. Comparative Analysis of Computer Network Security Scanners

    Directory of Open Access Journals (Sweden)

    Victor Sergeevich Gorbatov

    2013-02-01

    Full Text Available The paper is devoted to the analysis of the problem of comparison of security scanners computer network. A common comprehensive assessment of security control is developed on the base of comparative analysis of data security controls. We have tested security scanners available on the market.

  3. Computer graphics in reactor safety analysis

    International Nuclear Information System (INIS)

    Fiala, C.; Kulak, R.F.

    1989-01-01

    This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the modelling of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. Graphics output used in actual safety analyses is used to illustrate the capabilities of each code. 5 refs., 10 figs

  4. Computer aided stress analysis of long bones utilizing computer tomography

    International Nuclear Information System (INIS)

    Marom, S.A.

    1986-01-01

    A computer aided analysis method, utilizing computed tomography (CT), has been developed which, together with a finite element program, determines the stress-displacement pattern in a long bone section. The CT data file provides the geometry, the density and the material properties for the generated finite element model. A three-dimensional finite element model of a tibial shaft is automatically generated from the CT file by a pre-processing procedure for a finite element program. The developed pre-processor includes an edge detection algorithm which determines the boundaries of the reconstructed cross-sectional images of the scanned bone. A mesh generation procedure then automatically generates a three-dimensional mesh of a user-selected refinement. The elastic properties needed for the stress analysis are individually determined for each model element using the radiographic density (CT number) of the pixels within the elemental borders. The elastic modulus is determined from the CT radiographic density by using an empirical relationship from the literature. The generated finite element model, together with applied loads determined from existing gait analysis and initial displacements, comprises a formatted input for the SAP IV finite element program. The output of this program, stresses and displacements at the model elements and nodes, is sorted and displayed by a developed post-processor to provide maximum and minimum values at selected locations in the model
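
    A purely illustrative sketch of the CT-number-to-modulus step described above; the abstract cites an empirical density-modulus relationship from the literature without stating it, so the calibration and power-law coefficients below are hypothetical placeholders, not the values actually used:

      # Illustrative only: map per-element CT numbers to an elastic modulus via a
      # density power law. All coefficients are assumed placeholders.
      import numpy as np

      def ct_to_density(hu):
          # hypothetical linear calibration from CT number (HU) to apparent density (g/cm^3)
          return 1.0 + 0.0008 * hu

      def density_to_modulus(rho, a=6.95, b=1.49):
          # hypothetical power law E = a * rho**b (GPa), standing in for the empirical fit
          return a * rho ** b

      element_hu = np.array([250.0, 800.0, 1400.0])   # mean CT number per finite element
      print(density_to_modulus(ct_to_density(element_hu)))   # one modulus per element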

  5. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for keff Monte Carlo computations are examined. ► Sampling method has the least restrictions on perturbation but computing resources. ► Analytical method is limited to small perturbation on material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (keff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes for criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of a substantial impact of the administrative margin of subcriticality on economics and safety of nuclear fuel cycle operations, recently increasing interests in reducing the administrative margin of subcriticality make the uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of two most popular keff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the keff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
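
    A toy sketch of the sampling-based approach mentioned in the record (perturb uncertain inputs, re-evaluate keff, take statistics); the keff "model" below is a made-up stand-in response surface, not an actual Monte Carlo transport calculation, and the input uncertainties are assumed values:

      # Toy sampling-based uncertainty propagation for keff; keff_model is a placeholder.
      import numpy as np

      rng = np.random.default_rng(1)

      def keff_model(enrichment, density):
          # stand-in response surface, NOT a real criticality code
          return 0.80 + 0.04 * enrichment + 0.02 * density

      samples = [
          keff_model(rng.normal(3.0, 0.05),    # assumed enrichment mean and 1-sigma
                     rng.normal(10.3, 0.10))   # assumed density mean and 1-sigma
          for _ in range(10_000)
      ]
      print("mean keff:", np.mean(samples), "1-sigma:", np.std(samples, ddof=1))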

  6. Computational structural analysis and finite element methods

    CERN Document Server

    Kaveh, A

    2014-01-01

    Graph theory gained initial prominence in science and engineering through its strong links with matrix algebra and computer science. Moreover, the structure of the mathematics is well suited to that of engineering problems in analysis and design. The methods of analysis in this book employ matrix algebra, graph theory and meta-heuristic algorithms, which are ideally suited for modern computational mechanics. Efficient methods are presented that lead to highly sparse and banded structural matrices. The main features of the book include: application of graph theory for efficient analysis; extension of the force method to finite element analysis; application of meta-heuristic algorithms to ordering and decomposition (sparse matrix technology); efficient use of symmetry and regularity in the force method; and simultaneous analysis and design of structures.

  7. ANALYSIS OF COMPUTER AIDED PROCESS PLANNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Salim A. Saleh

    2013-05-01

    Full Text Available Computer Aided Process Planning (CAPP) has been recognized as playing a key role in Computer Integrated Manufacturing (CIM). It was used as a bridge to link CAD with CAM systems, in order to give the possibility of full integration in agreement with computer engineering to introduce CIM. The benefits of CAPP in the real industrial environment are still to be achieved. Due to different manufacturing applications, many different CAPP systems have been developed. The development of CAPP techniques needs a summarized classification and a descriptive analysis. This paper presents the most important and famous techniques for the available CAPP systems, which are based on the variant, generative or semi-generative methods, and a descriptive analysis of their application possibilities.

  8. Surface computing and collaborative analysis work

    CERN Document Server

    Brown, Judith; Gossage, Stevenson; Hack, Chris

    2013-01-01

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...

  9. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  10. CAD/CAM/CAI Application for High-Precision Machining of Internal Combustion Engine Pistons

    Directory of Open Access Journals (Sweden)

    V. V. Postnov

    2014-07-01

    Full Text Available CAD/CAM/CAI application solutions for the machining of internal combustion engine pistons were analyzed. A low-volume technology for internal combustion engine piston production was proposed. A fixture for a CNC turning center was designed.

  11. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  12. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  13. Computational intelligent data analysis for sustainable development computational intelligent data analysis for sustainable development

    CERN Document Server

    Yu, Ting; Simoff, Simeon

    2016-01-01

    Contents include: Computational Intelligent Data Analysis for Sustainable Development: An Introduction and Overview (Ting Yu, Nitesh Chawla, and Simeon Simoff); Integrated Sustainability Analysis: Tracing Embodied CO2 in Trade Using High-Resolution Input-Output Tables (Daniel Moran and Arne Geschke); Aggregation Effects in Carbon Footprint Accounting Using Multi-Region Input-Output Analysis (Xin Zhou, Hiroaki Shirakawa, and Manfred Lenzen); Computational Intelligent Data Analysis for Climate Change: Climate Informatics (Claire Monteleoni, Gavin A. Schmidt, Francis Alexander, Alexandru Niculescu-Mizil, Karsten Steinhaeuser, Michael Tippett, Arindam Banerjee, M. Benno Blumenthal, Auroop R. Ganguly, Jason E. Smerdon, and Marco Tedesco); Computational Data Sciences for Actionable Insights on Climate Extremes and Uncertainty (Auroop R. Ganguly, Evan Kodra, Snigdhansu Chatterjee, Arindam Banerjee, and Habib N. Najm); Computational Intelligent Data Analysis for Biodiversity and Species Conservation: Mathematical Programming Applications to Land Conservation an...

  14. Computer aided analysis of reactor containment building

    International Nuclear Information System (INIS)

    Abhat, O.B.

    1985-01-01

    The paper presents a computerized structural analysis of the Reactor Containment Building (RCB) of a liquid metal nuclear power plant. Emphasis is placed on developing techniques for large 3-dimensional structural models using interactive computer graphics, analyzing the structure by NASTRAN and a simplified analytical approach to account for non-linear, temperature-dependent material properties at elevated temperatures caused by a sodium spill from a Design Basis Accident (DBA). (orig.)

  15. Analysis of electronic circuits using digital computers

    International Nuclear Information System (INIS)

    Tapu, C.

    1968-01-01

    Various programmes have been proposed for studying electronic circuits with the help of computers. It is shown here how it is possible to use the programme ECAP, developed by I.B.M., for studying the behaviour of an operational amplifier from different points of view: direct current, alternating current and transient state analysis, optimisation of the gain in open loop, and study of the reliability. (author) [fr

  16. A computer program for spectrochemical analysis

    International Nuclear Information System (INIS)

    Sastry, M.D.; Page, A.G.; Joshi, B.D.

    1976-01-01

    A simple and versatile computer program has been developed in FORTRAN IV for the routine analysis of metallic impurities by the emission spectrographic method. From the optical densities, transformed transmittances have been obtained using the Kaiser transformation, such that they are linearly related to exposure. The background correction for the spectral lines has been carried out using the Gauss differential logarithm method. In addition to the final analysis results in terms of the concentration of each element in PPM, the advantages of the program include the printout of concentration and intensity ratios and a graphical presentation of working curves, log (concentration) vs log (intensity ratio). (author)
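
    A small illustration of the working-curve step the record mentions (fit log concentration against log intensity ratio for standards, then read off an unknown); the standard values below are invented for the example:

      # Working-curve sketch: log(concentration) vs log(intensity ratio); synthetic standards.
      import numpy as np

      conc_std = np.array([1.0, 5.0, 20.0, 100.0])    # ppm, known standards (illustrative)
      ratio_std = np.array([0.02, 0.11, 0.38, 2.10])  # line / internal-standard intensity ratio

      slope, intercept = np.polyfit(np.log10(ratio_std), np.log10(conc_std), 1)

      def concentration(ratio):
          # read an unknown's concentration (ppm) off the fitted working curve
          return 10 ** (slope * np.log10(ratio) + intercept)

      print(concentration(0.5))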

  17. Analysis of a Model for Computer Virus Transmission

    OpenAIRE

    Qin, Peng

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers to the network and the removing of old computers from the network are considered. Meanwhile, the computers are equipped with antivirus software on the computer network. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our t...

  18. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works, as well as how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author’s website and SpringerLink.

  19. Computer-Assisted Mathematics Instruction for Students with Specific Learning Disability: A Review of the Literature

    Science.gov (United States)

    Stultz, Sherry L.

    2017-01-01

    This review was conducted to evaluate the current body of scholarly research regarding the use of computer-assisted instruction (CAI) to teach mathematics to students with specific learning disability (SLD). For many years, computers have been utilized for educational purposes. However, the effectiveness of CAI for teaching mathematics to this specific…

  20. A multielement isotopic study of refractory FUN and F CAIs: Mass-dependent and mass-independent isotope effects

    Science.gov (United States)

    Kööp, Levke; Nakashima, Daisuke; Heck, Philipp R.; Kita, Noriko T.; Tenner, Travis J.; Krot, Alexander N.; Nagashima, Kazuhide; Park, Changkun; Davis, Andrew M.

    2018-01-01

    Calcium-aluminum-rich inclusions (CAIs) are the oldest dated objects that formed inside the Solar System. Among these are rare, enigmatic objects with large mass-dependent fractionation effects (F CAIs), which sometimes also have large nucleosynthetic anomalies and a low initial abundance of the short-lived radionuclide 26Al (FUN CAIs). We have studied seven refractory hibonite-rich CAIs and one grossite-rich CAI from the Murchison (CM2) meteorite for their oxygen, calcium, and titanium isotopic compositions. The 26Al-26Mg system was also studied in seven of these CAIs. We found mass-dependent heavy isotope enrichment in all measured elements, but never simultaneously in the same CAI. The data are hard to reconcile with a single-stage melt evaporation origin and may require reintroduction or reequilibration for magnesium, oxygen and titanium after evaporation for some of the studied CAIs. The initial 26Al/27Al ratios inferred from model isochrons span a range from <1 × 10-6 to canonical (∼5 × 10-5). The CAIs show a mutual exclusivity relationship between inferred incorporation of live 26Al and the presence of resolvable anomalies in 48Ca and 50Ti. Furthermore, a relationship exists between 26Al incorporation and Δ17O in the hibonite-rich CAIs (i.e., 26Al-free CAIs have resolved variations in Δ17O, while CAIs with resolved 26Mg excesses have Δ17O values close to -23‰). Only the grossite-rich CAI has a relatively enhanced Δ17O value (∼-17‰) in spite of a near-canonical 26Al/27Al. We interpret these data as indicating that fractionated hibonite-rich CAIs formed over an extended time period and sampled multiple stages in the isotopic evolution of the solar nebula, including: (1) an 26Al-poor nebula with large positive and negative anomalies in 48Ca and 50Ti and variable Δ17O; (2) a stage of 26Al-admixture, during which anomalies in 48Ca and 50Ti had been largely diluted and a Δ17O value of ∼-23‰ had been achieved in the CAI formation region; and (3

  1. Computed image analysis of neutron radiographs

    International Nuclear Information System (INIS)

    Dinca, M.; Anghel, E.; Preda, M.; Pavelescu, M.

    2008-01-01

    Similar to X-radiography, but using neutrons as the penetrating particles, there is in practice a nondestructive technique named neutron radiology. When the registration of information is done on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β,γ) creating a latent image, the technique is named neutron radiography. A radiographic industrial film that contains the image of the internal structure of an object, obtained by neutron radiography, must be subsequently analyzed to obtain qualitative and quantitative information about the structural integrity of that object. It is possible to do a computed analysis of a film using a facility with the following main components: an illuminator for the film, a CCD video camera and a computer (PC) with suitable software. The qualitative analysis intends to put in evidence possible anomalies of the structure due to manufacturing processes or induced by working processes (for example, the irradiation activity in the case of nuclear fuel). The quantitative determination is based on measurements of some image parameters: dimensions and optical densities. The illuminator has been built specially to perform this application but can be used for simple visual observation. The illuminated area is 9x40 cm. The frame of the system is an Abbe comparator of the Carl Zeiss Jena type, which has been adapted for this application. The video camera assures the capture of the image, which is stored and processed by the computer. A special program, SIMAG-NG, has been developed at INR Pitesti which, together with the program SMTV II of the special acquisition module SM 5010, can analyze the images of a film. The major application of the system was the quantitative analysis of a film that contains the images of some nuclear fuel pins beside a dimensional standard. The system was used to measure the length of the pellets of the TRIGA nuclear fuel. (authors)

  2. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classification in the context of the social sciences. It also covers various real-life examples such as t

  3. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
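
    A minimal sketch of the kind of symbolic derivation the record describes, using Python/SymPy instead of MAPLE: a second-order stochastic perturbation (Taylor) estimate of the mean of a response u(b) of one Gaussian random parameter b. The response function 1/b is an arbitrary example, not taken from the paper:

      # Second-order stochastic perturbation of a response u(b) about the mean b0,
      # with E[(b-b0)] = 0 and E[(b-b0)^2] = sigma**2 (SymPy stands in for MAPLE here).
      import sympy as sp

      b, b0, sigma = sp.symbols('b b0 sigma', positive=True)
      u = 1 / b                                  # example response function

      mean_u = (u.subs(b, b0)
                + sp.Rational(1, 2) * sp.diff(u, b, 2).subs(b, b0) * sigma**2)
      print(sp.simplify(mean_u))                 # 1/b0 + sigma**2/b0**3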

  4. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  5. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers to the network and the removing of old computers from the network are considered. Meanwhile, the computers are equipped with antivirus software on the computer network. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis to control the spread of computer virus.
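
    A hedged sketch of a generic SIR-type computer virus model with node turnover and antivirus-driven recovery, numerically integrated; the equations and parameter values below are illustrative and are not the specific model or results of the paper:

      # Generic SIR-type virus model: S (susceptible), I (infected), R (protected/recovered),
      # with computers added to and removed from the network at rate mu.
      import numpy as np
      from scipy.integrate import odeint

      beta, gamma = 0.3, 0.1     # assumed infection and cure (antivirus) rates
      mu = 0.02                  # assumed rate of adding new / retiring old computers

      def rhs(state, t):
          s, i, r = state
          n = s + i + r
          ds = mu * n - beta * s * i / n - mu * s
          di = beta * s * i / n - gamma * i - mu * i
          dr = gamma * i - mu * r
          return [ds, di, dr]

      t = np.linspace(0, 200, 400)
      sol = odeint(rhs, [990.0, 10.0, 0.0], t)
      print("infected computers at t=200:", sol[-1, 1])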

  6. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Maragni, M.G.

    1992-01-01

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce the computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and the KENO-IV code shows conservative results when the generalized geometry option is not used. (author)

  7. Computational advances in transition phase analysis

    International Nuclear Information System (INIS)

    Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.

    1994-01-01

    In this paper, a historical perspective and recent advances are reviewed for computational technologies used to evaluate the transition phase of core disruptive accidents in liquid-metal fast reactors. An analysis of the transition phase requires treatment of multi-phase, multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort was initiated when the program of SIMMER-series computer code development started in the late 1970s in the USA. Successful applications of the latest SIMMER-II in the USA, western Europe and Japan have proved its effectiveness, but, at the same time, several areas that require further research have been identified. Based on the experience and lessons learned during the SIMMER-II application through the 1980s, a new project of SIMMER-III development is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described with emphasis on recent advances in multi-phase, multi-component fluid dynamics technologies and their expected implications for a future reliable transition phase analysis. (author)

  8. Closed system oxygen isotope redistribution in igneous CAIs upon spinel dissolution

    Science.gov (United States)

    Aléon, Jérôme

    2018-01-01

    In several Calcium-Aluminum-rich Inclusions (CAIs) from the CV3 chondrites Allende and Efremovka, representative of the most common igneous CAI types (type A, type B and Fractionated with Unknown Nuclear isotopic anomalies, FUN), the relationship between 16O-excesses and TiO2 content in pyroxene indicates that the latter commonly begins to crystallize with a near-terrestrial 16O-poor composition and becomes 16O-enriched during crystallization, reaching a near-solar composition. Mass balance calculations were performed to investigate the contribution of spinel to this 16O-enrichment. It is found that a back-reaction of early-crystallized 16O-rich spinel with a silicate partial melt having undergone a 16O-depletion is consistent with the O isotopic evolution of CAI minerals during magmatic crystallization. Dissolution of spinel explains the O isotopic composition (16O-excess and extent of mass fractionation) of pyroxene as well as that of primary anorthite/dmisteinbergite and possibly that of the last melilite crystallizing immediately before pyroxene. It requires that igneous CAIs behaved as closed-systems relative to oxygen from nebular gas during a significant fraction of their cooling history, contrary to the common assumption that CAI partial melts constantly equilibrated with gas. The mineralogical control on O isotopes in igneous CAIs is thus simply explained by a single 16O-depletion during magmatic crystallization. This 16O-depletion occurred in an early stage of the thermal history, after the crystallization of spinel, i.e. in the temperature range for melilite crystallization/partial melting and did not require multiple, complex or late isotope exchange. More experimental work is however required to deduce the protoplanetary disk conditions associated with this 16O-depletion.
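
    As a generic illustration of the kind of mass balance invoked here (the form is standard two-component isotope mixing and the symbols are generic, not taken from the paper), dissolving a mass fraction x_sp of 16O-rich spinel back into the remaining melt shifts the melt composition to

      \Delta^{17}\mathrm{O}_{\mathrm{melt}}' =
      \frac{x_{\mathrm{sp}}\,C_{\mathrm{sp}}\,\Delta^{17}\mathrm{O}_{\mathrm{sp}}
            + (1 - x_{\mathrm{sp}})\,C_{\mathrm{melt}}\,\Delta^{17}\mathrm{O}_{\mathrm{melt}}}
           {x_{\mathrm{sp}}\,C_{\mathrm{sp}} + (1 - x_{\mathrm{sp}})\,C_{\mathrm{melt}}},

    where C_sp and C_melt are the oxygen concentrations (by mass) of spinel and melt.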

  9. Stable Magnesium Isotope Variation in Melilite Mantle of Allende Type B1 CAI EK 459-5-1

    Science.gov (United States)

    Kerekgyarto, A. G.; Jeffcoat, C. R.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2014-01-01

    Ca-Al-rich inclusions (CAIs) are the earliest formed crystalline material in our solar system and they record early Solar System processes. Here we present petrographic and delta Mg-25 data of melilite mantles in a Type B1 CAI that records early solar nebular processes.

  10. Gender Role, Gender Identity and Sexual Orientation in CAIS ("XY-Women") Compared With Subfertile and Infertile 46,XX Women.

    Science.gov (United States)

    Brunner, Franziska; Fliegner, Maike; Krupp, Kerstin; Rall, Katharina; Brucker, Sara; Richter-Appelt, Hertha

    2016-01-01

    The perception of gender development of individuals with complete androgen insensitivity syndrome (CAIS) as unambiguously female has recently been challenged in both qualitative data and case reports of male gender identity. The aim of the mixed-method study presented was to examine the self-perception of CAIS individuals regarding different aspects of gender and to identify commonalities and differences in comparison with subfertile and infertile XX-chromosomal women with diagnoses of Mayer-Rokitansky-Küster-Hauser syndrome (MRKHS) and polycystic ovary syndrome (PCOS). The study sample comprised 11 participants with CAIS, 49 with MRKHS, and 55 with PCOS. Gender identity was assessed by means of a multidimensional instrument, which showed significant differences between the CAIS group and the XX-chromosomal women. Other-than-female gender roles and neither-female-nor-male sexes/genders were reported only by individuals with CAIS. The percentage with a not exclusively androphile sexual orientation was unexceptionally high in the CAIS group compared to the prevalence in "normative" women and the clinical groups. The findings support the assumption made by Meyer-Bahlburg ( 2010 ) that gender outcome in people with CAIS is more variable than generally stated. Parents and professionals should thus be open to courses of gender development other than typically female in individuals with CAIS.

  11. Computational analysis of EGFR inhibition by Argos.

    Science.gov (United States)

    Reeves, Gregory T; Kalifa, Rachel; Klein, Daryl E; Lemmon, Mark A; Shvartsman, Stanislav Y

    2005-08-15

    Argos, a secreted inhibitor of the Drosophila epidermal growth factor receptor, and the only known secreted receptor tyrosine kinase inhibitor, acts by sequestering the EGFR ligand Spitz. We use computational modeling to show that this biochemically-determined mechanism of Argos action can explain available genetic data for EGFR/Spitz/Argos interactions in vivo. We find that efficient Spitz sequestration by Argos is key for explaining the existing data and for providing a robust feedback loop that modulates the Spitz gradient in embryonic ventral ectoderm patterning. Computational analysis of the EGFR/Spitz/Argos module in the ventral ectoderm shows that Argos need not be long-ranged to account for genetic data, and can actually have very short range. In our models, Argos with long or short length scale functions to limit the range and action of secreted Spitz. Thus, the spatial range of Argos does not have to be tightly regulated or may act at different ranges in distinct developmental contexts.

  12. Computational analysis of unmanned aerial vehicle (UAV)

    Science.gov (United States)

    Abudarag, Sakhr; Yagoub, Rashid; Elfatih, Hassan; Filipovic, Zoran

    2017-01-01

    A computational analysis has been performed to verify the aerodynamic properties of an Unmanned Aerial Vehicle (UAV). The UAV-SUST has been designed and fabricated at the Department of Aeronautical Engineering at Sudan University of Science and Technology in order to meet the specifications required for surveillance and reconnaissance missions. It is classified as a medium range and medium endurance UAV. A commercial CFD solver is used to simulate the steady and unsteady aerodynamic characteristics of the entire UAV. In addition to the Lift Coefficient (CL), Drag Coefficient (CD), Pitching Moment Coefficient (CM) and Yawing Moment Coefficient (CN), the pressure and velocity contours are illustrated. The aerodynamic parameters show very good agreement with the design considerations at angles of attack ranging from zero to 26 degrees. Moreover, the visualization of the velocity field and static pressure contours indicates satisfactory agreement with the proposed design. Turbulence is predicted with the k-ω SST turbulence model within the computational fluid dynamics code.

  13. Computer-aided Fault Tree Analysis

    International Nuclear Information System (INIS)

    Willie, R.R.

    1978-08-01

    A computer-oriented methodology for deriving minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology, is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense
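
    A toy illustration of what "deriving minimal cut set families" means on a small coherent fault tree (TOP = A AND (B OR C)); this brute-force enumeration is only for illustration and is not the algorithm FTAP itself uses:

      # Minimal cut sets by brute force for TOP = A AND (B OR C); toy example only.
      from itertools import combinations

      events = ['A', 'B', 'C']

      def top(failed):
          # top event occurs if A fails together with B or C
          return 'A' in failed and ('B' in failed or 'C' in failed)

      cut_sets = [set(c) for r in range(1, len(events) + 1)
                  for c in combinations(events, r) if top(set(c))]
      minimal = [c for c in cut_sets
                 if not any(other < c for other in cut_sets)]   # no proper subset is a cut set
      print(minimal)   # the minimal cut sets {A, B} and {A, C}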

  14. Learning mnemonics: a preliminary evaluation of a computer-aided instruction package for the elderly.

    Science.gov (United States)

    Finkel, S I; Yesavage, J A

    1989-01-01

    Sixty-two normal elderly subjects averaging 71 years old were taught a common mnemonic device for recall of lists using a Computer-Aided Instruction (CAI) package. Improvement in list-learning after CAI training was not statistically different from a separate group of 218 elderly subjects who received instruction from a trainer in a normal classroom situation. Improvement in the CAI group was significantly related to higher scores on the Openness to Experience subscale of the NEO-Personality Inventory. CAI devices for memory training in the elderly may find a place in training selected elders on specific areas of memory loss.

  15. Analysis of airways in computed tomography

    DEFF Research Database (Denmark)

    Petersen, Jens

    Chronic Obstructive Pulmonary Disease (COPD) is a major cause of death and disability world-wide. It affects lung function through destruction of lung tissue known as emphysema and inflammation of airways, leading to thickened airway walls and narrowed airway lumen. Computed Tomography (CT) imaging...... have become the standard with which to assess emphysema extent but airway abnormalities have so far been more challenging to quantify. Automated methods for analysis are indispensable as the visible airway tree in a CT scan can include several hundreds of individual branches. However, automation...... the Danish Lung Cancer Screening Trial. This includes methods for extracting airway surfaces from the images and ways of achieving comparable measurements in airway branches through matching and anatomical labelling. The methods were used to study effects of differences in inspiration level at the time

  16. Analysis of mesenteric thickening on computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Takano, Hideyuki; Sekiya, Tohru; Miyakawa, Kunihisa; Ozaki, Masatoki; Katsuyama, Naofumi; Nakano, Masao (University of the Ryukyu, Okinawa (Japan). School of Medicine)

    1990-12-01

    Computed tomography (CT) provides noninvasive information in the evaluation of abnormalities of the gastrointestinal tract by direct imaging of the bowel wall and adjacent mesentery. Several prior studies have discussed the variable CT appearances of mesenteric abnormalities, such as lymphoma, metastasis, inflammatory disease and edema. Although mesenteric thickening was mentioned in these studies, no study has provided a detailed analysis of the CT appearance of the thickened mesentery. Two characteristic types of mesenteric thickening were identified in 47 patients. Type I is 'intramesenteric thickening', which was noted in 25 patients with vascular obstruction, inflammatory disease and edema. Type II is 'mesenteric surface thickening', which was noted in 22 patients with peritonitis carcinomatosa, peritoneal mesothelioma, tuberculous peritonitis and pseudomyxoma peritonei. An understanding of these two types of mesenteric diseases is important in the identification of mesenteric pathology. (author).

  17. Analysis of mesenteric thickening on computed tomography

    International Nuclear Information System (INIS)

    Takano, Hideyuki; Sekiya, Tohru; Miyakawa, Kunihisa; Ozaki, Masatoki; Katsuyama, Naofumi; Nakano, Masao

    1990-01-01

    Computed tomography (CT) provides noninvasive information in the evaluation of abnormalities of the gastrointestinal tract by direct imaging of the bowel wall and adjacent mesentery. Several prior studies have discussed the variable CT appearances of mesenteric abnormalities, such as lymphoma, metastasis, inflammatory disease and edema. Although mesenteric thickening was mentioned in these studies, no study has provided a detailed analysis of the CT appearance of the thickened mesentery. Two characteristic types of mesenteric thickening were identified in 47 patients. Type I is 'intramesenteric thickening', which was noted in 25 patients with vascular obstruction, inflammatory disease and edema. Type II is 'mesenteric surface thickening', which was noted in 22 patients with peritonitis carcinomatosa, peritoneal mesothelioma, tuberculous peritonitis and pseudomyxoma peritonei. An understanding of these two types of mesenteric diseases is important in the identification of mesenteric pathology. (author)

  18. Classification and Analysis of Computer Network Traffic

    DEFF Research Database (Denmark)

    Bujlow, Tomasz

    2014-01-01

    Traffic monitoring and analysis can be done for multiple different reasons: to investigate the usage of network resources, assess the performance of network applications, adjust Quality of Service (QoS) policies in the network, log the traffic to comply with the law, or create realistic models [...] for traffic classification, which can be used for nearly real-time processing of large amounts of data using affordable CPU and memory resources. Other questions are related to methods for real-time estimation of the application Quality of Service (QoS) level based on the results obtained by the traffic [...] to create realistic traffic profiles of the selected applications, which can serve as the training data for MLAs. We assessed the usefulness of the C5.0 Machine Learning Algorithm (MLA) in the classification of computer network traffic. We showed that the application-layer payload is not needed to train the C5 [...]

  19. Intelligent Computer-Assisted Instruction: A Review and Assessment of ICAI Research and Its Potential for Education.

    Science.gov (United States)

    Dede, Christopher J.; And Others

    The first of five sections in this report places intelligent computer-assisted instruction (ICAI) in its historical context through discussions of traditional computer-assisted instruction (CAI) linear and branching programs; TICCIT and PLATO IV, two CAI demonstration projects funded by the National Science Foundation; generative programs, the…

  20. CAIS/ACSI 2001: Beyond the Web: Technologies, Knowledge and People.

    Science.gov (United States)

    Canadian Journal of Information and Library Science, 2000

    2000-01-01

    Presents abstracts of papers presented at the 29th Annual Conference of the Canadian Association for Information Science (CAIS) held in Quebec on May 27-29, 2001. Topics include: professional development; librarian/library roles; information technology uses; virtual libraries; information seeking behavior; literacy; information retrieval;…

  1. Calcium-aluminum-rich inclusions with fractionation and unknown nuclear effects (FUN CAIs)

    DEFF Research Database (Denmark)

    Krot, Alexander N.; Nagashima, Kazuhide; Wasserburg, Gerald J.

    2014-01-01

    We present a detailed characterization of the mineralogy, petrology, and oxygen isotopic compositions of twelve FUN CAIs, including C1 and EK1-4-1 from Allende (CV), that were previously shown to have large isotopic fractionation patterns for magnesium and oxygen, and large isotopic anomalies [...] in GG#3)), define mass-dependent fractionation lines with a similar slope of ~0.5. The different inclusions have different δ17O values ranging from ~-25‰ to ~-16‰. Melilite and plagioclase in the CV FUN CAIs have 16O-poor compositions (δ17O ~-3‰) and plot near the intercept of the Allende CAI line [...] and gas-melt oxygen-isotope exchange in a 16O-poor gaseous reservoir that resulted in crystallization of 16O-depleted fassaite, melilite and plagioclase. The final oxygen isotopic compositions of melilite and plagioclase in the CV FUN CAIs may have been established on the CV parent asteroid as a result [...]

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
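
    The two-part framework described above couples a probabilistic demand model with resource constraints. The sketch below is a minimal discrete event simulation in that spirit, written with Python's standard library rather than any specific simulation package used by the authors; the arrival rate, service time and server-pool size are purely illustrative.

      import heapq, random

      random.seed(0)
      ARRIVAL_RATE = 5.0      # service requests per unit time (illustrative)
      SERVICE_TIME = 0.5      # mean service time per request (illustrative)
      SERVERS      = 3        # size of the virtualized server pool
      N_REQUESTS   = 10_000

      def simulate():
          """Minimal discrete event simulation of requests queuing for a server pool."""
          t, free, queue, waits, events = 0.0, SERVERS, [], [], []
          for _ in range(N_REQUESTS):                   # pre-generate Poisson arrivals
              t += random.expovariate(ARRIVAL_RATE)
              heapq.heappush(events, (t, 'arrival'))
          while events:
              now, kind = heapq.heappop(events)
              if kind == 'arrival':
                  queue.append(now)
              else:                                     # a server finished its request
                  free += 1
              while free and queue:                     # start waiting requests on free servers
                  arrived = queue.pop(0)
                  waits.append(now - arrived)
                  free -= 1
                  done = now + random.expovariate(1.0 / SERVICE_TIME)
                  heapq.heappush(events, (done, 'departure'))
          return sum(waits) / len(waits)

      print(f'mean wait: {simulate():.3f} time units')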

  3. Computing in Qualitative Analysis: A Healthy Development?

    Science.gov (United States)

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  4. Computational analysis of heat flow in computer casing

    Science.gov (United States)

    Nor Azwadi, C. S.; Goh, C. K.; Afiq Witri, M. Y.

    2012-06-01

    Reliability of a computer system is directly related to its thermal management system. This is due to the fact that poor thermal management leads to high temperatures throughout hardware components, resulting in poor performance and reduced fatigue life of the package. Therefore, good cooling solutions (heat sink, fan) and proper form factor design (expandability, interchangeability of parts) are necessary to provide good thermal management in a computer system. The performance of Advanced Technology Extended (ATX) and its proposed successor, Balanced Technology Extended (BTX), was compared to investigate the aforementioned factors. Simulations were conducted by using ANSYS software. Results obtained from simulations were compared with values in the datasheet obtained from manufacturers for validation purposes, and it was discovered that there are more chaotic regions in the flow profile for the ATX form factor. In contrast, the BTX form factor yields a straighter flow profile. Based on the results, we can conclude that the BTX form factor has better cooling capability than its predecessor, ATX, due to the improved layout of the BTX form factor. This change enables the BTX form factor to be used with more advanced components that dissipate greater amounts of heat, and also improves the acoustic performance of BTX by reducing the number of fans needed to just one.

  5. Can cloud computing benefit health services? - a SWOT analysis.

    Science.gov (United States)

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare.

  6. Computer simulation, nuclear techniques and surface analysis

    Directory of Open Access Journals (Sweden)

    Reis, A. D.

    2010-02-01

    This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The “energy method of analysis” for nuclear reactions is used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. Details of prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of 12C and 18O nuclei in thick targets, by deuteron (d,p) and proton (p,α) induced reactions, respectively.


  7. Interactive Graphics in CAD/CAI in Chemical Engineering.

    Science.gov (United States)

    Lewin, D. R.

    This paper describes the development of a software program which incorporates interactive graphics techniques into a teaching and research environment at the Department of Chemical Engineering, Technion, Israel, and the experience of transferring the software from mainframe to personal computer (PC) operating systems at the California Institute of…

  8. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and their applications in handling real-life problems. The applications are mostly drawn from real-life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  9. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  10. A Comparison of Computer-Assisted Instruction and Tutorials in Hematology and Oncology.

    Science.gov (United States)

    Garrett, T. J.; And Others

    1987-01-01

    A study comparing the effectiveness of computer-assisted instruction (CAI) and small group instruction found no significant difference in medical student achievement in oncology but higher achievement through small-group instruction in hematology. Students did not view CAI as more effective, but saw it as a supplement to traditional methods. (MSE)

  11. DLI-IBM Joint Feasibility Study in Computer-Assisted Foreign Language Instruction. Final Report.

    Science.gov (United States)

    Adams, Edward N.; Rosenbaum, Peter S.

    This document is the final report on a study of the use of computer assisted instruction (CAI). The objective of the study was to evaluate the potential applicability and usefulness of CAI in the instructional environment of the Defense Language Institute (DLI). The operational phases of the study were implemented in the Russian Aural…

  12. Secondary School Students' Attitudes towards Mathematics Computer--Assisted Instruction Environment in Kenya

    Science.gov (United States)

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.

    2012-01-01

    This paper reports the results of research conducted in six classes (Form IV) with 205 students with a sample of 94 respondents. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) effectiveness of CAI in Mathematics instruction. The results indicated…

  13. Computer-Assisted Instruction: A Case Study of Two Charter Schools

    Science.gov (United States)

    Keengwe, Jared; Hussein, Farhan

    2013-01-01

    The purpose of this study was to examine the relationship in achievement gap between English language learners (ELLs) utilizing computer-assisted instruction (CAI) in the classroom, and ELLs relying solely on traditional classroom instruction. The study findings showed that students using CAI to supplement traditional lectures performed better…

  14. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    2008-06-01

    A prominent feature of Parkinson's disease (PD is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment in PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.

  15. Computational Functional Analysis of Lipid Metabolic Enzymes.

    Science.gov (United States)

    Bagnato, Carolina; Have, Arjen Ten; Prados, María B; Beligni, María V

    2017-01-01

    The computational analysis of enzymes that participate in lipid metabolism has both common and unique challenges when compared to the whole protein universe. Some of the hurdles that interfere with the functional annotation of lipid metabolic enzymes that are common to other pathways include the definition of proper starting datasets, the construction of reliable multiple sequence alignments, the definition of appropriate evolutionary models, and the reconstruction of phylogenetic trees with high statistical support, particularly for large datasets. Most enzymes that take part in lipid metabolism belong to complex superfamilies with many members that are not involved in lipid metabolism. In addition, some enzymes that do not have sequence similarity catalyze similar or even identical reactions. Some of the challenges that, albeit not unique, are more specific to lipid metabolism refer to the high compartmentalization of the routes, the catalysis in hydrophobic environments and, related to this, the function near or in biological membranes. In this work, we provide guidelines intended to assist in the proper functional annotation of lipid metabolic enzymes, based on previous experiences related to the phospholipase D superfamily and the annotation of the triglyceride synthesis pathway in algae. We describe a pipeline that starts with the definition of an initial set of sequences to be used in similarity-based searches and ends in the reconstruction of phylogenies. We also mention the main issues that have to be taken into consideration when using tools to analyze subcellular localization, hydrophobicity patterns, or presence of transmembrane domains in lipid metabolic enzymes.

  16. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  17. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous as they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As only a few researchers in that field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  18. Recursive Harmonic Analysis for Computing Hansen Coefficients

    OpenAIRE

    Sharaf, Mohamed Adel; Selim, Hadia Hassan

    2010-01-01

    This paper reports on a simple, purely numerical method developed for computing Hansen coefficients by using a recursive harmonic analysis technique. The precision criteria of the computations are very satisfactory and provide materials for computing Hansen's and Hansen-like expansions, and also for checking the accuracy of some existing algorithms.
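
    The paper's recursive scheme is not reproduced here, but the quantity itself can be illustrated. Assuming the usual definition of the Hansen coefficient X_k^{n,m}(e) as a Fourier coefficient of (r/a)^n exp(i m v) with respect to the mean anomaly, the Python sketch below evaluates it by direct numerical quadrature, with Kepler's equation solved by Newton iteration; step counts and test values are arbitrary.

      import math

      def kepler_E(M, e, tol=1e-14):
          """Solve Kepler's equation M = E - e*sin(E) for E by Newton iteration."""
          E = M if e < 0.8 else math.pi
          for _ in range(50):
              dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
              E -= dE
              if abs(dE) < tol:
                  break
          return E

      def hansen(n, m, k, e, steps=2000):
          """X_k^{n,m}(e) = (1/pi) * Int_0^pi (r/a)^n * cos(m*v - k*M) dM (trapezoid rule)."""
          total = 0.0
          for i in range(steps + 1):
              M = math.pi * i / steps
              E = kepler_E(M, e)
              r_over_a = 1.0 - e * math.cos(E)
              v = 2.0 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                                   math.sqrt(1 - e) * math.cos(E / 2))
              f = r_over_a ** n * math.cos(m * v - k * M)
              total += (0.5 if i in (0, steps) else 1.0) * f
          return total / steps

      # Sanity checks: X_0^{0,0} = 1 exactly; X_0^{2,0}(e) = 1 + 1.5*e^2 = 1.135 for e = 0.3.
      print(hansen(0, 0, 0, 0.3))
      print(hansen(2, 0, 0, 0.3))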

  19. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives; people use computers for convenience in their own lives, but at the same time many network information security problems demand attention. This paper analyzes the information security of computer networks based on "big data" analysis, and puts forward some solutions.

  20. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  1. Multiple Nebular Gas Reservoirs Recorded by Oxygen Isotope Variation in a Spinel-rich CAI in CO3 MIL 090019

    Science.gov (United States)

    Simon, J. I.; Simon, S. B.; Nguyen, A. N.; Ross, D. K.; Messenger, S.

    2017-01-01

    We conducted NanoSIMS O-isotopic imaging of a primitive spinel-rich CAI spherule (27-2) from the MIL 090019 CO3 chondrite. Inclusions such as 27-2 are proposed to record inner nebula processes during an epoch of rapid solar nebula evolution. Mineralogical and textural analyses suggest that this CAI formed by high temperature reactions, partial melting, and condensation. This CAI exhibits radial O-isotopic heterogeneity among multiple occurrences of the same mineral, reflecting interactions with distinct nebular O-isotopic reservoirs.

  2. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    Usage of the computer code MLCOSP (Multiple Correlation and Spectrum) is described for a hybrid computer installed in JAERI. Functions of the hybrid computer and its terminal devices are utilized ingeniously in the code to reduce the complexity of the data handling which occurs in analysis of multivariable experimental data and to perform the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed in figures, and hardcopies are taken when necessary; series messages from the code are shown on the terminal, so man-machine communication is possible; and further, data can be put in through a keyboard, so case studies according to the results of analysis are possible. (auth.)

  3. ANACROM - A computer code for chromatogram analysis

    International Nuclear Information System (INIS)

    Gouveia, A.S. de; Mesquita, C.H. de.

    1981-01-01

    The computer code was developed for automatic peak searching and evaluation of chromatogram parameters such as: center, height, area, full width at half maximum (FWHM) and the ratio FWHM/center of each peak. (Author) [pt
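
    As a hedged illustration of the parameters listed above (not the ANACROM code itself), the following Python sketch estimates center, height, area, FWHM and the FWHM/center ratio for a single synthetic chromatogram peak; the data and the simple baseline handling are invented for the example.

      import numpy as np

      # Synthetic chromatogram: one Gaussian peak on a flat baseline (illustrative data).
      t = np.linspace(0, 10, 2001)
      signal = 100.0 * np.exp(-0.5 * ((t - 4.2) / 0.3) ** 2) + 2.0

      def peak_parameters(t, y):
          """Center, height, area, FWHM and FWHM/center of the largest peak (baseline-corrected)."""
          yc = y - np.median(y)                             # crude baseline correction
          i_max = int(np.argmax(yc))
          center, height = t[i_max], yc[i_max]
          area = float(np.sum((yc[1:] + yc[:-1]) * np.diff(t)) / 2.0)   # trapezoid rule
          above = np.where(yc >= height / 2.0)[0]           # samples above half height
          fwhm = t[above[-1]] - t[above[0]]
          return center, height, area, fwhm, fwhm / center

      c, h, a, w, ratio = peak_parameters(t, signal)
      print(f'center={c:.2f}, height={h:.1f}, area={a:.1f}, FWHM={w:.3f}, FWHM/center={ratio:.4f}')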

  4. Computer science: Data analysis meets quantum physics

    Science.gov (United States)

    Schramm, Steven

    2017-10-01

    A technique that combines machine learning and quantum computing has been used to identify the particles known as Higgs bosons. The method could find applications in many areas of science. See Letter p.375

  5. Schottky signal analysis: tune and chromaticity computation

    CERN Document Server

    Chanon, Ondine

    2016-01-01

    Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, after explaining the different components of a Schottky signal, an algorithm to compute the betatron tune is presented, followed by some ideas to compute machine chromaticity. The tests have been performed with offline and/or online LHC data.
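
    The report's own algorithm is not reproduced here; as a generic sketch of the underlying idea, the fractional betatron tune can be estimated as the dominant line in the spectrum of a turn-by-turn signal. The Python fragment below does this on synthetic data; the tune value, noise level and windowing choice are illustrative assumptions.

      import numpy as np

      N_TURNS = 4096
      TRUE_TUNE = 0.31                               # fractional tune used to synthesize data
      turns = np.arange(N_TURNS)
      signal = np.cos(2 * np.pi * TRUE_TUNE * turns) + 0.5 * np.random.randn(N_TURNS)

      def fractional_tune(x):
          """Estimate the fractional tune as the frequency of the strongest spectral line."""
          x = x - x.mean()
          window = np.hanning(len(x))                # reduce spectral leakage
          spectrum = np.abs(np.fft.rfft(x * window))
          freqs = np.fft.rfftfreq(len(x), d=1.0)     # in units of 1/turn
          return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

      print(f'estimated tune: {fractional_tune(signal):.4f}')   # expected to be close to 0.31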

  6. Analysis On Security Of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Muhammad Zunnurain Hussain

    2017-01-01

    In this paper the author discusses the security issues and challenges faced by the industry in securing cloud computing and how these problems can be tackled. Cloud computing is a modern technique of sharing resources, such as data and file sharing, without launching one's own infrastructure, instead using third-party resources to avoid huge investment. It is very challenging these days to secure the communication between two users, although people use different encryption techniques.

  7. Granular computing analysis and design of intelligent systems

    CERN Document Server

    Pedrycz, Witold

    2013-01-01

    Information granules, as encountered in natural language, are implicit in nature. To make them fully operational so they can be effectively used to analyze and design intelligent systems, information granules need to be made explicit. An emerging discipline, granular computing focuses on formalizing information granules and unifying them to create a coherent methodological and developmental environment for intelligent system design and analysis. Granular Computing: Analysis and Design of Intelligent Systems presents the unified principles of granular computing along with its comprehensive algo

  8. Run 2 analysis computing for CDF and D0

    International Nuclear Information System (INIS)

    Fuess, S.

    1995-11-01

    Two large experiments at the Fermilab Tevatron collider will use upgraded detectors for the coming period of running. The associated analysis software is also expected to change, both to account for higher data rates and to embrace new computing paradigms. A discussion is given of the problems facing current and future High Energy Physics (HEP) analysis computing, and several issues are explored in detail.

  9. Current status of uncertainty analysis methods for computer models

    International Nuclear Information System (INIS)

    Ishigami, Tsutomu

    1989-11-01

    This report surveys several existing uncertainty analysis methods for estimating computer output uncertainty caused by input uncertainties, illustrating application examples of those methods to three computer models, MARCH/CORRAL II, TERFOC and SPARC. Merits and limitations of the methods are assessed in the application, and recommendation for selecting uncertainty analysis methods is provided. (author)

  10. Computer-Assisted Statistical Analysis: Mainframe or Microcomputer.

    Science.gov (United States)

    Shannon, David M.

    1993-01-01

    Describes a study that was designed to examine whether the computer attitudes of graduate students in a beginning statistics course differed based on their prior computer experience and the type of statistical analysis package used. Versions of statistical analysis packages using a mainframe and a microcomputer are compared. (14 references) (LRW)

  11. The Analysis of Some Contemporary Computer Mikrosystems

    Directory of Open Access Journals (Sweden)

    Angelė Kaulakienė

    2011-04-01

    In every language a twofold process can be observed: 1) a huge surge of new terms and 2) a big part of these new terms making their way into the common language. The nucleus of the vocabulary and the grammatical system of the common language make up the essence of a language and its national originality. Because of such intensive development, in the future terminological lexis can become a basis of the common language, and it ought to be not a spontaneously formed sum of terminological lexis, but an entirety of consciously created terms which meet the requirements of language, logic and terminology. Computer terminology, by comparison with the terminology of other fields, is being created in a slightly unusual way. The first computation institutions in Lithuania were established in the early sixties and a decade later there were a few computation centres and a number of key-operated and punch machines working. Together with the new computational technology many new devices, units, parts, phenomena and characteristics appeared, which needed naming. Specialists faced an obvious shortage of Lithuanian terms for computing equipment. In 1971 this gap was partly filled by „Rusų-lietuvių-anglų kalbų skaičiavimo technikos žodynas“ (Russian-Lithuanian-English dictionary of computing equipment), which for a long time (more than 20 years) was the only terminological dictionary of this field. Only during the nineties did a few dictionaries of different scope appear. Computer terminology from the ten dictionaries which are presently available shows that the 35-year period of computer terminology is a stage of its creation, the main features of which are reasonable synonymy (when both international and Lithuanian terms are being used to name the concept) and variability. Such a state of Lithuanian computer terminology is predetermined by linguistic, interlinguistic and sociolinguistic factors. At present in Lithuania terminological dictionaries of various fields are being given to...

  12. Numerical investigation of CAI Combustion in the Opposed- Piston Engine with Direct and Indirect Water Injection

    Science.gov (United States)

    Pyszczek, R.; Mazuro, P.; Teodorczyk, A.

    2016-09-01

    This paper is focused on CAI combustion control in a turbocharged 2-stroke Opposed-Piston (OP) engine. The barrel-type OP engine arrangement is of particular interest to the authors because of its robust design, high mechanical efficiency and relatively easy incorporation of a Variable Compression Ratio (VCR). The other advantage of such a design is that the combustion chamber is formed between two moving pistons - there is no additional cylinder head to be cooled, which directly results in increased thermal efficiency. Furthermore, engine operation in a Controlled Auto-Ignition (CAI) mode at high compression ratios (CR) raises the possibility of reaching even higher efficiencies and very low emissions. In order to control CAI combustion, such measures as VCR and water injection were considered for indirect ignition timing control. Numerical simulations of the scavenging and combustion processes were performed with the 3D CFD multipurpose AVL Fire solver. Numerous cases were calculated with different engine compression ratios and different amounts of directly and indirectly injected water. The influence of the VCR and water injection on the ignition timing and engine performance was determined, and their application in the real engine was discussed.

  13. Computational Analysis of SAXS Data Acquisition.

    Science.gov (United States)

    Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S

    2015-09-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals.
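
    As a simple illustration of the forward problem described above (a histogram sketch, not the recursive spherical-Bessel computation developed in the paper), the Python fragment below computes the pair distribution function of a toy structure of known density: points drawn uniformly inside a sphere. The radius, point count and bin count are arbitrary.

      import numpy as np

      rng = np.random.default_rng(1)
      R, n_points = 10.0, 1000                       # toy "structure": uniform sphere of radius R
      pts = []
      while len(pts) < n_points:                     # rejection sampling inside the sphere
          p = rng.uniform(-R, R, size=3)
          if np.dot(p, p) <= R * R:
              pts.append(p)
      pts = np.array(pts)

      def pair_distribution(points, n_bins=100):
          """Histogram of all pairwise distances: the pair distribution function p(r)."""
          diffs = points[:, None, :] - points[None, :, :]
          d = np.linalg.norm(diffs, axis=-1)
          d = d[np.triu_indices_from(d, k=1)]        # each pair counted once
          hist, edges = np.histogram(d, bins=n_bins, range=(0, d.max()), density=True)
          return 0.5 * (edges[:-1] + edges[1:]), hist

      r, p_r = pair_distribution(pts)
      print(f'p(r) peaks near r = {r[np.argmax(p_r)]:.2f} (maximum possible distance {2 * R:.1f})')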

  14. Affect and Learning : a computational analysis

    NARCIS (Netherlands)

    Broekens, Douwe Joost

    2007-01-01

    In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation

  15. Computed Tomography Analysis of NASA BSTRA Balls

    Energy Technology Data Exchange (ETDEWEB)

    Perry, R L; Schneberk, D J; Thompson, R R

    2004-10-12

    Fifteen 1.25 inch BSTRA balls were scanned with the high energy computed tomography system at LLNL. This system has a resolution limit of approximately 210 microns. A threshold of 238 microns (two voxels) was used, and no anomalies at or greater than this were observed.

  16. Cloud Computing for Rigorous Coupled-Wave Analysis

    Directory of Open Access Journals (Sweden)

    N. L. Kazanskiy

    2012-01-01

    Design and analysis of complex nanophotonic and nanoelectronic structures require significant computing resources. Cloud computing infrastructure allows distributed parallel applications to achieve greater scalability and fault tolerance. The problems of effective use of high-performance computing systems for modeling and simulation of subwavelength diffraction gratings are considered. Rigorous coupled-wave analysis (RCWA) is adapted to a cloud computing environment. In order to accomplish this, the data flow of the RCWA is analyzed and CPU-intensive operations are converted to data-intensive operations. The generated data sets are structured in accordance with the requirements of MapReduce technology.

  17. Computer analysis of regional cerebral perfusion: a critical clinical analysis

    International Nuclear Information System (INIS)

    Ronai, P.M.; O'Reilly, R.J.; Collins, P.J.

    1975-01-01

    An advanced computer program has been developed for the analysis of regional cerebral perfusion data obtained by the scintillation camera during 99mTc-pertechnetate cerebral radioangiography. The program evaluates and corrects detector non-uniformity; selects (using statistical criteria) three regions of interest (ROIs) corresponding to the distribution of the left and right middle cerebral arteries and both anterior cerebral arteries; measures the areas of the ROIs and plots total corrected counts per unit ROI area per unit time against time for each ROI. The significance of differences in perfusion between ROIs is assessed and related to a normal population distribution. A critical clinical evaluation of this program was undertaken on 686 patients who had cerebral radioangiograms during 1970 and 1971. Final diagnoses were determined from case records after a follow-up period of two years or more. Original digital data were reprocessed and original analogue dynamic scintiphotos were re-reported without knowledge of the clinical data or digital results. Analogue reports were compared with the results of digital analysis using two different statistical methods to analyse the digital results. Neither method of digital analysis gave any increase in the yield of true positive findings while one method of analysis substantially increased the yield of false-positive findings. Comparison of these findings with those of a similar study carried out in 1971 shows a dramatic improvement in the accuracy of analogue results in the present study compared with the earlier one, but no change in the accuracy of digital results. Observer 'education' by continued exposure to digital data at routine reporting sessions in the intervening three years between the two studies is suggested as the main cause of the improvement in analogue results. Attention is also drawn to the importance of good detector uniformity characteristics for accurate analogue reporting and to the fact that

  18. Rare Earth Element Measurements of Melilite and Fassaite in Allende Cai by Nanosims

    Science.gov (United States)

    Ito, M.; Messenger, Scott

    2009-01-01

    The rare earth elements (REEs) are concentrated in CAIs by approx. 20 times the chondritic average [e.g., 1]. The REEs in CAIs are important to understand processes of CAI formation including the role of volatilization, condensation, and fractional crystallization [1,2]. REE measurements are a well established application of ion microprobes [e.g., 3]. However, the spatial resolution of REE measurements by ion microprobe (approx. 20 μm) is not adequate to resolve heterogeneous distributions of REEs among/within minerals. We have developed methods for measuring REE with the NanoSIMS 50L at smaller spatial scales. Here we present our initial measurements of REEs in melilite and fassaite in an Allende Type-A CAI with the JSC NanoSIMS 50L. We found that the key parameters for accurate REE abundance measurements differ between the NanoSIMS and conventional SIMS, in particular the oxide-to-element ratios, the relative sensitivity factors, the energy distributions, and requisite energy offset. Our REE abundance measurements of the 100 ppm REE diopside glass standards yielded good reproducibility and accuracy, 0.5-2.5 % and 5-25 %, respectively. We determined abundances and spatial distributions of REEs in core and rim within single crystals of fassaite, and adjacent melilite, with 5-10 μm spatial resolution. The REE abundances in fassaite core and rim are 20-100 times CI abundance but show a large negative Eu anomaly, exhibiting a well-defined Group III pattern. This is consistent with previous work [4]. On the other hand, adjacent melilite shows a modified Group II pattern with no strong depletions of Eu and Yb, and no Tm positive anomaly. REE abundances (2-10 x CI) were lower than those of fassaite. These patterns suggest that fassaite crystallized first, followed by crystallization of melilite from the residual melt. In future work, we will carry out a correlated study of O and Mg isotopes and REEs of the CAI in order to better understand the nature and timescales of its

  19. Interface between computational fluid dynamics (CFD) and plant analysis computer codes

    International Nuclear Information System (INIS)

    Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.

    1993-01-01

    Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will directly contribute to modeling and accuracy improvements throughout the plant system and should result in significant reduction of design conservatisms that have been applied to such analyses in the past

  20. Computational analysis of ozonation in bubble columns

    International Nuclear Information System (INIS)

    Quinones-Bolanos, E.; Zhou, H.; Otten, L.

    2002-01-01

    This paper presents a new computational ozonation model based on the principle of computational fluid dynamics along with the kinetics of ozone decay and microbial inactivation to predict the performance of ozone disinfection in fine bubble columns. The model can be represented using a mixture two-phase flow model to simulate the hydrodynamics of the water flow and using two transport equations to track the concentration profiles of ozone and microorganisms along the height of the column, respectively. The applicability of this model was then demonstrated by comparing the simulated ozone concentrations with experimental measurements obtained from a pilot scale fine bubble column. One distinct advantage of this approach is that it does not require the prerequisite assumptions such as plug flow condition, perfect mixing, tanks-in-series, uniform radial or longitudinal dispersion in predicting the performance of disinfection contactors without carrying out expensive and tedious tracer studies. (author)

  1. Computer aided radiation analysis for manned spacecraft

    Science.gov (United States)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background information for the use of the technique. The method is based on the Boeing radiation exposure model (BREM) for combining NASA radiation transport codes and CAD facilities, and the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design which saves time and ultimately spacecraft weight.

  2. Advances in Computer-Based Autoantibodies Analysis

    Science.gov (United States)

    Soda, Paolo; Iannello, Giulio

    Indirect Immunofluorescence (IIF) imaging is the recommended method to detect autoantibodies in patient serum, whose common markers are antinuclear autoantibodies (ANA) and autoantibodies directed against double strand DNA (anti-dsDNA). Since the availability of accurately performed and correctly reported laboratory determinations is crucial for the clinicians, an evident medical demand is the development of Computer Aided Diagnosis (CAD) tools supporting physicians' decisions.

  3. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  4. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for this purpose. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  5. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of the computational physicists utilizing high performance computing are analyzed by bibliometric approaches. This study aims at providing the computational physicists utilizing high-performance computing and policy planners with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers for high-performance computational physics as a case study. For this study, we used journal articles of the Scopus database from Elsevier covering the time period of 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during ten years from 2004. Finally, we drew the co-authorship network for 45 top-authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.

  6. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
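
    The computational independence of resampling iterations mentioned above is what makes these analyses easy to parallelize. The sketch below illustrates the idea with a permutation test split across local worker processes using Python's multiprocessing module; it stands in for, and is much simpler than, a real cluster or cloud deployment, and the two-group data are synthetic.

      import numpy as np
      from multiprocessing import Pool

      rng = np.random.default_rng(0)
      # Synthetic "expression" data: 30 samples, two groups of 15 (illustrative).
      group_a = rng.normal(0.0, 1.0, 15)
      group_b = rng.normal(0.5, 1.0, 15)
      observed = group_a.mean() - group_b.mean()
      pooled = np.concatenate([group_a, group_b])

      def permutation_batch(args):
          """One worker: count permuted statistics at least as extreme as the observed one."""
          seed, n_perm = args
          local = np.random.default_rng(seed)
          count = 0
          for _ in range(n_perm):
              shuffled = local.permutation(pooled)
              stat = shuffled[:15].mean() - shuffled[15:].mean()
              count += abs(stat) >= abs(observed)
          return count

      if __name__ == '__main__':
          n_workers, per_worker = 4, 2500            # 10 000 permutations in total
          with Pool(n_workers) as pool:
              counts = pool.map(permutation_batch, [(s, per_worker) for s in range(n_workers)])
          p_value = sum(counts) / (n_workers * per_worker)
          print(f'permutation p-value: {p_value:.4f}')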

  7. Computational morphology a computational geometric approach to the analysis of form

    CERN Document Server

    Toussaint, GT

    1988-01-01

    Computational Geometry is a new discipline of computer science that deals with the design and analysis of algorithms for solving geometric problems. There are many areas of study in different disciplines which, while being of a geometric nature, have as their main component the extraction of a description of the shape or form of the input data. This notion is more imprecise and subjective than pure geometry. Such fields include cluster analysis in statistics, computer vision and pattern recognition, and the measurement of form and form-change in such areas as stereology and developmental biolo

  8. Rancangan Perangkat Lunak Computer Assisted Instruction (CAI) Untuk Ilmu Tajwid Berbasis Web

    OpenAIRE

    Fenny Purwani

    2015-01-01

    The development of information technology and science refer to the need of teching-learning concept and mechanism wich are based on information technology, undoubtedly. Regarding the development, it needs qualified human resources and flexible material changing and it should be appropriate with technology and science development. Additionaly, this combines between education based on religious and techology (IMTAK and IPTEK). Internet techology can be used as teaching tool which is known as Co...

  9. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  10. Computer Programme for the Dynamic Analysis of Tall Regular ...

    African Journals Online (AJOL)

    The traditional method of dynamic analysis of tall rigid frames assumes the shear frame model. Models that allow joint rotations with/without the inclusion of the column axial loads give improved results but pose much more computational difficulty. In this work a computer program Natfrequency that determines the dynamic ...

  11. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    This paper described the mathematical basis and computational framework of a computer program developed for short circuit studies of electric power systems. The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric ...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  13. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a type of multivariate statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion about factor analysis (FA) in image retrieval methods and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include the various attacks on the network, such as DDoS attacks and network scanning.
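
    As a minimal sketch of the correlation-coefficient-matrix idea for traffic anomaly detection (the feature set, the synthetic traffic generator and the drift score are invented for illustration, not taken from the paper), one can compare the correlation structure of a current traffic window against a baseline window:

      import numpy as np

      rng = np.random.default_rng(2)

      def traffic_window(n, scanning=False):
          """Synthetic per-window features: packet count, byte count, distinct destination ports."""
          packets = rng.normal(1000, 50, n)
          bytes_ = packets * 800 + rng.normal(0, 5000, n)
          if scanning:                               # port scan: distinct ports track packet count
              ports = packets * 0.4 + rng.normal(0, 5, n)
          else:                                      # normal traffic: few, roughly constant ports
              ports = rng.normal(20, 3, n)
          return np.column_stack([packets, bytes_, ports])

      baseline = np.corrcoef(traffic_window(500), rowvar=False)
      current  = np.corrcoef(traffic_window(500, scanning=True), rowvar=False)

      # Anomaly score: how far the current correlation structure drifts from the baseline.
      drift = np.abs(current - baseline).sum()
      print(f'correlation drift score: {drift:.2f} (values near 0 suggest normal-looking traffic)')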

  14. OXYGEN ISOTOPIC COMPOSITIONS OF THE ALLENDE TYPE C CAIs: EVIDENCE FOR ISOTOPIC EXCHANGE DURING NEBULAR MELTING AND ASTEROIDAL THERMAL METAMORPHISM

    Energy Technology Data Exchange (ETDEWEB)

    Krot, A N; Chaussidon, M; Yurimoto, H; Sakamoto, N; Nagashima, K; Hutcheon, I D; MacPherson, G J

    2008-02-21

    Based on the mineralogy and petrography, coarse-grained, igneous, anorthite-rich (Type C) calcium-aluminum-rich inclusions (CAIs) in the CV3 carbonaceous chondrite Allende have been recently divided into three groups: (i) CAIs with melilite and Al,Ti-diopside of massive and lacy textures (coarse grains with numerous rounded inclusions of anorthite) in a fine-grained anorthite groundmass (6-1-72, 100, 160), (ii) CAI CG5 with massive melilite, Al,Ti-diopside and anorthite, and (iii) CAIs associated with chondrule material: either containing chondrule fragments in their peripheries (ABC, TS26) or surrounded by chondrule-like, igneous rims (93) (Krot et al., 2007a,b). Here, we report in situ oxygen isotopic measurements of primary (melilite, spinel, Al,Ti-diopside, anorthite) and secondary (grossular, monticellite, forsterite) minerals in these CAIs. Spinel ({Delta}{sup 17}O = -25{per_thousand} to -20{per_thousand}), massive and lacy Al,Ti-diopside ({Delta}{sup 17}O = -20{per_thousand} to -5{per_thousand}) and fine-grained anorthite ({Delta}{sup 17}O = -15{per_thousand} to -2{per_thousand}) in 100, 160 and 6-1-72 are {sup 16}O-enriched relative spinel and coarse-grained Al,Ti-diopside and anorthite in ABC, 93 and TS26 ({Delta}{sup 17}O ranges from -20{per_thousand} to -15{per_thousand}, from -15{per_thousand} to -5{per_thousand}, and from -5{per_thousand} to 0{per_thousand}, respectively). In 6-1-72, massive and lacy Al,Ti-diopside grains are {sup 16}O-depleted ({Delta}{sup 17}O {approx} -13{per_thousand}) relative to spinel ({Delta}{sup 17}O = -23{per_thousand}). Melilite is the most {sup 16}O-depleted mineral in all Allende Type C CAIs. In CAI 100, melilite and secondary grossular, monticellite and forsterite (minerals replacing melilite) are similarly {sup 16}O-depleted, whereas grossular in CAI 160 is {sup 16}O-enriched ({Delta}{sup 17}O = -10{per_thousand} to -6{per_thousand}) relative to melilite ({Delta}{sup 17}O = -5{per_thousand} to -3{per_thousand}). We infer

  15. Computer use and carpal tunnel syndrome: A meta-analysis.

    Science.gov (United States)

    Shiri, Rahman; Falah-Hassani, Kobra

    2015-02-15

    Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, computer/typewriter use overall (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90), as well as the ≥1 and ≥4 hours-per-day exposure contrasts, was not associated with CTS. In studies comparing computer workers with differing exposure levels, CTS was associated with computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and with years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for both types of studies. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse usage, might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study, are needed. Copyright © 2014 Elsevier B.V. All rights reserved.
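
    For readers unfamiliar with the pooling step behind the odds ratios quoted above, the sketch below shows one common random-effects estimator (DerSimonian-Laird); the study inputs are invented and the estimator is not necessarily the exact one used in this meta-analysis.

      import math

      # Illustrative study results (log odds ratios and their standard errors); these are
      # not the actual estimates from the reviewed studies.
      studies = [(math.log(1.3), 0.20), (math.log(1.9), 0.25),
                 (math.log(1.1), 0.15), (math.log(2.2), 0.30)]

      def dersimonian_laird(studies):
          """Random-effects pooled OR via the DerSimonian-Laird tau^2 estimator."""
          y = [b for b, _ in studies]
          w = [1.0 / se ** 2 for _, se in studies]          # fixed-effect weights
          y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
          q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
          c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
          tau2 = max(0.0, (q - (len(studies) - 1)) / c)     # between-study variance
          w_re = [1.0 / (se ** 2 + tau2) for _, se in studies]
          pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
          se_pooled = math.sqrt(1.0 / sum(w_re))
          return (math.exp(pooled),
                  math.exp(pooled - 1.96 * se_pooled),
                  math.exp(pooled + 1.96 * se_pooled))

      or_, lo, hi = dersimonian_laird(studies)
      print(f'pooled OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})')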

  16. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  17. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    International Nuclear Information System (INIS)

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project.

  18. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of developments in theoretical computer science have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  19. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  20. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA]; Riha, David S. [San Antonio, TX]; Thacker, Ben H. [San Antonio, TX]

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
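    To make the perturbation idea concrete, here is a minimal sketch (the names and the scalar amplitude model are assumptions, not the patented method itself): each node in a region of interest carries a displacement vector, and one probabilistic realization of the geometry is the nominal mesh plus a randomly scaled displacement.

      # Minimal sketch: per-node displacement vectors applied to a nominal FE mesh.
      import numpy as np

      rng = np.random.default_rng(0)
      nominal_nodes = rng.random((100, 3))            # nominal node coordinates (x, y, z)
      region = np.arange(20)                          # node ids in the region of interest
      directions = np.zeros_like(nominal_nodes)
      directions[region] = np.array([0.0, 0.0, 1.0])  # unit displacement direction per node

      def perturbed_mesh(amplitude):
          """Node coordinates for one realization of the geometric perturbation."""
          return nominal_nodes + amplitude * directions

      # One probabilistic realization: amplitude drawn from an assumed uncertainty model.
      sample = perturbed_mesh(amplitude=rng.normal(0.0, 0.01))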

  1. PALSE: Python Analysis of Large Scale (Computer) Experiments

    OpenAIRE

    Cazals, Frédéric; Dreyfus, Tom; Malod-Dognin, Noël; Lhéritier, Alix

    2012-01-01

    A tenet of Science is the ability to reproduce the results, and a related issue is the possibility to archive and interpret the raw results of (computer) experiments. This paper presents an elementary python framework addressing this latter goal. Consider a computing pipeline consisting of raw data generation, raw data parsing, and data analysis i.e. graphical and statistical analysis. palse addresses these last two steps by leveraging the hierarchical structure of XML documents. More precise...
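    The parse-and-analyze step described above can be illustrated with a toy example using only the standard library; the XML tag names are hypothetical and palse itself is not used here.

      # Toy illustration: parse raw XML results, then do a simple statistical summary.
      import xml.etree.ElementTree as ET
      import statistics

      raw = """<experiments>
        <run id="1"><energy>-12.3</energy></run>
        <run id="2"><energy>-11.8</energy></run>
        <run id="3"><energy>-12.9</energy></run>
      </experiments>"""

      root = ET.fromstring(raw)
      energies = [float(run.findtext("energy")) for run in root.iter("run")]
      print("runs:", len(energies), "mean energy:", statistics.mean(energies))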

  2. Computer aided information system for a PWR

    International Nuclear Information System (INIS)

    Vaidian, T.A.; Karmakar, G.; Rajagopal, R.; Shankar, V.; Patil, R.K.

    1994-01-01

    The computer aided information system (CAIS) is designed with a view to improving the performance of the operator. CAIS assists the plant operator in an advisory and support role, thereby reducing the workload level and potential human errors. The CAIS as explained here has been designed for a PWR of type KLT-40 used in Floating Nuclear Power Stations (FNPS); however, the underlying philosophy evolved in designing the CAIS can be suitably adapted for other types of nuclear power plants too (BWR, PHWR). Operator information is divided into three broad categories: a) continuously available information, b) automatically available information, and c) on-demand information. Two touch screens are provided on the main control panel: one is earmarked for continuously available information and the other is dedicated to automatically available information. Both screens can be used at the operator's discretion for on-demand information, and the automatically available information screen overrides the on-demand information screens. In addition to the above, CAIS has the features of event sequence recording, disturbance recording and information documentation. The CAIS design ensures that the operator is not overburdened with excess and unnecessary information, but at the same time adequate and well formatted information is available. (author). 5 refs., 4 figs

  3. Computer-assisted treatment planning and analysis.

    Science.gov (United States)

    Beers, A C; Choi, W; Pavlovskaia, E

    2003-01-01

    The Invisalign orthodontic system (Align Technology, Inc, Santa Clara, CA) is a series of clear removable appliances worn by a patient to correct malocclusions. Introduced in 1999, it has been applied to successfully correct an increasingly wide variety of malocclusions. Part of the success of the system is due to the innovative technologies inherent in the design of the appliances. During the development of the system, many challenges and issues needed to be overcome to realize the product. Many of these issues were not specific to Invisalign and represented general problems in the area of computer-aided orthodontics. The general problems of developing a virtual model of a patient's dentition appropriate for use in orthodontics, performing a treatment plan on a virtual dentition model, and analyzing how accurately the virtual treatment plan was executed in the patient's mouth are presented.

  4. Retrospective analysis of computed radiography exposure reporting.

    Science.gov (United States)

    David, George; Redden, Amanda E

    2011-01-01

    The increasing public spotlight on medical imaging overuse and radiation overexposure has led to a greater demand for radiation dose monitoring. Computed radiography (CR) exposure reporting databases allow radiographers to monitor dose creep, which can help decrease overall patient radiation exposure from medical examinations. The aim was to import exposure data from CR readers into a database that allows administrators to monitor and analyze dose data for quality assurance. A CR exposure reporting database and statistics website was created to analyze CR reader dose data and track dose creep. Radiography departments can effectively monitor dose creep using a CR exposure reporting database to retrieve and analyze dose data, as well as identify dose trends, workloads, radiographer performance, and the need for further training on proper technique. This knowledge can help decrease overall patient radiation exposure from medical examinations. ©2011 by the American Society of Radiologic Technologists.
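    As a hedged illustration of such an exposure-reporting store (the schema, values, and query below are assumptions, not the system described), exposure index readings can be kept in a small database and summarized per month to reveal upward drift:

      # Sketch: store CR exposure index values and summarize them to watch for dose creep.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE exposure (exam_date TEXT, body_part TEXT, exposure_index REAL)")
      rows = [("2011-01-05", "chest", 250.0), ("2011-02-05", "chest", 265.0),
              ("2011-03-05", "chest", 290.0)]
      con.executemany("INSERT INTO exposure VALUES (?, ?, ?)", rows)

      # Monthly mean exposure index per body part; a rising trend suggests dose creep.
      query = ("SELECT body_part, substr(exam_date, 1, 7) AS month, AVG(exposure_index) "
               "FROM exposure GROUP BY body_part, month")
      for body_part, month, mean_ei in con.execute(query):
          print(body_part, month, round(mean_ei, 1))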

  5. Studies on computer analysis for radioisotope images

    International Nuclear Information System (INIS)

    Takizawa, Masaomi

    1977-01-01

    A hybrid-type image filing and processing system was devised by the author for filing radioisotope images and processing them with analog display. The system has the following features: ten thousand images can be stored on a 60-foot video-tape-recorder (VTR) tape; the maximum access time for an image on the VTR tape is within 15 sec; and an image display is enabled by the analog memory, which provides more than 15 gray levels of brightness. By using the analog memories, effective image processing can be done with a small computer. Many signal sources can be input into the hybrid system. The system can be applied in many fields, to both routine work and multi-purpose radioisotope image processing. (auth.)

  6. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two units of personal computer were successfully networked together to form a small-scale cluster. Each of the processors involved is a multicore processor with four cores, so the cluster has eight processors in total. The cluster incorporates an Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem, and was carried out using a simple MPI Hello program written in the C language. In addition, a performance test was done to show that the cluster's calculation performance is much better than that of a single-CPU computer. In this performance test, four runs were done with the same code using a single node, 2 processors, 4 processors, and 8 processors. The results show that with additional processors the time required to solve the problem decreases; the calculation time is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small-scale cluster computer from common hardware that is capable of higher computing power than a single-CPU machine, which can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics.
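    The study's communication test used an MPI "Hello" program written in C; the sketch below is a Python analogue using mpi4py (an assumption, not the code used in the study), launched for example with mpiexec -n 8 python hello_mpi.py.

      # Python analogue of a simple MPI communication test (the original was in C).
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      print(f"Hello from rank {comm.Get_rank()} of {comm.Get_size()} "
            f"on node {MPI.Get_processor_name()}")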

  7. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process

  8. Computer-Assisted Learning Design for Reflective Practice Supporting Multiple Learning Styles for Education and Training in Pre-Hospital Emergency Care.

    Science.gov (United States)

    Jones, Indra; Cookson, John

    2001-01-01

    Students in paramedic education used a model combining computer-assisted instruction (CAI), reflective practice, and learning styles. Although reflective practice normally requires teacher-student interaction, CAI with reflective practice embedded enabled students to develop learning style competencies and achieve curricular outcomes. (SK)

  9. Computational analysis of thresholds for magnetophosphenes

    International Nuclear Information System (INIS)

    Laakso, Ilkka; Hirata, Akimasa

    2012-01-01

    In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes, visual sensations of flashing light not caused by light. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way for determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m⁻² (−20% to +30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (−20% to +10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of

  10. On native Danish learners' challenges in distinguishing /tai/, /cai/ and /zai/

    DEFF Research Database (Denmark)

    Sloos, Marjoleine; Zhang, Chun

    2015-01-01

    With a growing interest in learning Chinese globally, there is a growing interest for phonologists and language instructors to understand how nonnative Chinese learners perceive the Chinese sound inventory. We experimentally investigated the Danish (L1) speaker's perception of three Mandarin Chinese (L2) initial consonants, namely , phonologically /th ts tsh/. Impressionistically, disentangling tai-cai-zai is extremely challenging for Danish learners, but experimental confirmation is lacking (Wang, Sloos & Zhang 2015, forthcoming). Eighteen native Danish learners of Chinese of Aarhus...

  11. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  12. A Computable General Equilibrium Microsimulation Analysis

    African Journals Online (AJOL)

    Daniel

    The UNDP (2010) introduced the MPI for the first time to complement money-based measures by considering multiple .... assumptions of neoclassical theory, lack of sensitivity analysis, validity of predictions for policy etc. ..... tax rates are exogenous and it is the changes in government savings that equilibrate the economy.

  13. Computed tomographic analysis of urinary calculi

    International Nuclear Information System (INIS)

    Naito, Akira; Ito, Katsuhide; Ito, Shouko

    1986-01-01

    Computed tomography (CT) was employed in an effort to analyze the chemical composition of urinary calculi. Twenty-three surgically removed calculi were scanned in a water bath (in vitro study). Fourteen of them were scanned in the body (in vivo study). The calculi consisted of four types: mixed calcium oxalate and phosphate, mixed calcium carbonate and phosphate, magnesium ammonium phosphate, and uric acid. The in vitro study showed that the mean and maximum CT values of uric acid stones were significantly lower than those of the other three types of stones. This indicated that stones with less than 450 HU are composed of uric acid. In the in vivo study, CT did not help to differentiate the three types of urinary calculi, except for uric acid stones. Regarding the mean CT values, there was no correlation between the in vitro and in vivo studies. An experiment with commercially available drugs showed that the CT values of urinary calculi were not dependent upon the composition, but upon the density of the calculi. (Namekawa, K.)

  14. Local spatial frequency analysis for computer vision

    Science.gov (United States)

    Krumm, John; Shafer, Steven A.

    1990-01-01

    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.
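    A minimal sketch of one way to build such a combined space/frequency representation (a windowed Fourier transform over image patches; the window size and test image are illustrative assumptions) follows:

      # Sketch: local magnitude spectra computed on a grid of image patches.
      import numpy as np

      def local_spectra(image, window=16, step=16):
          """Map (row, col) of each patch origin to the patch's magnitude spectrum."""
          spectra = {}
          taper = np.hanning(window)[:, None] * np.hanning(window)[None, :]
          for r in range(0, image.shape[0] - window + 1, step):
              for c in range(0, image.shape[1] - window + 1, step):
                  patch = image[r:r + window, c:c + window] * taper
                  spectra[(r, c)] = np.abs(np.fft.fftshift(np.fft.fft2(patch)))
          return spectra

      x = np.linspace(0, 8 * np.pi, 64)
      image = np.sin(np.outer(x, np.ones(64)))   # a simple single-frequency texture
      print(len(local_spectra(image)), "local spectra computed")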

  15. Analysis of computational vulnerabilities in digital repositories

    Directory of Open Access Journals (Sweden)

    Valdete Fernandes Belarmino

    2015-04-01

    Full Text Available Objective. Demonstrates the results of research that aimed to analyze the computational vulnerabilities of digital repositories in public universities. Argues the relevance of information in contemporary societies as an invaluable resource, emphasizing scientific information as an essential element of scientific progress. Characterizes the emergence of digital repositories and highlights their use in the academic environment to preserve, promote, disseminate and encourage scientific production. Describes the main software for the construction of digital repositories. Method. The investigation identified and analyzed the vulnerabilities to which the digital repositories are exposed by running penetration tests, discriminating the levels of risk and the types of vulnerabilities. Results. From a sample of 30 repositories, 20 could be examined; 5% of the repositories have critical vulnerabilities, 85% high, 25% medium and 100% low ones. Conclusions. This demonstrates the need to adopt measures for these environments that promote information security, minimizing the incidence of external and/or internal attacks.

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  17. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  18. An Interactive Computational Aerodynamics Analysis Program.

    Science.gov (United States)

    1980-12-01

    "Continuous Control System Analysis and Synthesis," AFIT Master's Thesis. Wright-Patterson AFB, Ohio: Air Force Institute of Technology, 1978. 10. Sasman... asymmetric source distribution: "The Kutta condition is applied at the edge of the upper and lower surface boundary layers at the trailing edge to... solution is modified as a function of the circulation only. The displacement effect of the wake and boundary layer are represented by an asymmetric

  19. Cafts: computer aided fault tree analysis

    International Nuclear Information System (INIS)

    Poucet, A.

    1985-01-01

    The fault tree technique has become a standard tool for the analysis of safety and reliability of complex systems. In spite of the costs, which may be high for a complete and detailed analysis of a complex plant, the fault tree technique is popular and its benefits are fully recognized. Due to this, applications of these codes have mostly been restricted to simple academic examples and rarely concern complex, real-world systems. In this paper an interactive approach to fault tree construction is presented. The aim is not to replace the analyst, but to offer him an intelligent tool which can assist him in modeling complex systems. Using the CAFTS method, the analyst interactively constructs a fault tree in two phases: (1) in the first phase he generates an overall failure logic structure of the system, the macro fault tree; in this phase, CAFTS features an expert-system approach to assist the analyst, making use of a knowledge base containing generic rules on the behavior of subsystems and components; (2) in the second phase the macro fault tree is further refined and transformed into a fully detailed and quantified fault tree, using a library of plant-specific component failure models.
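    The quantification carried out in the second phase can be illustrated with a minimal, hedged sketch: once the detailed tree is available, the top-event probability follows from bottom-up evaluation of AND/OR gates over basic-event probabilities (assuming independent events; the tree and numbers below are invented, not taken from CAFTS).

      # Sketch: bottom-up evaluation of a small fault tree with independent basic events.
      def or_gate(probs):
          p_none = 1.0
          for q in probs:
              p_none *= (1.0 - q)
          return 1.0 - p_none

      def and_gate(probs):
          p_all = 1.0
          for q in probs:
              p_all *= q
          return p_all

      # Invented basic events: pump failure, valve failure, operator error, power loss.
      pump, valve, operator, power = 1e-3, 5e-4, 1e-2, 2e-4
      cooling_lost = or_gate([pump, valve])                       # subsystem failure
      top_event = and_gate([cooling_lost, or_gate([operator, power])])
      print(f"top event probability: {top_event:.2e}")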

  20. Computer assisted analysis of 2-DG autoradiographs.

    Science.gov (United States)

    Gallistel, C R; Piner, C T; Allen, T O; Adler, N T; Yadin, E; Negin, M

    1982-01-01

    A computerized image processing system is described that assists neurobiologists in analyzing data from 2-DG autoradiography by providing for: (1) rapid fine-scale digitization of gray levels using a TV camera; (2) the recognition and verification of subtle differences in optical density with the aid of color windows; (3) the superimposition of the autoradiographic image upon the histological image, so that the activity seen in the autoradiograph can be accurately assigned to anatomically defined structures; (4) the production of numerical data suitable for statistical analysis and line drawings suitable for black-on-white publication; (5) the relating of local gray level to a norm for the image as a whole, so as to remove the variability introduced by variations in section thickness, in the amount of 2-DG seen by the brain during incorporation, in level of anesthesia, etc. If the localized darkening in autoradiographic images is being used as an index of localized functional activity rather than as a measure of metabolism, normalization obviates the need to obtain arterial blood samples. These routines permit anatomically accurate numerical analysis of autoradiographs without any constraints on the experimental situation.
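    The normalization in point (5) amounts to dividing each local gray level by a norm for the whole image; a brief sketch under that assumption follows (array contents are illustrative, not real autoradiographic data):

      # Sketch: express local 2-DG gray levels relative to a whole-image norm.
      import numpy as np

      autoradiograph = np.random.default_rng(1).uniform(50, 200, size=(256, 256))
      whole_image_norm = autoradiograph.mean()       # could also be a reference region
      relative_activity = autoradiograph / whole_image_norm

      region = relative_activity[100:120, 40:60]     # an anatomically defined structure
      print("mean relative activity in region:", round(region.mean(), 3))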

  1. Caiçaras, caboclos and natural resources: rules and scale patterns / Caiçaras, caboclos e recursos naturais: regras e padrões de escala

    Directory of Open Access Journals (Sweden)

    Alpina Begossi

    1999-12-01

    Full Text Available One important question concerning the sustainability of local or native populations refers to their interaction with local and global institutions. We should expect that populations with the capacity to interact economically and politically with institutions might show a better chance for their ecological and cultural continuity, as well as for their systems of trade and subsistence. The level of ecological and social interaction of local populations, following concepts from ecology, occurs on different scales: for example, from the territories of individual fishermen on the Atlantic Forest coast to organizations of community Extractive Reserves in the Amazon. The scale of organization (individual/family/community) may influence the capacity to deal with institutions. This study analyses how Brazilian native populations, especially caiçaras of the Atlantic Forest coast and caboclos from the Amazon, have interacted with regional, national and global institutions concerning environmental demands. Concepts such as common management, natural capital, resilience and sustainability are useful to understand these illustrative cases.

  2. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  3. Computer code for qualitative analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Yule, H.P.

    1979-01-01

    Computer code QLN1 provides complete analysis of gamma-ray spectra observed with Ge(Li) detectors and is used at both the National Bureau of Standards and the Environmental Protection Agency. It locates peaks, resolves multiplets, identifies component radioisotopes, and computes quantitative results. The qualitative-analysis (or component identification) algorithms feature thorough, self-correcting steps which provide accurate isotope identification in spite of errors in peak centroids, energy calibration, and other typical problems. The qualitative-analysis algorithm is described in this paper
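    The first step such a code performs, locating candidate photopeaks, can be sketched as below; QLN1's own algorithm is not reproduced, and a generic peak finder stands in for it on a synthetic spectrum.

      # Sketch: locate candidate photopeaks in a synthetic gamma-ray spectrum.
      import numpy as np
      from scipy.signal import find_peaks

      channels = np.arange(2048)
      background = 200 * np.exp(-channels / 800)
      line = 500 * np.exp(-0.5 * ((channels - 662) / 3.0) ** 2)   # one Gaussian photopeak
      spectrum = np.random.default_rng(2).poisson(background + line)

      peaks, _ = find_peaks(spectrum, prominence=100)
      print("candidate peak centroids (channels):", peaks)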

  4. A single-chip computer analysis system for liquid fluorescence

    International Nuclear Information System (INIS)

    Zhang Yongming; Wu Ruisheng; Li Bin

    1998-01-01

    The single-chip computer analysis system for liquid fluorescence is an intelligent analytic instrument, which is based on the principle that the liquid containing hydrocarbons can give out several characteristic fluorescences when irradiated by strong light. Besides a single-chip computer, the system makes use of the keyboard and the calculation and printing functions of a CASIO printing calculator. It combines optics, mechanism and electronics into one, and is small, light and practical, so it can be used for surface water sample analysis in oil field and impurity analysis of other materials

  5. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  6. Two Computer Programs for Factor Analysis. Technical Note Number 41.

    Science.gov (United States)

    Wisler, Carl E.

    Two factor analysis algorithms, previously described by P. Horst, have been programed for use on the General Electric Time-Sharing Computer System. The first of these, Principal Components Analysis (PCA), uses the Basic Structure Successive Factor Method With Residual Matrices algorithm to obtain the principal component vectors of a correlation…
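    The principal-components computation itself is compact; the sketch below performs it by direct eigendecomposition of a correlation matrix (the successive-factor algorithm mentioned above is not reproduced, and the data are simulated).

      # Sketch: principal components of a correlation matrix via eigendecomposition.
      import numpy as np

      rng = np.random.default_rng(3)
      data = rng.normal(size=(200, 5))
      data[:, 1] += 0.8 * data[:, 0]                 # induce some correlation

      corr = np.corrcoef(data, rowvar=False)
      eigvals, eigvecs = np.linalg.eigh(corr)        # eigenvalues in ascending order
      order = np.argsort(eigvals)[::-1]
      loadings = eigvecs[:, order] * np.sqrt(eigvals[order])   # component loadings
      print("proportion of variance explained:", np.round(eigvals[order] / eigvals.sum(), 3))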

  7. A computer aided tolerancing tool, II: tolerance analysis

    NARCIS (Netherlands)

    Salomons, O.W.; Haalboom, F.J.; Jonge poerink, H.J.; van Slooten, F.; van Slooten, F.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    1996-01-01

    A computer aided tolerance analysis tool is presented that assists the designer in evaluating worst case quality of assembly after tolerances have been specified. In tolerance analysis calculations, sets of equations are generated. The number of equations can be restricted by using a minimum number

  8. Shape analysis of neutron transmission resonances - a computer code

    International Nuclear Information System (INIS)

    Giacobbe, P.; Magnani, M.

    1979-01-01

    A computer programme for shape analysis of time-of-flight neutron transmission spectra of non-fissile nuclei in the resonance region is described. The main features are: partial independence of the structure of the programme from the formalism used, accurate description of the resolution function, use of ''a priori'' information on the fitted parameters, and simultaneous analysis of many spectra.

  9. A Computational Discriminability Analysis on Twin Fingerprints

    Science.gov (United States)

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical twins and fraternal twins) recently collected from a twin population using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of fingerprints of twins is significantly different from that between genuine prints of the same finger. Twins' fingerprints are discriminable, with a 1.5%-1.7% higher EER than non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.
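    For readers unfamiliar with the EER figure quoted above, the sketch below shows the generic computation: sweep a decision threshold over genuine and impostor match scores and take the point where false accept and false reject rates cross. The score distributions are simulated, not the paper's data.

      # Sketch: equal error rate (EER) from simulated genuine and impostor match scores.
      import numpy as np

      rng = np.random.default_rng(4)
      genuine = rng.normal(0.8, 0.1, 2000)    # same-finger comparison scores
      impostor = rng.normal(0.4, 0.1, 2000)   # different-finger comparison scores

      thresholds = np.linspace(0.0, 1.0, 501)
      far = np.array([(impostor >= t).mean() for t in thresholds])  # false accept rate
      frr = np.array([(genuine < t).mean() for t in thresholds])    # false reject rate
      i = np.argmin(np.abs(far - frr))
      print(f"EER ~ {(far[i] + frr[i]) / 2:.3f} at threshold {thresholds[i]:.2f}")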

  10. What will your time series analysis computer package do: computer packages

    Energy Technology Data Exchange (ETDEWEB)

    Pack, D J

    1978-01-01

    A survey of existing time series analysis computer packages is presented. The primary emphasis is on packages relating to the ARIMA model building process and packages that are non-spectral. The survey is segmented into general purpose standard statistical packages (BMDP, IMSL, SAS, SPSS), many method packages, and integrated ARIMA model building packages. Likely new directions are discussed. A number of concerns arise as a result of the survey, relating to the absence of time series capabilities in the standard statistical packages, the definition of an optimal method, the supply of computer packages from the academic to the business environment, and the degree of computerization appropriate for time series analysis.

  11. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
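    The article's examples are in MATLAB and R; the fragment below is an equivalent hedged sketch in Python of an embarrassingly parallel risk simulation, where independent replications are farmed out to worker processes and then pooled.

      # Sketch: embarrassingly parallel Monte Carlo replications with multiprocessing.
      import multiprocessing as mp
      import random

      def one_replication(seed):
          """One independent simulation run (a toy stand-in for a risk model)."""
          rng = random.Random(seed)
          return sum(rng.gauss(0.0, 1.0) for _ in range(1000))

      if __name__ == "__main__":
          with mp.Pool(processes=4) as pool:
              results = pool.map(one_replication, range(200))
          print("replications:", len(results),
                "mean outcome:", round(sum(results) / len(results), 3))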

  12. Computer-Aided Qualitative Data Analysis with Word

    Directory of Open Access Journals (Sweden)

    Bruno Nideröst

    2002-05-01

    Full Text Available Despite some fragmentary references in the literature about qualitative methods, it is fairly unknown that Word can be successfully used for computer-aided Qualitative Data Analysis (QDA). Based on several standard Word operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to gain first experience with computer-aided analysis before investing time and money in a specialized QDA program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations about computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225

  13. A Braça da Rede, uma Técnica Caiçara de Medir

    Directory of Open Access Journals (Sweden)

    Gilberto Chieus Jr.

    2009-08-01

    Full Text Available This article describes how the caiçaras of the city of Ubatuba, on the northern coast of São Paulo state, measure their fishing nets. Before analyzing their measuring technique, we give a brief overview of caiçara culture and its transformations. We then present some historical moments in the construction of the metre. Next, we show how the caiçaras measure their nets, the problem that occurred in Brazil with the implementation of the decimal metric system, and the resistance of certain communities that use other standards for their measurements, ignoring the current metric system because of their cultural context. This whole discussion is framed within a historical perspective of Ethnomathematics.

  14. Computer-aided visualization and analysis system for sequence evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.

    2004-05-11

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  15. Computer programs for analysis of geophysical data

    International Nuclear Information System (INIS)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution

  16. Computer programs for analysis of geophysical data

    Energy Technology Data Exchange (ETDEWEB)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon`s problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution.

  17. [Possibilities and limits of computer-assisted cardiotocogram analysis].

    Science.gov (United States)

    Lösche, P

    1997-01-01

    The interpretation of cardiotocograms still relies primarily on visual analysis. This form of monitoring remains labour intensive and, being dependent on the training and experience of the specialist responsible, is also subject to erroneous interpretation. Computer-aided cardiotocogram analysis has, in spite of encouraging successes, still not found wide application in everyday clinical routine. To achieve this, the programming system must be easy to operate, user-friendly and reliable. A program system for fully automatic cardiotocogram analysis is envisioned which runs on standard commercially available personal computers. A clear graphic representation of the traces also permits visual assessment on the computer screen. The system described integrates the main assessment criteria of cardiotocogram analysis, which can then be extended owing to the open system architecture used in the programming. Completely new analysis algorithms have given the evaluating system the capability of fully automatic pattern recognition of fetal heart rate signals and uterine motility. An essential requirement of computer-aided cardiotocogram analysis is thereby fulfilled. Work is now focusing on the exact classification of the various types of deceleration and an extension of the capabilities of tocogram analysis. There should be nothing to hinder integration of the system into everyday clinical routine and its connection to obstetrical databases.

  18. Using the Computer to Improve Basic Skills.

    Science.gov (United States)

    Bozeman, William; Hierstein, William J.

    These presentations offer information on the benefits of using computer-assisted instruction (CAI) for remedial education. First, William J. Hierstein offers a summary of the Computer Assisted Basic Skills Project conducted by Southeastern Community College at the Iowa State Penitentiary. Hierstein provides background on the funding for the…

  19. Computer-Assisted Instruction: Decision Handbook.

    Science.gov (United States)

    1985-04-01

    Intelligent (Generative) CAI ... emotional reactions. The role of managers, computer professionals, system users, and social institutions in promoting positive experiences and attitudes is... Groups: Computer-Based Training; Education of the Deaf; Elementary/Secondary/Junior College; Health; Implementation; Minicomputer Users; Music

  20. Ca-Fe and Alkali-Halide Alteration of an Allende Type B CAI: Aqueous Alteration in Nebular or Asteroidal Settings

    Science.gov (United States)

    Ross, D. K.; Simon, J. I.; Simon, S. B.; Grossman, L.

    2012-01-01

    Ca-Fe and alkali-halide alteration of CAIs is often attributed to aqueous alteration by fluids circulating on asteroidal parent bodies after the various chondritic components have been assembled, although debate continues about the roles of asteroidal vs. nebular modification processes [1-7]. Here we report detailed observations of alteration products in a large Type B2 CAI, TS4 from Allende, one of the oxidized subgroup of CV3s, and propose a speculative model for aqueous alteration of CAIs in a nebular setting. Ca-Fe alteration in this CAI consists predominantly of end-member hedenbergite, end-member andradite, and compositionally variable, magnesian high-Ca pyroxene. These phases are strongly concentrated in an unusual "nodule" enclosed within the interior of the CAI (Fig. 1). The Ca, Fe-rich nodule superficially resembles a clast that pre-dated and was engulfed by the CAI, but closer inspection shows that relic spinel grains are enclosed in the nodule, and corroded CAI primary phases interfinger with the Fe-rich phases at the nodule's margins. This CAI also contains abundant sodalite and nepheline (alkali-halide) alteration that occurs around the rims of the CAI, but also penetrates more deeply into the CAI. The two types of alteration (Ca-Fe and alkali-halide) are adjacent, and very fine-grained Fe-rich phases are associated with sodalite-rich regions. Both types of alteration appear to be replacive; if that is true, it would require substantial introduction of Fe, and transport of elements (Ti, Al and Mg) out of the nodule, and introduction of Na and Cl into alkali-halide rich zones. Parts of the CAI have been extensively metasomatized.

  1. Parallel computation of seismic analysis of high arch dam

    Science.gov (United States)

    Chen, Houqun; Ma, Huaifa; Tu, Jin; Cheng, Guangqing; Tang, Juzhen

    2008-03-01

    Parallel computation programs are developed for three-dimensional meso-mechanics analysis of fully-graded dam concrete and seismic response analysis of high arch dams (ADs), based on the Parallel Finite Element Program Generator (PFEPG). The computational algorithms of the numerical simulation of the meso-structure of concrete specimens were studied. Taking into account damage evolution, static preload, strain rate effect, and the heterogeneity of the meso-structure of dam concrete, the fracture processes of damage evolution and configuration of the cracks can be directly simulated. In the seismic response analysis of ADs, all the following factors are involved, such as the nonlinear contact due to the opening and slipping of the contraction joints, energy dispersion of the far-field foundation, dynamic interactions of the dam-foundation-reservoir system, and the combining effects of seismic action with all static loads. The correctness, reliability and efficiency of the two parallel computational programs are verified with practical illustrations.

  2. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  3. PIXAN: the Lucas Heights PIXE analysis computer package

    International Nuclear Information System (INIS)

    Clayton, E.

    1986-11-01

    To fully utilise the multielement capability and short measurement time of PIXE it is desirable to have an automated computer evaluation of the measured spectra. Because of the complex nature of PIXE spectra, a critical step in the analysis is the data reduction, in which the areas of characteristic peaks in the spectrum are evaluated. In this package the computer program BATTY is presented for such an analysis. The second step is to determine element concentrations, knowing the characteristic peak areas in the spectrum. This requires a knowledge of the expected X-ray yield for that element in the sample. The computer program THICK provides that information for both thick and thin PIXE samples. Together, these programs form the package PIXAN used at Lucas Heights for PIXE analysis

  4. Calcium and Titanium Isotope Fractionation in CAIS: Tracers of Condensation and Inheritance in the Early Solar Protoplanetary Disk

    Science.gov (United States)

    Simon, J. I.; Jordan, M. K.; Tappa, M. J.; Kohl, I. E.; Young, E. D.

    2016-01-01

    The chemical and isotopic compositions of calcium-aluminum-rich inclusions (CAIs) can be used to understand the conditions present in the protoplanetary disk where they formed. The isotopic compositions of these early-formed nebular materials are largely controlled by chemical volatility. The isotopic effects of evaporation/sublimation, which are well explained by both theory and experimental work, lead to enrichments of the heavy isotopes that are often exhibited by the moderately refractory elements Mg and Si. Less well understood are the isotopic effects of condensation, which limits our ability to determine whether a CAI is a primary condensate and/or retains any evidence of its primordial formation history.

  5. Computer vision approaches to medical image analysis. Revised papers

    International Nuclear Information System (INIS)

    Beichel, R.R.; Sonka, M.

    2006-01-01

    This book constitutes the thoroughly refereed post-proceedings of the international workshop Computer Vision Approaches to Medical Image Analysis, CVAMIA 2006, held in Graz, Austria, in May 2006 as a satellite event of the 9th European Conference on Computer Vision, ECCV 2006. The 10 revised full papers and 11 revised poster papers presented together with 1 invited talk were carefully reviewed and selected from 38 submissions. The papers are organized in topical sections on clinical applications, image registration, image segmentation and analysis, and the poster session. (orig.)

  6. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  8. Analysis of the computed tomography in the acute abdomen

    International Nuclear Information System (INIS)

    Hochhegger, Bruno; Moraes, Everton; Haygert, Carlos Jesus Pereira; Antunes, Paulo Sergio Pase; Gazzoni, Fernando; Lopes, Luis Felipe Dias

    2007-01-01

    Introduction: This study intends to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal and prospective study in which patients with a diagnosis of acute abdomen were analyzed. A total of 105 cases of acute abdomen were obtained, and after application of the exclusion criteria, 28 patients were included in the study. Results: Computed tomography changed the diagnostic hypothesis of the physicians in 50% of the cases (p 0.05), where 78.57% of the patients had a surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). The index of accurate diagnosis of computed tomography, when compared to the anatomopathologic examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was done dividing the patients into surgical and nonsurgical groups, an accuracy of 89.28% was obtained (p 0.0001). A difference of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen managed without computed tomography. Conclusion: Computed tomography correlates with the anatomopathology and has great accuracy in surgical indication; it increases the physicians' confidence, reduces the hospitalization time, reduces the number of surgeries, and is cost-effective. (author)

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  10. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence approaches form a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, four types of probabilistic convergence are given for the solution-set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
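    As a hedged toy illustration (the paper's exact formulas are not reproduced; the definitions below are assumptions), a variation rate can be read as the share of solutions replaced in one update and a progress rate as the relative improvement of the best objective value:

      # Toy stochastic search with two tracked indices: variation rate and progress rate.
      import numpy as np

      rng = np.random.default_rng(5)

      def objective(x):
          return np.sum(x**2, axis=1)                  # simple sphere function to minimize

      pop = rng.uniform(-5, 5, size=(20, 3))
      for it in range(1, 6):
          candidate = pop + rng.normal(0, 0.5, pop.shape)  # a generic stochastic update
          keep = objective(candidate) < objective(pop)     # greedy selection
          variation_rate = keep.mean()                     # assumed: share of solutions replaced
          best_before = objective(pop).min()
          pop[keep] = candidate[keep]
          best_after = objective(pop).min()
          progress_rate = (best_before - best_after) / best_before  # assumed: relative improvement
          print(f"iteration {it}: variation {variation_rate:.2f}, progress {progress_rate:.3f}")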

  11. The effects of computer-assisted instruction on the mathematics performance and classroom behavior of children with ADHD.

    Science.gov (United States)

    Mautone, Jennifer A; DuPaul, George J; Jitendra, Asha K

    2005-08-01

    The present study examines the effects of computer-assisted instruction (CAI) on the mathematics performance and classroom behavior of three second- through fourth-grade students with ADHD. A controlled case study is used to evaluate the effects of the computer software on participants' mathematics performance and on-task behavior. Participants' mathematics achievement improved and their on-task behavior increased during the CAI sessions relative to independent seatwork conditions. In addition, students and teachers considered CAI to be an acceptable intervention for some students with ADHD who are having difficulty with mathematics. Implications of these results for practice and research are discussed.

  12. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology –oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  13. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  15. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  16. De novo structural modeling and computational sequence analysis ...

    African Journals Online (AJOL)

    Jane

    2011-07-25

    Jul 25, 2011 ... Our study was aimed towards computational proteomic analysis and 3D structural modeling of this novel bacteriocin protein encoded by the earlier aforementioned gene. Different bioinformatics tools and machine learning techniques were used for protein structural classification. De novo protein modeling ...

  17. HAMOC: a computer program for fluid hammer analysis

    International Nuclear Information System (INIS)

    Johnson, H.G.

    1975-12-01

    A computer program has been developed for fluid hammer analysis of piping systems attached to a vessel which has undergone a known rapid pressure transient. The program is based on the characteristics method for solution of the partial differential equations of motion and continuity. Column separation logic is included for situations in which pressures fall to saturation values
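
    HAMOC itself is not reproduced here; the following minimal Python sketch shows the textbook frictionless method-of-characteristics update for a single pipe between a constant-head reservoir and a suddenly closed valve, the core scheme such fluid-hammer codes build on. Wave speed, pipe length and initial conditions are illustrative, and column separation is not modeled.

      # Frictionless method of characteristics for water hammer in a single pipe.
      # Generic textbook scheme, not the HAMOC code; column separation omitted.
      import numpy as np

      a, g = 1200.0, 9.81           # wave speed (m/s), gravity (m/s^2) -- illustrative values
      L, N = 600.0, 20              # pipe length (m), number of reaches
      dx = L / N
      dt = dx / a                   # characteristics grid requires dx = a*dt
      H0, V0 = 50.0, 1.5            # initial head (m) and velocity (m/s)

      H = np.full(N + 1, H0)
      V = np.full(N + 1, V0)
      peak = H0

      for step in range(200):
          Hn, Vn = H.copy(), V.copy()
          HA, VA = H[:-2], V[:-2]   # upstream neighbours (C+ characteristic)
          HB, VB = H[2:], V[2:]     # downstream neighbours (C- characteristic)
          Hn[1:-1] = 0.5 * (HA + HB) + (a / (2 * g)) * (VA - VB)
          Vn[1:-1] = 0.5 * (VA + VB) + (g / (2 * a)) * (HA - HB)
          Hn[0] = H0                                  # constant-head reservoir
          Vn[0] = V[1] + (g / a) * (H0 - H[1])        # C- characteristic into the reservoir
          Vn[-1] = 0.0                                # valve closed at t = 0
          Hn[-1] = H[-2] + (a / g) * V[-2]            # C+ characteristic into the valve
          H, V = Hn, Vn
          peak = max(peak, H[-1])

      print(f"peak head at the valve: {peak:.1f} m "
            f"(Joukowsky estimate {H0 + a * V0 / g:.1f} m)")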

  18. Componential analysis of kinship terminology a computational perspective

    CERN Document Server

    Pericliev, V

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  19. Computational Analysis and Mapping of ijCSCL Content

    Science.gov (United States)

    Lonchamp, Jacques

    2012-01-01

    The purpose of this empirical study is to analyze and map the content of the "International Journal of Computer-Supported Collaborative Learning" since its inception in 2006. Co-word analysis is the general approach that is used. In this approach, patterns of co-occurrence of pairs of items (words or phrases) identify relationships among ideas.…
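
    A minimal Python sketch of the co-word idea: count how often pairs of keywords occur together across a set of articles. The keyword lists are invented placeholders, not data from the ijCSCL corpus.

      # Co-word analysis in miniature: co-occurrence counts of keyword pairs.
      from collections import Counter
      from itertools import combinations

      articles = [
          ["collaboration", "scripting", "argumentation"],
          ["collaboration", "awareness", "scripting"],
          ["argumentation", "knowledge building", "collaboration"],
      ]

      pair_counts = Counter()
      for keywords in articles:
          for pair in combinations(sorted(set(keywords)), 2):
              pair_counts[pair] += 1

      for (a, b), n in pair_counts.most_common(5):
          print(f"{a} -- {b}: {n}")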

  20. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pasccci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  1. MULGRES: a computer program for stepwise multiple regression analysis

    Science.gov (United States)

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
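
    MULGRES is a Fortran source deck and is not reproduced here; the Python sketch below illustrates the general stepwise-deletion (backward elimination) technique it implements, repeatedly dropping the least significant variable until all remaining variables pass a p-value threshold. Variable names, data and the threshold are illustrative.

      # Backward elimination (stepwise deletion) with ordinary least squares.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 100
      X = rng.normal(size=(n, 4))
      y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)  # x1, x3 irrelevant

      names = ["x0", "x1", "x2", "x3"]
      keep = list(range(X.shape[1]))
      alpha = 0.05

      while keep:
          model = sm.OLS(y, sm.add_constant(X[:, keep])).fit()
          pvals = model.pvalues[1:]                # skip the intercept
          worst = int(np.argmax(pvals))
          if pvals[worst] <= alpha:
              break                                # all remaining variables significant
          print(f"dropping {names[keep[worst]]} (p = {pvals[worst]:.3f})")
          del keep[worst]

      print("retained variables:", [names[i] for i in keep])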

  2. Forest Fire History... A Computer Method of Data Analysis

    Science.gov (United States)

    Romain M. Meese

    1973-01-01

    A series of computer programs is available to extract information from the individual Fire Reports (U.S. Forest Service Form 5100-29). The programs use a statistical technique to fit a continuous distribution to a set of sampled data. The goodness-of-fit program is applicable to data other than the fire history. Data summaries illustrate analysis of fire occurrence,...
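
    The Forest Service programs themselves are not available here; the following Python sketch illustrates the general idea of fitting a continuous distribution to sampled data and checking goodness of fit, using synthetic fire-interval data in place of Form 5100-29 records.

      # Fit a continuous distribution to sampled data and test goodness of fit.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      intervals = rng.gamma(shape=2.0, scale=5.0, size=300)   # synthetic data (years)

      params = stats.gamma.fit(intervals)                      # fit a gamma distribution
      ks_stat, p_value = stats.kstest(intervals, "gamma", args=params)

      print(f"fitted shape/loc/scale: {params}")
      print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")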

  3. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    Brauer, F.P.; Fager, J.E.

    1976-01-01

    A mini-computer based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system.

  4. Two years since SSAMS: Status of {sup 14}C AMS at CAIS

    Energy Technology Data Exchange (ETDEWEB)

    Ravi Prasad, G.V.; Cherkinsky, Alexander; Culp, Randy A.; Dvoracek, Doug K.

    2015-10-15

    The NEC 250 kV single stage AMS accelerator (SSAMS) was installed two years ago at the Center for Applied Isotope Studies (CAIS), University of Georgia. The accelerator is primarily being used for radiocarbon measurements to test the authenticity of natural and bio-based samples, while all other samples, such as geological, atmospheric, marine and archaeological ones, are run on the 500 kV NEC 1.5SDH-1 model tandem accelerator, which has been operating since 2001. The data obtained over a six-month period for OXI, OXII, ANU sucrose and FIRI-D are discussed. The mean value of ANU sucrose was observed to be slightly lower than the consensus value. The processed blanks on SSAMS produce a lower apparent age compared to the tandem accelerator, as expected.

  5. Two years since SSAMS: Status of 14C AMS at CAIS

    Science.gov (United States)

    Ravi Prasad, G. V.; Cherkinsky, Alexander; Culp, Randy A.; Dvoracek, Doug K.

    2015-10-01

    The NEC 250 kV single stage AMS accelerator (SSAMS) was installed two years ago at the Center for Applied Isotope Studies (CAIS), University of Georgia. The accelerator is primarily being used for radiocarbon measurements to test the authenticity of natural and bio-based samples, while all other samples, such as geological, atmospheric, marine and archaeological ones, are run on the 500 kV NEC 1.5SDH-1 model tandem accelerator, which has been operating since 2001. The data obtained over a six-month period for OXI, OXII, ANU sucrose and FIRI-D are discussed. The mean value of ANU sucrose was observed to be slightly lower than the consensus value. The processed blanks on SSAMS produce a lower apparent age compared to the tandem accelerator, as expected.

  6. Dietary Changes over Time in a Caiçara Community from the Brazilian Atlantic Forest

    Directory of Open Access Journals (Sweden)

    Priscila L. MacCord

    2006-12-01

    Full Text Available Because they are occurring at an accelerated pace, changes in the livelihoods of local coastal communities, including nutritional aspects, have been a subject of interest in human ecology. The aim of this study is to explore the dietary changes, particularly in the consumption of animal protein, that have taken place in Puruba Beach, a rural community of caiçaras on the São Paulo Coast, Brazil, over the 10-yr period from 1992-1993 to 2002-2003. Data were collected during six months in 1992-1993 and during the same months in 2002-2003 using the 24-hr recall method. We found an increasing dependence on external products in the most recent period, along with a reduction in fish consumption and in the number of fish species eaten. These changes, possibly associated with other nonmeasured factors such as overfishing and unplanned tourism, may cause food delocalization and a reduction in the use of natural resources. Although the consequences for conservation efforts in the Atlantic Forest and the survival of the caiçaras must still be evaluated, these local inhabitants may be finding a way to reconcile both the old and the new dietary patterns by keeping their houses in the community while looking for sources of income other than natural resources. The prospect shown here may reveal facets that can influence the maintenance of this and other communities undergoing similar processes by, for example, shedding some light on the ecological and economical processes that may occur within their environment and in turn affect the conservation of the resources upon which the local inhabitants depend.

  7. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Science.gov (United States)

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  8. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

    In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role of theories for human cognitions. Many computational models have simulated results of controlled psychological experiments successfully. However, there have been only a few attempts to apply the models to complex realistic phenomena. We call such a situation ``open-ended situation''. In this study, MAC/FAC (``many are called, but few are chosen''), proposed by [Forbus 95], that models two stages of analogical reasoning was applied to our open-ended psychological experiment. In our experiment, subjects were presented a cue story, and retrieved cases that had been learned in their everyday life. Following this, they rated inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of the MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support the MAC/FAC's theoretical assumption - different similarities are involved on the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognitions.
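
    A minimal Python sketch of the two-stage retrieval idea described above: a cheap content-vector dot product acts as a wide first filter (MAC), and a costlier structural score is computed only for the top candidates (FAC). The structural score here is a simple placeholder, not the structure-mapping evaluation used in the original MAC/FAC model, and the toy cases are invented.

      # Two-stage retrieval in the spirit of MAC/FAC: "many are called, few are chosen".
      import numpy as np

      def content_vector(case, vocab):
          # bag-of-predicates vector: how often each predicate occurs in the case
          return np.array([case.count(p) for p in vocab], dtype=float)

      def structural_score(probe, case):
          # placeholder: overlap of ordered predicate pairs, standing in for
          # a structural alignment between the two descriptions
          pairs = lambda c: set(zip(c, c[1:]))
          return len(pairs(probe) & pairs(case))

      vocab = ["cause", "flow", "greater", "pressure", "temperature"]
      probe = ["greater", "pressure", "cause", "flow"]
      memory = {
          "water-flow": ["greater", "pressure", "cause", "flow"],
          "heat-flow": ["greater", "temperature", "cause", "flow"],
          "unrelated": ["temperature", "temperature"],
      }

      pv = content_vector(probe, vocab)
      mac = {name: float(content_vector(c, vocab) @ pv) for name, c in memory.items()}
      shortlist = sorted(mac, key=mac.get, reverse=True)[:2]                    # MAC stage
      fac = {name: structural_score(probe, memory[name]) for name in shortlist} # FAC stage
      print("MAC scores:", mac)
      print("FAC scores:", fac)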

  9. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Directory of Open Access Journals (Sweden)

    Seyhan Yazar

    Full Text Available A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  10. From Corporate Social Responsibility, through Entrepreneurial Orientation, to Knowledge Sharing: A Study in Cai Luong (Renovated Theatre) Theatre Companies

    Science.gov (United States)

    Tuan, Luu Trong

    2015-01-01

    Purpose: This paper aims to examine the role of antecedents such as corporate social responsibility (CSR) and entrepreneurial orientation in the chain effect to knowledge sharing among members of Cai Luong theatre companies in the Vietnamese context. Knowledge sharing contributes to the depth of the knowledge pool of both the individuals and the…

  11. Hunting and use of terrestrial fauna used by Caiçaras from the Atlantic Forest coast (Brazil

    Directory of Open Access Journals (Sweden)

    Alves Rômulo RN

    2009-11-01

    Full Text Available Abstract Background The Brazilian Atlantic Forest is considered one of the hotspots for conservation, comprising remnants of rain forest along the eastern Brazilian coast. Its native inhabitants in the Southeastern coast include the Caiçaras (descendants from Amerindians and European colonizers, with a deep knowledge on the natural resources used for their livelihood. Methods We studied the use of the terrestrial fauna in three Caiçara communities, through open-ended interviews with 116 native residents. Data were checked through systematic observations and collection of zoological material. Results The dependence on the terrestrial fauna by Caiçaras is especially for food and medicine. The main species used are Didelphis spp., Dasyprocta azarae, Dasypus novemcinctus, and small birds (several species of Turdidae. Contrasting with a high dependency on terrestrial fauna resources by native Amazonians, the Caiçaras do not show a constant dependency on these resources. Nevertheless, the occasional hunting of native animals represents a complimentary source of animal protein. Conclusion Indigenous or local knowledge on native resources is important in order to promote local development in a sustainable way, and can help to conserve biodiversity, particularly if the resource is sporadically used and not commercially exploited.

  12. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  13. Computation system for nuclear reactor core analysis. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules, but the latter are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  14. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  15. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  17. Automated Spectral Analysis, the Virtual Observatory and Computational Grids

    Science.gov (United States)

    Jeffery, C. S.

    The newest generation of telescopes and detectors and facilities like the Virtual Observatory (VO) are delivering vast volumes of astronomical data and creating increasing demands for their analysis and interpretation. Methods for such analyses rely heavily on computer-generated models of growing sophistication and realism. These pose two problems. First, simulations are carried out at increasingly high spatial and temporal resolution and physical dimension. Second, the dimensionality of parameter-search space continues to grow. Major computational problems include ensuring that the parameter-space volumes to be searched are physically interesting, and matching observational data efficiently without overloading the computational infrastructure. For the analysis of highly-evolved hot stars, we have developed a toolkit for the modelling of stellar atmospheres and stellar spectra. We can automatically fit observed flux distributions and/or high-resolution spectra and solve for a wide range of atmospheric parameters for both single and binary stars. The software represents a prototype for generic toolkits that could facilitate data analysis within, for example, the VO. We introduce a proposal to integrate a range of such toolkits within a heterogeneous network (such as the VO) so as to facilitate data analysis. For example, functions will be required to combine new observations with data from established archives. A goal-seeking algorithm will use this data to guide a sequence of theoretical calculations. These simulations may need to retrieve data from other sources, atomic data, pre-computed model atmospheres and so on. Such applications using widely distributed and heterogeneous resources will require the emerging technologies of computational grids.

  18. Oxygen, Magnesium, and Aluminum Isotopes in the Ivuna CAI: Re-Examining High-Temperature Fractionations in CI Chondrites

    Science.gov (United States)

    Frank, D. R.; Huss, G. R.; Nagashima, K.; Zolensky, M. E.; Le, L.

    2017-01-01

    CI chondrites are thought to approximate the bulk solar system composition since they closely match the composition of the solar photosphere. Thus, chemical differences between a planetary object and the CI composition are interpreted to result from fractionations of a CI starting composition. This interpretation is often made despite the secondary mineralogy of CI chondrites, which resulted from low-T aqueous alteration on the parent asteroid(s). Prevalent alteration and the relatively large uncertainties in the photospheric abundances (approx. +/-5-10%) permit chemical fractionation of CI chondrites from the bulk solar system, if primary chondrules and/or CAIs have been altered beyond recognition. Isolated olivine and pyroxene grains that range from approx. 5 microns to several hundred microns have been reported in CI chondrites, and acid residues of Orgueil were found to contain refractory oxides with oxygen isotopic compositions matching CAIs. However, the only CAI found to be unambiguously preserved in a CI chondrite was identified in Ivuna. The Ivuna CAI's primary mineralogy, small size (approx. 170 microns), and fine-grained igneous texture classify it as a compact type A. Aqueous alteration infiltrated large portions of the CAI, but other regions remain pristine. The major primary phases are melilite (Ak14-36), grossmanite (up to 20.8 wt.% TiO2), and spinel. Both melilite and grossmanite have igneous textures and zoning patterns. An accretionary rim consists primarily of olivine (Fa2-17) and low-Ca pyroxene (Fs2-10), which could be either surviving CI2 material or a third lithology.

  19. A Computer Aided Instruction Tutorial for the Ramtek 9400 Color Graphics Display System at the Naval Postgraduate School Monterey, California.

    Science.gov (United States)

    1981-12-01

    University, 1971. Submitted in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE IN INFORMATION SYSTEMS from the NAVAL... of the graphics computer. The Socratic or tutorial method of instruction has long been acknowledged as an educational ideal. It is not normally used... Computer Aided Instruction (CAI) for developing application software is the subject of this thesis. The history and growth of CAI, its educational goals

  20. Computer-assisted instruction: a library service for the community teaching hospital.

    Science.gov (United States)

    McCorkel, J; Cook, V

    1986-04-01

    This paper reports on five years of experience with computer-assisted instruction (CAI) at Winthrop-University Hospital, a major affiliate of the SUNY at Stony Brook School of Medicine. It compares CAI programs available from Ohio State University and Massachusetts General Hospital (accessed by telephone and modem), and software packages purchased from the Health Sciences Consortium (MED-CAPS) and Scientific American (DISCOTEST). The comparison documents one library's experience of the cost of these programs and the use made of them by medical students, house staff, and attending physicians. It describes the space allocated for necessary equipment, as well as the marketing of CAI. Finally, in view of the decision of the National Board of Medical Examiners to administer the Part III examination on computer (the so-called CBX) starting in 1988, the paper speculates on the future importance of CAI in the community teaching hospital.

  1. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes having scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol's indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times, which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows estimation of the sensitivity indices of each scalar model input, while the 'dispersion model' allows derivation of the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates the nuclear fuel irradiation.
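
    The joint GLM/GAM metamodelling approach of the paper is not reproduced here; the short Python sketch below only illustrates the variance-based Sobol indices themselves, using plain Monte Carlo estimators (Saltelli-type first-order, Jansen-type total effect) on a toy function with scalar inputs.

      # Monte Carlo estimates of Sobol' first-order and total sensitivity indices
      # for a toy model: strongly driven by x0, weakly by x1, not at all by x2.
      import numpy as np

      def model(x):
          return 4.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.0 * x[:, 2]

      rng = np.random.default_rng(3)
      n, d = 20000, 3
      A = rng.uniform(size=(n, d))
      B = rng.uniform(size=(n, d))
      fA, fB = model(A), model(B)
      var_y = np.var(np.concatenate([fA, fB]))

      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]                  # replace column i of A with B's column
          fABi = model(ABi)
          S_i = np.mean(fB * (fABi - fA)) / var_y          # first-order (Saltelli 2010)
          ST_i = 0.5 * np.mean((fA - fABi) ** 2) / var_y   # total effect (Jansen 1999)
          print(f"x{i}: S = {S_i:.2f}, ST = {ST_i:.2f}")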

  2. SOFTWARE TOOLS FOR COMPUTING EXPERIMENT AIMED AT MULTIVARIATE ANALYSIS IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    A. V. Tyurin

    2015-09-01

    Full Text Available A concept for the organization and planning of a computational experiment aimed at multivariate analysis of complex multifactor models is proposed. It is based on the generation of a calculations tree. The logical and structural schemes of the tree are given, as well as software tools for automating work with it: calculation generation, carrying out calculations and analysis of the obtained results. Computer modeling systems, and such special-purpose systems as RACS and PRADIS, do not solve the problems connected with the effective conduct of a computational experiment, consisting of its organization, planning, execution and analysis of the results. For the organization of a computational experiment, calculation data storage is proposed in the form of an input and output data tree. Each tree node has a reference to the calculation of the model step performed earlier. The calculations tree is stored in a specially organized directory structure. A software tool is proposed for creating and modifying the design scheme that stores the structure of one branch of the calculation tree, with a view to effective planning of multivariate calculations. A set of special-purpose software tools makes it possible to quickly generate and modify the tree and to add calculations with step-by-step changes in the model factors. To perform calculations, a software environment in the form of a graphical user interface for creating and modifying the calculation script has been developed. This environment makes it possible to traverse the calculation tree in a certain order and to perform serial and parallel initiation of computational modules. To analyze the results, a software tool has been developed that operates on the basis of the tag tree. This is a special tree that stores the input and output data of the calculations in the form of change sets of the appropriate model factors. The tool enables selection of the factors and responses of the model at various steps
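
    A minimal Python sketch of the calculations-tree idea described above: each node records the model factors changed at that step and a reference to the earlier calculation it builds on, so the effective factor set of any calculation can be recovered by walking back to the root. Class and field names are illustrative, not taken from the paper.

      # Toy calculations tree: nodes store changed factors and a parent reference.
      from dataclasses import dataclass, field

      @dataclass
      class CalcNode:
          name: str
          factors: dict                      # factor values changed at this step
          parent: "CalcNode | None" = None
          children: list = field(default_factory=list)

          def add_step(self, name, **factors):
              child = CalcNode(name, factors, parent=self)
              self.children.append(child)
              return child

          def effective_factors(self):
              # walk back to the root; later steps override earlier ones
              merged = {} if self.parent is None else self.parent.effective_factors()
              merged.update(self.factors)
              return merged

      root = CalcNode("baseline", {"load": 1.0, "temperature": 300})
      variant = root.add_step("high-load", load=2.5)
      print(variant.effective_factors())     # {'load': 2.5, 'temperature': 300}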

  3. Ubiquitous computing in sports: A review and analysis.

    Science.gov (United States)

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for the synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade. This development has propagated into applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as sensors for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - the development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rules' compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.

  4. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    Science.gov (United States)

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual based analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual based approach is effective in identifying trends and anomalies of the systems.

  5. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  6. The computer aided education and training system for accident management

    International Nuclear Information System (INIS)

    Yoneyama, Mitsuru; Kubota, Ryuji; Fujiwara, Tadashi; Sakuma, Hitoshi

    1999-01-01

    The education and training system for Accident Management was developed by the Japanese BWR group and Hitachi Ltd. The system is composed of two parts: a computer aided instruction (CAI) education system and an education and training system with computer simulations. Both are designed to be executed on personal computers. The outlines of the CAI education system and the education and training system with simulator are reported below. These systems provide plant operators and technical support center staff with effective education and training for accident management. (author)

  7. Gas analysis by computer-controlled microwave rotational spectrometry

    International Nuclear Information System (INIS)

    Hrubesh, L.W.

    1978-01-01

    Microwave rotational spectrometry has inherently high resolution and is thus nearly ideal for qualitative gas mixture analysis. Quantitative gas analysis is also possible by a simplified method which utilizes the ease with which molecular rotational transitions can be saturated at low microwave power densities. This article describes a computer-controlled microwave spectrometer which is used to demonstrate for the first time a totally automated analysis of a complex gas mixture. Examples are shown for a complete qualitative and quantitative analysis, in which a search of over 100 different compounds is made in less than 7 min, with sensitivity for most compounds in the 10 to 100 ppm range. This technique is expected to find increased use in view of the reduced complexity and increased reliability of microwave spectrometers and because of new energy-related applications for analysis of mixtures of small molecules.

  8. Computer-Aided Sustainable Process Synthesis-Design and Analysis

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan

    Process synthesis involves the investigation of chemical reactions needed to produce the desired product, selection of the separation techniques needed for downstream processing, as well as taking decisions on sequencing the involved separation operations. For an effective, efficient and flexible...... focuses on the development and application of a computer-aided framework for sustainable synthesis-design and analysis of process flowsheets by generating feasible alternatives covering the entire search space and includes analysis tools for sustainability, LCA and economics. The synthesis method is based...... on group contribution and a hybrid approach, where chemical process flowsheets are synthesized in the same way as atoms or groups of atoms are synthesized to form molecules in computer aided molecular design (CAMD) techniques. The building blocks in flowsheet synthesis problem are called as process...

  9. Thermohydraulic analysis of nuclear power plant accidents by computer codes

    International Nuclear Information System (INIS)

    Petelin, S.; Stritar, A.; Istenic, R.; Gregoric, M.; Jerele, A.; Mavko, B.

    1982-01-01

    RELAP4/MOD6, BRUCH-D-06, CONTEMPT-LT-28, RELAP5/MOD1 and COBRA-4-1 codes were successfully implemented on the CYBER 172 computer in Ljubljana. Input models of NPP Krsko were prepared for the first three codes. Because of the high computer cost, only one analysis of a double-ended guillotine break of the cold leg of NPP Krsko has been done with the RELAP4 code. The BRUCH code is easier and cheaper to use, and several analyses have been done with it. A sensitivity study was performed with CONTEMPT-LT-28 for a double-ended pump suction break. These codes are intended to be used as a basis for independent safety analyses. (author)

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  13. Analysis of material efficiency aspects of personal computers product group

    OpenAIRE

    TECCHIO PAOLO; ARDENTE FULVIO; MARWEDE MAX; CHRISTIAN CLEMM; DIMITROVA GERGANA; MATHIEUX FABRICE

    2016-01-01

    This report has been developed within the project ‘Technical support for environmental footprinting, material efficiency in product policy and the European Platform on Life Cycle Assessment’ (LCA) (2013-2017) funded by the Directorate-General for Environment. The report summarises the findings of the analysis of material-efficiency aspects of the personal-computer (PC) product group, namely durability, reusability, reparability and recyclability. It also aims to identify material-efficienc...

  14. Computers in activation analysis and gamma-ray spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Carpenter, B. S.; D' Agostino, M. D.; Yule, H. P. [eds.

    1979-01-01

    Seventy-three papers are included under the following session headings: analytical and mathematical methods for data analysis; software systems for gamma-ray and x-ray spectrometry; gamma-ray spectra treatment, peak evaluation; least squares; IAEA intercomparison of methods for processing spectra; computer and calculator utilization in spectrometer systems; and applications in safeguards, fuel scanning, and environmental monitoring. Separate abstracts were prepared for 72 of those papers. (DLC)

  15. Modeling and analysis of the spread of computer virus

    Science.gov (United States)

    Zhu, Qingyi; Yang, Xiaofan; Ren, Jianguo

    2012-12-01

    Based on a set of reasonable assumptions, we propose a novel dynamical model describing the spread of computer virus. Through qualitative analysis, we give a threshold and prove that (1) the infection-free equilibrium is globally asymptotically stable if the threshold is less than one, implying that the virus would eventually die out, and (2) the infection equilibrium is globally asymptotically stable if the threshold is greater than one. Two numerical examples are presented to demonstrate the analytical results.
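
    The paper's exact compartmental model is not given in the abstract; the following Python sketch uses a generic SIS-style model to illustrate the threshold behaviour described, with the infection dying out when the reproduction number beta/gamma is below one and settling at an endemic level when it is above one.

      # Generic SIS-style threshold illustration, not the paper's specific model.
      import numpy as np
      from scipy.integrate import odeint

      def sis(y, t, beta, gamma):
          s, i = y
          return [-beta * s * i + gamma * i, beta * s * i - gamma * i]

      t = np.linspace(0, 200, 2001)
      y0 = [0.99, 0.01]                              # 1% of computers initially infected

      for beta, gamma in [(0.08, 0.1), (0.3, 0.1)]:  # R0 = 0.8 and R0 = 3.0
          sol = odeint(sis, y0, t, args=(beta, gamma))
          print(f"R0 = {beta / gamma:.1f}: infected fraction at t=200 is {sol[-1, 1]:.3f}")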

  16. A Computable OLG Model for Gender and Growth Policy Analysis

    OpenAIRE

    Pierre-Richard Agénor

    2012-01-01

    This paper develops a computable Overlapping Generations (OLG) model for gender and growth policy analysis. The model accounts for human and physical capital accumulation (both public and private), intra- and inter-generational health persistence, fertility choices, and women's time allocation between market work, child rearing, and home production. Bargaining between spouses and gender bias, in the form of discrimination in the work place and mothers' time allocation between daughters and so...

  17. Computer vision analysis captures atypical attention in toddlers with autism.

    Science.gov (United States)

    Campbell, Kathleen; Carpenter, Kimberly Lh; Hashemi, Jordan; Espinosa, Steven; Marsan, Samuel; Borg, Jana Schaich; Chang, Zhuoqing; Qiu, Qiang; Vermeer, Saritha; Adler, Elizabeth; Tepper, Mariano; Egger, Helen L; Baker, Jeffery P; Sapiro, Guillermo; Dawson, Geraldine

    2018-03-01

    To demonstrate the capability of computer vision analysis to detect atypical orienting and attention behaviors in toddlers with autism spectrum disorder. One hundred and four toddlers aged 16-31 months (mean = 22) participated in this study. Twenty-two of the toddlers had autism spectrum disorder and 82 had typical development or developmental delay. Toddlers watched video stimuli on a tablet while the built-in camera recorded their head movement. Computer vision analysis measured participants' attention and orienting in response to name calls. Reliability of the computer vision analysis algorithm was tested against a human rater. Differences in behavior were analyzed between the autism spectrum disorder group and the comparison group. Reliability between computer vision analysis and human coding for orienting to name was excellent (intra-class coefficient 0.84, 95% confidence interval 0.67-0.91). Only 8% of toddlers with autism spectrum disorder oriented to name calling on >1 trial, compared to 63% of toddlers in the comparison group (p = 0.002). Mean latency to orient was significantly longer for toddlers with autism spectrum disorder (2.02 vs 1.06 s, p = 0.04). Sensitivity for autism spectrum disorder of atypical orienting was 96% and specificity was 38%. Older toddlers with autism spectrum disorder showed less attention to the videos overall (p = 0.03). Automated coding offers a reliable, quantitative method for detecting atypical social orienting and reduced sustained attention in toddlers with autism spectrum disorder.

  18. Analysis of diabetic retinopathy biomarker VEGF gene by computational approaches

    OpenAIRE

    Jayashree Sadasivam; N Ramesh; K Vijayalakshmi; Vinni Viridi; Shiva prasad

    2012-01-01

    Diabetic retinopathy, the most common diabetic eye disease, is caused by changes in the blood vessels of the retina, which remain the major cause. It is characterized by vascular permeability and increased tissue ischemia and angiogenesis. One of the biomarkers for diabetic retinopathy has been identified as the Vascular Endothelial Growth Factor (VEGF) gene by computational analysis. VEGF is a sub-family of growth factors, the platelet-derived growth factor family of cystine-knot growth factors...

  19. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer based system for data acquisition, analysis and graphic displays relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  20. Vector Field Visual Data Analysis Technologies for Petascale Computational Science

    Energy Technology Data Exchange (ETDEWEB)

    Garth, Christoph; Deines, Eduard; Joy, Kenneth I.; Bethel, E. Wes; Childs, Hank; Weber, Gunther; Ahern, Sean; Pugmire, Dave; Sanderson, Allen; Johnson, Chris

    2009-11-13

    State-of-the-art computational science simulations generate large-scale vector field data sets. Visualization and analysis is a key aspect of obtaining insight into these data sets and represents an important challenge. This article discusses possibilities and challenges of modern vector field visualization and focuses on methods and techniques developed in the SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) and deployed in the open-source visualization tool, VisIt.

  1. Nonlinear dynamics of reaction-diffusion systems: Analysis and computations

    International Nuclear Information System (INIS)

    Wilhelmsson, H.

    1991-01-01

    Equilibria and dynamics of reaction-diffusion systems are studied by means of analysis and computations based on a central expansions method for radially symmetric as well as angularly asymmetric distributions. Effects of boundary conditions are included. The interplay between the different processes in the evolution of the system is considered. The investigation provides a unified description in one, two and three dimensions. A particular application concerns the time evolution of temperature profiles in a fusion reactor plasma. (au)
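
    The central-expansion method used in the paper is analytical and not reproduced here; the short Python sketch below instead integrates a one-dimensional reaction-diffusion equation with a simple explicit finite-difference scheme, illustrating the interplay between diffusion and a nonlinear reaction term of the kind discussed.

      # Explicit finite-difference sketch of dT/dt = D d2T/dx2 + a*T - b*T**3 in 1-D,
      # with fixed (absorbing) boundaries; parameters are illustrative.
      import numpy as np

      D, a, b = 1.0, 1.0, 1.0
      nx, L = 101, 20.0
      dx = L / (nx - 1)
      dt = 0.2 * dx ** 2 / D                 # stable explicit time step

      x = np.linspace(0, L, nx)
      T = np.exp(-((x - L / 2) ** 2))        # initial localized temperature profile

      for _ in range(5000):
          lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx ** 2
          T = T + dt * (D * lap + a * T - b * T ** 3)
          T[0] = T[-1] = 0.0                 # fixed boundary values

      print(f"profile saturates near sqrt(a/b): max T = {T.max():.3f}")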

  2. Ontology-based metrics computation for business process analysis

    OpenAIRE

    Pedrinaci C.; Domingue J.

    2009-01-01

    Business Process Management (BPM) aims to support the whole life-cycle necessary to deploy and maintain business processes in organisations. Crucial within the BPM lifecycle is the analysis of deployed processes. Analysing business processes requires computing metrics that can help determine the health of business activities and thus the whole enterprise. However, the degree of automation currently achieved cannot support the level of reactivity and adaptation demanded by businesses. In thi...

  3. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which is principally concerned with time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. For problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that works with the mechanical model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperatures for long periods of time. Special considerations are crucial if the analysis is to be extended to the large strain regime, where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development, taking into account the various requisites stated above. (Auth.)

  4. Current topics in pure and computational complex analysis

    CERN Document Server

    Dorff, Michael; Lahiri, Indrajit

    2014-01-01

    The book contains 13 articles, some of which are survey articles and others research papers. Written by eminent mathematicians, these articles were presented at the International Workshop on Complex Analysis and Its Applications held at Walchand College of Engineering, Sangli. All the contributing authors are actively engaged in research fields related to the topic of the book. The workshop offered a comprehensive exposition of the recent developments in geometric functions theory, planar harmonic mappings, entire and meromorphic functions and their applications, both theoretical and computational. The recent developments in complex analysis and its applications play a crucial role in research in many disciplines.

  5. Trend Analysis of the Brazilian Scientific Production in Computer Science

    Directory of Open Access Journals (Sweden)

    TRUCOLO, C. C.

    2014-12-01

    Full Text Available The growth in the volume and diversity of scientific information brings new challenges in understanding the reasons, the process and the real essence that propel this growth. This information can be used as the basis for the development of strategies and public policies to improve education and innovation services. Trend analysis is one step in this direction. In this work, a trend analysis of the Brazilian scientific production of graduate programs in the computer science area is made to identify the main subjects studied by these programs, both collectively and individually.

  6. Analysis and computation of microstructure in finite plasticity

    CERN Document Server

    Hackl, Klaus

    2015-01-01

    This book addresses the need for a fundamental understanding of the physical origin, the mathematical behavior, and the numerical treatment of models which include microstructure. Leading scientists present their efforts involving mathematical analysis, numerical analysis, computational mechanics, material modelling and experiment. The mathematical analyses are based on methods from the calculus of variations, while in the numerical implementation global optimization algorithms play a central role. The modeling covers all length scales, from the atomic structure up to macroscopic samples. The development of the models was guided by experiments on single crystals and polycrystals, and results are checked against experimental data.

  7. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  8. Shielding analysis methods available in the scale computational system

    Energy Technology Data Exchange (ETDEWEB)

    Parks, C.V.; Tang, J.S.; Hermann, O.W.; Bucholz, J.A.; Emmett, M.B.

    1986-01-01

    Computational tools have been included in the SCALE system to allow shielding analysis to be performed using both discrete-ordinates and Monte Carlo techniques. One-dimensional discrete ordinates analyses are performed with the XSDRNPM-S module, and point dose rates outside the shield are calculated with the XSDOSE module. Multidimensional analyses are performed with the MORSE-SGC/S Monte Carlo module. This paper will review the above modules and the four Shielding Analysis Sequences (SAS) developed for the SCALE system. 7 refs., 8 figs.

  9. ASAS: Computational code for Analysis and Simulation of Atomic Spectra

    Directory of Open Access Journals (Sweden)

    Jhonatha R. dos Santos

    2017-01-01

    Full Text Available The laser isotopic separation process is based on the selective photoionization principle and, because of this, it is necessary to know the absorption spectrum of the desired atom. Computational resources have become indispensable for the planning of experiments and the analysis of the acquired data. The ASAS (Analysis and Simulation of Atomic Spectra) software presented here is a helpful tool for studies involving atomic spectroscopy. The input for the simulations is user-friendly and essentially requires only a database containing the energy levels and spectral lines of the atoms to be studied.

  10. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  11. A computational network analysis based on targets of antipsychotic agents.

    Science.gov (United States)

    Gao, Lei; Feng, Shuo; Liu, Zhao-Yuan; Wang, Jiu-Qiang; Qi, Ke-Ke; Wang, Kai

    2018-03-01

    Currently, numerous antipsychotic agents have been developed in the area of pharmacological treatment of schizophrenia. However, the molecular mechanisms underlying the multiple targets of antipsychotics have yet to be explored. In this study we performed a computational network analysis based on targets of antipsychotic agents. We retrieved a total of 96 targets from 56 antipsychotic agents. By expression enrichment analysis, we identified that the expression of antipsychotic target genes was significantly enriched in liver, brain, blood and corpus striatum. By protein-protein interaction (PPI) network analysis, a PPI network with 77 significantly interconnected target genes was generated. By historeceptomics analysis, significant brain-region-specific target-drug interactions were identified for dopamine receptor targets (e.g., DRD1-Olanzapine in the caudate nucleus and pons). This study provides a network view of antipsychotic targets and insights into the molecular mechanism of antipsychotic agents. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Giraffe, a Computer Assisted Instruction Programme.

    Science.gov (United States)

    Boekhorst, Albert K.; Groot, Tineke

    In 1989 a two year collaborative project, CAI (Computer Assisted Instruction) & Humanities, was initiated between the Faculty of Arts and IBM Netherlands during which General Information Retrieval All Faculties For Bibliographic Education (GIRAFFE), a program for the retrieval of information on general bibliographies, was developed. The…

  13. Innovation in nursing education: development of computer-assisted thinking.

    Science.gov (United States)

    Kanai-Pak, M; Hosoi, R; Arai, C; Ishii, Y; Seki, M; Kikuchi, Y; Kabasawa, K; Sato, K

    1997-01-01

    In order to enhance students' active thinking, faculty members at the International University of Health and Welfare developed the CAT (Computer Assisted Thinking) program. The CAT program is different from CAI (Computer Assisted Instruction), which mainly asks users to choose correct answers. Instead, the CAT program asks users to type in short sentences. There are two functions in the CAT program: one is to keep a log of the students' actions each time they use the program, and the other is to serve as a medical dictionary. An analysis of the action log revealed that the students demonstrated little skill in inferential thinking. Their observations were very concrete. In order to help the students develop their abstract thinking skills, we need to review our curriculum.

  14. Analysis of multigrid methods on massively parallel computers: Architectural implications

    Science.gov (United States)

    Matheson, Lesley R.; Tarjan, Robert E.

    1993-01-01

    We study the potential performance of multigrid algorithms running on massively parallel computers with the intent of discovering whether presently envisioned machines will provide an efficient platform for such algorithms. We consider the domain parallel version of the standard V cycle algorithm on model problems, discretized using finite difference techniques in two and three dimensions on block structured grids of size 10(exp 6) and 10(exp 9), respectively. Our models of parallel computation were developed to reflect the computing characteristics of the current generation of massively parallel multicomputers. These models are based on an interconnection network of 256 to 16,384 message-passing, 'workstation size' processors executing in an SPMD mode. The first model accomplishes interprocessor communications through a multistage permutation network. The communication cost is a logarithmic function which is similar to the costs in a variety of different topologies. The second model allows single stage communication costs only. Both models were designed with information provided by machine developers and utilize implementation derived parameters. With the medium grain parallelism of the current generation and the high fixed cost of an interprocessor communication, our analysis suggests that an efficient implementation requires the machine to support the efficient transmission of long messages (up to 1000 words), or the high initiation cost of a communication must be significantly reduced through an alternative optimization technique. Furthermore, with variable length message capability, our analysis suggests that the low-diameter multistage networks provide little or no advantage over a simple single stage communications network.

  15. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    Full Text Available A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource for developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and their spatial evolution to be followed over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that best matches experimental data. Our computational analysis produces for the first time a two-dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulation taking tissue movement into account and to investigate the PD patterning hypothesis.

  16. Computer image analysis of etched tracks from ionizing radiation

    Science.gov (United States)

    Blanford, George E.

    1994-01-01

    I proposed to continue a cooperative research project with Dr. David S. McKay concerning image analysis of tracks. Last summer we showed that we could measure track densities using the Oxford Instruments eXL computer and software that is attached to an ISI scanning electron microscope (SEM) located in building 31 at JSC. To reduce the dependence on JSC equipment, we proposed to transfer the SEM images to UHCL for analysis. Last summer we developed techniques to use digitized scanning electron micrographs and computer image analysis programs to measure track densities in lunar soil grains. Tracks were formed by highly ionizing solar energetic particles and cosmic rays during near surface exposure on the Moon. The track densities are related to the exposure conditions (depth and time). Distributions of the number of grains as a function of their track densities can reveal the modality of soil maturation. As part of a consortium effort to better understand the maturation of lunar soil and its relation to its infrared reflectance properties, we worked on lunar samples 67701,205 and 61221,134. These samples were etched for a shorter time (6 hours) than last summer's sample and this difference has presented problems for establishing the correct analysis conditions. We used computer counting and measurement of area to obtain preliminary track densities and a track density distribution that we could interpret for sample 67701,205. This sample is a submature soil consisting of approximately 85 percent mature soil mixed with approximately 15 percent immature, but not pristine, soil.
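
    As an illustration of the kind of measurement described above, the sketch below thresholds a digitized micrograph, labels connected dark features as candidate tracks, and divides the count by the imaged area. It is only a hedged sketch: the threshold, the minimum feature size and the field-of-view area are illustrative assumptions, not values from the study.

    ```python
    # Sketch: estimate etched-track density from a digitized SEM image.
    # Assumes tracks appear darker than the surrounding grain surface; the
    # threshold and the field-of-view area are illustrative values.
    import numpy as np
    from scipy import ndimage

    def track_density(image, threshold, field_area_cm2, min_pixels=5):
        """Count track-like features and return tracks per cm^2."""
        mask = image < threshold                  # dark etched pits
        labels, n_features = ndimage.label(mask)  # connected components
        sizes = ndimage.sum(mask, labels, index=range(1, n_features + 1))
        n_tracks = int(np.sum(np.asarray(sizes) >= min_pixels))  # reject single-pixel noise
        return n_tracks / field_area_cm2

    # Example with synthetic data (a real analysis would load the SEM image):
    rng = np.random.default_rng(0)
    img = rng.normal(200.0, 10.0, size=(512, 512))
    img[100:105, 100:105] = 50.0                  # one artificial "track"
    print(track_density(img, threshold=100.0, field_area_cm2=1e-6))
    ```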

  17. Computational analysis of RNA structures with chemical probing data.

    Science.gov (United States)

    Ge, Ping; Zhang, Shaojie

    2015-06-01

    RNAs play various roles, not only as the genetic codes to synthesize proteins, but also as the direct participants of biological functions determined by their underlying high-order structures. Although many computational methods have been proposed for analyzing RNA structures, their accuracy and efficiency are limited, especially when applied to the large RNAs and the genome-wide data sets. Recently, advances in parallel sequencing and high-throughput chemical probing technologies have prompted the development of numerous new algorithms, which can incorporate the auxiliary structural information obtained from those experiments. Their potential has been revealed by the secondary structure prediction of ribosomal RNAs and the genome-wide ncRNA function annotation. In this review, the existing probing-directed computational methods for RNA secondary and tertiary structure analysis are discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbation theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly, man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab

  19. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerfu...

  20. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    Science.gov (United States)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.

  1. Assessing Computational Steps for CLIP-Seq Data Analysis

    Directory of Open Access Journals (Sweden)

    Qi Liu

    2015-01-01

    Full Text Available RNA-binding protein (RBP is a key player in regulating gene expression at the posttranscriptional level. CLIP-Seq, with the ability to provide a genome-wide map of protein-RNA interactions, has been increasingly used to decipher RBP-mediated posttranscriptional regulation. Generating highly reliable binding sites from CLIP-Seq requires not only stringent library preparation but also considerable computational efforts. Here we presented a first systematic evaluation of major computational steps for identifying RBP binding sites from CLIP-Seq data, including preprocessing, the choice of control samples, peak normalization, and motif discovery. We found that avoiding PCR amplification artifacts, normalizing to input RNA or mRNAseq, and defining the background model from control samples can reduce the bias introduced by RNA abundance and improve the quality of detected binding sites. Our findings can serve as a general guideline for CLIP experiments design and the comprehensive analysis of CLIP-Seq data.

  2. The Radiological Safety Analysis Computer Program (RSAC-5) user's manual

    International Nuclear Information System (INIS)

    Wenzel, D.R.

    1994-02-01

    The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods

  3. Computer code for general analysis of radon risks (GARR)

    International Nuclear Information System (INIS)

    Ginevan, M.

    1984-09-01

    This document presents a computer model for general analysis of radon risks that allows the user to specify a large number of possible models with a small number of simple commands. The model is written in a version of BASIC which conforms closely to the American National Standards Institute (ANSI) definition for minimal BASIC and thus is readily modified for use on a wide variety of computers and, in particular, microcomputers. Model capabilities include generation of single-year life tables from 5-year abridged data, calculation of multiple-decrement life tables for lung cancer for the general population, smokers, and nonsmokers, and a cohort lung cancer risk calculation that allows specification of the level and duration of radon exposure, the form of the risk model, and the specific population assumed at risk. 36 references, 8 figures, 7 tables

  4. Automatic computer analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Phillips, G.W.

    1979-01-01

    Techniques for the automatic computer analysis of high-resolution gamma-ray spectra for peak area and position are discussed. The computer program HYPERMET is reviewed. The importance of keeping user input simple and short is emphasized. Peak-search methods are discussed and compared for efficiency. A semiempirical peak-shape function is presented which gives a good fit to the variety of peak shapes and intensities that may be found in a spectrum. The importance of a residual search in locating and fitting multiple peaks is demonstrated. Finally, it is shown that a severe bias may be encountered when the usual least-squares fitting methods are applied to peaks with very low statistics, and methods for alleviating this are presented. 7 figures
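
    The record does not reproduce HYPERMET's algorithms, but the simplest form of the peak searches it discusses can be sketched as follows: flag a channel as a peak candidate when it is a local maximum and exceeds its local background by a few Poisson standard deviations. The window width and significance threshold below are illustrative assumptions, not HYPERMET parameters.

    ```python
    # Sketch: rudimentary peak search in a gamma-ray spectrum.
    # A channel is flagged when it is a local maximum and exceeds the local
    # background by k standard deviations (Poisson statistics assumed).
    import numpy as np

    def find_peaks(counts, window=10, k=3.0):
        counts = np.asarray(counts, dtype=float)
        peaks = []
        for i in range(window, len(counts) - window):
            neighbourhood = np.r_[counts[i - window:i], counts[i + 1:i + 1 + window]]
            background = neighbourhood.mean()
            sigma = np.sqrt(max(background, 1.0))          # Poisson estimate
            is_local_max = counts[i] == counts[i - window:i + window + 1].max()
            if is_local_max and counts[i] > background + k * sigma:
                peaks.append(i)
        return peaks
    ```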

  5. Computational methodology for ChIP-seq analysis

    Science.gov (United States)

    Shin, Hyunjin; Liu, Tao; Duan, Xikun; Zhang, Yong; Liu, X. Shirley

    2015-01-01

    Chromatin immunoprecipitation coupled with massive parallel sequencing (ChIP-seq) is a powerful technology to identify the genome-wide locations of DNA binding proteins such as transcription factors or modified histones. As more and more experimental laboratories are adopting ChIP-seq to unravel the transcriptional and epigenetic regulatory mechanisms, computational analyses of ChIP-seq also become increasingly comprehensive and sophisticated. In this article, we review current computational methodology for ChIP-seq analysis, recommend useful algorithms and workflows, and introduce quality control measures at different analytical steps. We also discuss how ChIP-seq could be integrated with other types of genomic assays, such as gene expression profiling and genome-wide association studies, to provide a more comprehensive view of gene regulatory mechanisms in important physiological and pathological processes. PMID:25741452

  6. Crystal structures of coordination polymers from CaI2 and proline

    Directory of Open Access Journals (Sweden)

    Kevin Lamberts

    2015-06-01

    Full Text Available Completing our reports concerning the reaction products from calcium halides and the amino acid proline, two different solids were found for the reaction of l- and dl-proline with CaI2. The enantiopure amino acid yields the one-dimensional coordination polymer catena-poly[[aqua-μ3-l-proline-tetra-μ2-l-proline-dicalcium] tetraiodide 1.7-hydrate], {[Ca2(C5H9NO2)5(H2O)]I4·1.7H2O}n, (1), with two independent Ca2+ cations in characteristic seven- and eightfold coordination. Five symmetry-independent zwitterionic l-proline molecules bridge the metal sites into a cationic polymer. Racemic proline forms with Ca2+ cations heterochiral chains of the one-dimensional polymer catena-poly[[diaquadi-μ2-dl-proline-calcium] diiodide], {[Ca(C5H9NO2)2(H2O)2]I2}n, (2). The centrosymmetric structure is built by one Ca2+ cation that is bridged towards its symmetry equivalents by two zwitterionic proline molecules. In both structures, the iodide ions remain non-coordinating and hydrogen bonds are formed between these counter-anions, the amino groups, and the coordinating and co-crystallized water molecules. While the overall composition of (1) and (2) is in line with other structures from calcium halides and amino acids, the diversity of the carboxylate coordination geometry is quite surprising.

  7. Analysis of pellet coating uniformity using a computer scanner.

    Science.gov (United States)

    Šibanc, Rok; Luštrik, Matevž; Dreu, Rok

    2017-11-30

    A fast method for pellet coating uniformity analysis, using a commercial computer scanner, was developed. The analysis of the individual particle coating thicknesses was based on using a transparent orange-colored coating layer deposited on white pellet cores. In addition to the analysis of the coating thickness, information on pellet size and shape was obtained as well. Particle-size-dependent coating thickness and particle-size-independent coating variability were calculated by combining the information on coating thickness and pellet size. Decoupling the sources of coating thickness variation is unique to the presented method. For each coating experiment around 10000 pellets were analyzed, giving results with a high statistical confidence. The proposed method was employed for the performance evaluation of a classical Wurster and a swirl-enhanced Wurster coater operated at different gap settings and air flow rates. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Overview of adaptive finite element analysis in computational geodynamics

    Science.gov (United States)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite element analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  9. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office (Figure 2 shows the number of events per month for 2012.) Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  11. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide

    2002-03-01

    The SHEAT code developed at the Japan Atomic Energy Research Institute is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. SHEAT was first developed as a version for large computers. In addition, a personal computer version was provided in 2001 to improve the operational efficiency and generality of the code. Earthquake hazard analysis, display and print functions can be performed through the Graphical User Interface. With the SHEAT for PC code, seismic hazard, which is defined as the annual exceedance frequency of occurrence of earthquake ground motions at various levels of intensity at a given site, is calculated by the following two steps, as with the large-computer version. One is the modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on the historical earthquake records, active fault data and expert judgment. The other is the calculation of the probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model taking into account its standard deviation. Then the seismic hazard at the site is calculated by summing the frequencies of ground motions from all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) an outline of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code, (2) the functions of the subprograms and the analytical models in them, (3) guidance on input and output data, (4) a sample run result, and (5) an operational manual. (author)
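
    A toy illustration of the two-step hazard calculation described above (not the SHEAT code itself): the seismicity model is reduced to a list of postulated sources with annual rates, a lognormal attenuation relation with its standard deviation gives the probability that each source exceeds a given ground-motion level, and the contributions are summed into an annual exceedance frequency. The attenuation coefficients and source parameters are illustrative assumptions.

    ```python
    # Sketch of a probabilistic seismic hazard sum (illustrative, not SHEAT).
    # Each source: (annual rate, magnitude, distance to site in km).
    # Attenuation: ln(PGA) = c0 + c1*M - c2*ln(R) with log-normal scatter sigma.
    import math

    def annual_exceedance(sources, pga_level, c0=-3.5, c1=1.0, c2=1.2, sigma=0.6):
        """Annual frequency that peak ground acceleration exceeds pga_level [g]."""
        freq = 0.0
        for rate, magnitude, distance in sources:
            mean_ln_pga = c0 + c1 * magnitude - c2 * math.log(distance)
            # Probability of exceedance from the log-normal scatter:
            z = (math.log(pga_level) - mean_ln_pga) / sigma
            p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))
            freq += rate * p_exceed
        return freq

    sources = [(0.02, 6.5, 30.0), (0.10, 5.5, 15.0)]   # hypothetical seismicity model
    print(annual_exceedance(sources, pga_level=0.2))
    ```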

  12. Changes of Benthic Macroinvertebrates in Thi Vai River and Cai Mep Estuaries Under Polluted Conditions with Industrial Wastewater

    Directory of Open Access Journals (Sweden)

    Huong Nguyen Thi Thanh

    2017-06-01

    Full Text Available Pollution in the Thi Vai River has spread rapidly over the last two decades, caused by wastewater from the industrial parks on the left bank of the Thi Vai River and Cai Mep Estuaries. Evaluation of changes in the benthic macroinvertebrates was necessary to identify the consequences of the industrial wastewater for the water quality and aquatic ecosystem of the Thi Vai River and Cai Mep Estuaries. In this study, benthic macroinvertebrates and water quality variables were investigated in the Thi Vai River and Cai Mep Estuaries, Southern Vietnam. The monitoring data on benthic macroinvertebrates and water quality parameters covered the period from 1989 to 2015 at 6 sampling sites in the Thi Vai River and Cai Mep Estuaries. Basic water quality parameters were also tested, including pH, dissolved oxygen (DO), total nitrogen, and total phosphorus. Biodiversity indices of benthic macroinvertebrates were applied for water quality assessment. The results showed that pH ranged from 6.4 to 7.6 during the monitoring. DO concentrations were between 0.20 and 6.70 mg/L. Concentrations of total nitrogen and total phosphorus ranged from 0.03 to 5.70 mg/L and 0.024 to 1.380 mg/L, respectively. The macroinvertebrate community in the study area consisted of 36 species of polychaeta, gastropoda, bivalvia, and crustacea, of which polychaeta were dominant in species number. Benthic macroinvertebrate density ranged from 0 to 2,746 individuals/m2, with the main dominant species Neanthes caudata, Prionospio malmgreni, Paraprionospio pinnata, Trichochaeta carica, Maldane sarsi, Capitella capitata, Terebellides stroemi, Euditylia polymorpha, Grandidierella lignorum, and Apseudes vietnamensis. The biodiversity index values during the monitoring characterized aquatic environmental conditions ranging from mesotrophic to polytrophic. Besides, species richness positively correlated with DO, total nitrogen, and total phosphorus. The results

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  14. Students' perceptions of a multimedia computer-aided instruction ...

    African Journals Online (AJOL)

    Objective. To develop an interactive multimedia-based computer-aided instruction (CAI) programme, to determine its educational worth and efficacy in a multicultural academic environment and to evaluate its usage by students with differing levels of computer literacy. Design. A prospective descriptive study evaluating ...

  15. Students' perceptions of a multimedia computer-aided instruction ...

    African Journals Online (AJOL)

    Objective. To develop an interactive multimedia-based computer-aided instruction (CAI) programme, to determine its educational worth and efficacy in a multicultural academic environment and to evaluate its usage by students with differing levels of computer literacy. Design. A prospective descriptive study evaluating pre-.

  16. Computed tomographic beam-hardening artefacts: mathematical characterization and analysis.

    Science.gov (United States)

    Park, Hyoung Suk; Chung, Yong Eun; Seo, Jin Keun

    2015-06-13

    This paper presents a mathematical characterization and analysis of beam-hardening artefacts in X-ray computed tomography (CT). In the field of dental and medical radiography, metal artefact reduction in CT is becoming increasingly important as artificial prostheses and metallic implants become more widespread in ageing populations. Metal artefacts are mainly caused by the beam-hardening of polychromatic X-ray photon beams, which causes a mismatch between the actual sinogram data and the data model, namely the Radon transform of the unknown attenuation distribution, used in the CT reconstruction algorithm. We investigate the beam-hardening factor through a mathematical analysis of the discrepancy between the data and the Radon transform of the attenuation distribution at a fixed energy level. Separation of cupping artefacts from beam-hardening artefacts allows the causes and effects of streaking artefacts to be analysed. Various computer simulations and experiments are performed to support our mathematical analysis. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
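
    A small numerical illustration of the mismatch discussed above, under simplified assumptions: for a polychromatic beam, -ln(I/I0) is no longer proportional to the path length through the material, whereas the Radon-transform data model assumes that it is. The two-energy spectrum and attenuation coefficients below are illustrative, not taken from the paper.

    ```python
    # Sketch: beam hardening makes -ln(I/I0) sub-linear in path length.
    # Two-component spectrum with energy-dependent attenuation (illustrative values).
    import numpy as np

    weights = np.array([0.6, 0.4])                 # photon fractions at two energies
    mu = np.array([0.40, 0.20])                    # attenuation coefficients [1/cm]

    lengths = np.linspace(0.0, 10.0, 6)            # path length through material [cm]
    poly = np.array([-np.log(np.sum(weights * np.exp(-mu * L))) for L in lengths])
    mono = np.sum(weights * mu) * lengths          # what the linear (Radon) model assumes

    for L, p, m in zip(lengths, poly, mono):
        print(f"L = {L:4.1f} cm   polychromatic: {p:5.2f}   linear model: {m:5.2f}")
    ```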

  17. Computational Fatigue Life Analysis of Carbon Fiber Laminate

    Science.gov (United States)

    Shastry, Shrimukhi G.; Chandrashekara, C. V., Dr.

    2018-02-01

    In the present scenario, many traditional materials are being replaced by composite materials because of their light weight and high strength. Industries such as the automotive and aerospace industries use composite materials for most of their components. Replacing components subjected to static or impact loads is less challenging than replacing components subjected to dynamic loading. Replacing components with composite materials demands many stages of parametric study. One such parametric study is the fatigue analysis of the composite material. This paper focuses on the fatigue life analysis of a composite material by using computational techniques. A composite plate with a hole at the center is considered for the study. The analysis is carried out on the (0°/90°/90°/90°/90°)s laminate sequence and the (45°/-45°)2s laminate sequence by using a computer script. The fatigue lives for both lay-up sequences are compared with each other. It is observed that, for the same material and geometry of the component, cross-ply laminates show better fatigue life than angle-ply laminates.

  18. RADTRAN 5: A computer code for transportation risk analysis

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, F.L.

    1991-01-01

    RADTRAN 5 is a computer code developed at Sandia National Laboratories (SNL) in Albuquerque, NM, to estimate the radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI Standard FORTRAN 77 and contains significant advances in the methodology for route-specific analysis first developed by SNL for RADTRAN 4 (Neuhauser and Kanipe, 1992). Like the previous RADTRAN codes, RADTRAN 5 contains two major modules for incident-free and accident risk analysis, respectively. All commercially important transportation modes may be analyzed with RADTRAN 5: highway by combination truck; highway by light-duty vehicle; rail; barge; ocean-going ship; cargo air; and passenger air

  19. Dynamical Analysis of a Computer Virus Model with Delays

    Directory of Open Access Journals (Sweden)

    Juan Liu

    2016-01-01

    Full Text Available An SIQR computer virus model with two delays is investigated in the present paper. The linear stability conditions are obtained by using the characteristic root method, and the developed asymptotic analysis shows that the onset of a Hopf bifurcation occurs when the delay parameter reaches a critical value. Moreover, the direction of the Hopf bifurcation and the stability of the bifurcating periodic solutions are investigated by using the normal form theory and the center manifold theorem. Finally, numerical investigations are carried out to show the feasibility of the theoretical results.
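
    The record does not reproduce the paper's equations; the sketch below only shows how delay terms of the kind described can be handled numerically with a history buffer in a generic SIQR-type compartment model. The model form, parameter values and delays are illustrative assumptions, not the system analysed in the paper.

    ```python
    # Sketch: forward-Euler integration of a generic SIQR model with two delays.
    # Constant initial history is assumed before the first integration step.
    import numpy as np

    dt, T = 0.01, 200.0
    tau1, tau2 = 2.0, 5.0                 # delays (illustrative)
    lam, beta, mu, delta, gamma, eps = 0.5, 0.002, 0.01, 0.1, 0.05, 0.08
    n = int(T / dt)
    d1, d2 = int(tau1 / dt), int(tau2 / dt)

    S = np.full(n, 400.0); I = np.full(n, 5.0); Q = np.zeros(n); R = np.zeros(n)
    for k in range(max(d1, d2), n - 1):
        infect = beta * S[k] * I[k - d1]          # delayed transmission term
        S[k+1] = S[k] + dt * (lam - infect - mu * S[k])
        I[k+1] = I[k] + dt * (infect - (mu + delta + gamma) * I[k])
        Q[k+1] = Q[k] + dt * (delta * I[k - d2] - (mu + eps) * Q[k])
        R[k+1] = R[k] + dt * (gamma * I[k] + eps * Q[k] - mu * R[k])
    print(S[-1], I[-1], Q[-1], R[-1])
    ```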

  20. Micro Computer Tomography for medical device and pharmaceutical packaging analysis.

    Science.gov (United States)

    Hindelang, Florine; Zurbach, Raphael; Roggo, Yves

    2015-04-10

    Biomedical device and medicine product manufacturing are long processes facing global competition. As technology evolves with time, the level of quality, safety and reliability increases simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing a deep investigation of products: it can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed CT suitability for verification of integrity, measurements and defect detections in a non-destructive manner. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Integrated computer codes for nuclear power plant severe accident analysis

    International Nuclear Information System (INIS)

    Jordanov, I.; Khristov, Y.

    1995-01-01

    This overview contains a description of the Modular Accident Analysis Program (MAAP), the ICARE computer code and the Source Term Code Package (STCP). STCP is used to model TMLB sample problems for the Zion Unit 1 and WWER-440/V-213 reactors. A comparison is made of the STCP implementation on VAX and IBM systems. In order to improve accuracy, a double-precision version of the MARCH-3 component of STCP was created and the overall thermal hydraulics was modelled. Results of modelling the containment pressure, debris temperature and hydrogen mass are presented. 5 refs., 10 figs., 2 tabs

  2. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  3. Proceedings of the 1982 Army Numerical Analysis and Computers Conference.

    Science.gov (United States)

    1982-08-01

    [Equations from the scanned proceedings are garbled in this record; only the citations survive:] time-dependent models (Cheng, et al., 1975; Leendertse and Liu, 197b; Sheng, 1975; Forristal, et al., 1977; and Sheng, et al., 1978), which are more general; "Procedures for Solving the Shallow-Water Equations in Transformed Coordinates," Proc. 1982 Army Numerical Analysis and Computer Conference (Cheng, R.T.).

  4. A computer program for automatic gamma-ray spectra analysis

    International Nuclear Information System (INIS)

    Hiromura, Kazuyuki

    1975-01-01

    A computer program for the automatic analysis of gamma-ray spectra obtained with a Ge(Li) detector is presented. The program includes a method of comparing successive values of the experimental data for automatic peak finding, and the method of least squares for peak fitting. The peak shape in the fitting routine is a 'modified Gaussian', which consists of two different Gaussians with the same height joined at the centroid. A quadratic form is chosen as the function representing the background. A maximum of four peaks can be treated in the fitting routine by the program. Some improvements under consideration are described. (auth.)
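
    A sketch of the fitting step described above: a 'modified Gaussian' built from two half-Gaussians of equal height joined at the centroid, placed on a quadratic background and fitted by least squares. The parameterisation, starting values and synthetic data are illustrative; the original program is not reproduced here.

    ```python
    # Sketch: least-squares fit of a "modified Gaussian" peak -- two half-Gaussians
    # of equal height joined at the centroid -- on a quadratic background.
    import numpy as np
    from scipy.optimize import curve_fit

    def modified_gaussian(x, height, centroid, sigma_left, sigma_right, a, b, c):
        sigma = np.where(x < centroid, sigma_left, sigma_right)
        peak = height * np.exp(-0.5 * ((x - centroid) / sigma) ** 2)
        return peak + a + b * x + c * x ** 2      # quadratic background

    # Synthetic demonstration data (a real analysis would use measured channel counts):
    np.random.seed(0)
    x = np.arange(0, 200, dtype=float)
    true = modified_gaussian(x, 500.0, 100.0, 4.0, 6.0, 20.0, 0.1, 0.0)
    y = np.random.poisson(true).astype(float)

    p0 = [y.max(), x[np.argmax(y)], 3.0, 3.0, y.min(), 0.0, 0.0]
    params, cov = curve_fit(modified_gaussian, x, y, p0=p0)
    print("centroid = %.2f, height = %.1f" % (params[1], params[0]))
    ```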

  5. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating their merits and weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  6. G-computation demonstration in causal mediation analysis.

    Science.gov (United States)

    Wang, Aolin; Arah, Onyebuchi A

    2015-10-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings.
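
    A minimal sketch of the parametric g-computation workflow described above, for natural direct and indirect effects with a binary exposure and continuous mediator and outcome, assuming correctly specified linear models. Variable names and the data-generating step are illustrative, and the bootstrap confidence intervals mentioned in the abstract are omitted for brevity.

    ```python
    # Sketch: parametric g-computation for natural direct/indirect effects
    # (binary exposure A, continuous mediator M and outcome Y; illustrative data).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 5000
    A = rng.binomial(1, 0.5, n)
    M = 1.0 + 0.8 * A + rng.normal(size=n)
    Y = 2.0 + 1.2 * A + 0.5 * M + rng.normal(size=n)

    # Fit the mediator and outcome models.
    m_fit = sm.OLS(M, sm.add_constant(A)).fit()
    y_fit = sm.OLS(Y, sm.add_constant(np.column_stack([A, M]))).fit()

    def simulate_outcome(a_exposure, a_mediator, draws=10000):
        """Mean potential outcome Y(a_exposure, M(a_mediator)) by Monte Carlo."""
        m_mean = m_fit.params[0] + m_fit.params[1] * a_mediator
        m_draw = rng.normal(m_mean, np.sqrt(m_fit.scale), draws)
        y_mean = y_fit.params[0] + y_fit.params[1] * a_exposure + y_fit.params[2] * m_draw
        return y_mean.mean()

    nde = simulate_outcome(1, 0) - simulate_outcome(0, 0)   # natural direct effect
    nie = simulate_outcome(1, 1) - simulate_outcome(1, 0)   # natural indirect effect
    print(f"NDE ~ {nde:.2f}, NIE ~ {nie:.2f}")
    ```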

  7. Cepstrum analysis and applications to computational fluid dynamic solutions

    Science.gov (United States)

    Meadows, Kristine R.

    1990-04-01

    A novel approach to the problem of spurious reflections introduced by artificial boundary conditions in computational fluid dynamic (CFD) solutions is proposed. Instead of attempting to derive non-reflecting boundary conditions, the approach is to accept the fact that spurious reflections occur, but to remove these reflections with cepstrum analysis, a signal processing technique which has been successfully used to remove echoes from experimental data. First, the theory of the cepstrum method is presented. This includes presentation of two types of cepstra: the power cepstrum and the complex cepstrum. The definitions of the cepstrum methods are applied theoretically and numerically to the analytical solution of sinusoidal plane wave propagation in a duct. One-D and 3-D time-dependent solutions to the Euler equations are computed, and hard-wall conditions are prescribed at the numerical boundaries. The cepstrum method is applied, and the reflections from the boundaries are removed from the solutions. One-D and 3-D solutions are computed with so-called non-reflecting boundary conditions, and these solutions are compared to those obtained by prescribing hard-wall conditions and processing with the cepstrum.
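
    A small numerical illustration of the power cepstrum idea mentioned above: an echo at delay d produces a peak in the cepstrum at quefrency d, which can then be identified (and, via the complex cepstrum, removed). The synthetic signal and echo parameters are illustrative; this is not the processing chain applied to the CFD solutions.

    ```python
    # Sketch: power cepstrum of a signal plus a delayed echo.
    # The echo shows up as a peak at quefrency = delay (in samples).
    import numpy as np

    n, delay, alpha = 1024, 100, 0.5
    rng = np.random.default_rng(2)
    s = rng.normal(size=n)                     # original signal
    x = s.copy()
    x[delay:] += alpha * s[:-delay]            # add an echo of relative amplitude alpha

    spectrum = np.abs(np.fft.rfft(x)) ** 2
    power_cepstrum = np.abs(np.fft.irfft(np.log(spectrum))) ** 2

    peak = np.argmax(power_cepstrum[10:n // 2]) + 10   # skip the low-quefrency region
    print("echo detected at quefrency =", peak, "samples")
    ```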

  8. G-computation demonstration in causal mediation analysis

    International Nuclear Information System (INIS)

    Wang, Aolin; Arah, Onyebuchi A.

    2015-01-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings

  9. Role of betaine:CoA ligase (CaiC) in the activation of betaines and the transfer of coenzyme A in Escherichia coli.

    Science.gov (United States)

    Bernal, V; Arense, P; Blatz, V; Mandrand-Berthelot, M A; Cánovas, M; Iborra, J L

    2008-07-01

    Characterization of the role of CaiC in the biotransformation of trimethylammonium compounds into l(-)-carnitine in Escherichia coli. The caiC gene was cloned and overexpressed in E. coli and its effect on the production of l(-)-carnitine was analysed. Betaine:CoA ligase and CoA transferase activities were analysed in cell free extracts and products were studied by electrospray mass spectrometry (ESI-MS). Substrate specificity of the caiC gene product was high, reflecting the high specialization of the carnitine pathway. Although CoA-transferase activity was also detected in vitro, the main in vivo role of CaiC was found to be the synthesis of betainyl-CoAs. Overexpression of CaiC allowed the biotransformation of crotonobetaine to l(-)-carnitine to be enhanced nearly 20-fold, the yield reaching up to 30% (with growing cells). Higher yields were obtained using resting cells (up to 60%), even when d(+)-carnitine was used as substrate. The expression of CaiC is a control step in the biotransformation of trimethylammonium compounds in E. coli. A bacterial betaine:CoA ligase has been characterized for the first time, underlining its important role for the production of l-carnitine with Escherichia coli.

  10. Titanium isotopes and rare earth patterns in CAIs: Evidence for thermal processing and gas-dust decoupling in the protoplanetary disk

    Science.gov (United States)

    Davis, Andrew M.; Zhang, Junjun; Greber, Nicolas D.; Hu, Jingya; Tissot, François L. H.; Dauphas, Nicolas

    2018-01-01

    Titanium isotopic compositions (mass-dependent fractionation and isotopic anomalies) were measured in 46 calcium-aluminum-rich inclusions (CAIs) from the Allende CV chondrite. After internal normalization to 49Ti/47Ti, we found that ε50Ti values are somewhat variable among CAIs, and that ε46Ti is highly correlated with ε50Ti, with a best-fit slope of 0.162 ± 0.030 (95% confidence interval). The linear correlation between ε46Ti and ε50Ti extends the same correlation seen among bulk solar objects (slope 0.184 ± 0.007). This observation provides constraints on dynamic mixing of the solar disk and has implications for the nucleosynthetic origin of titanium isotopes, specifically on the possible contributions from various types of supernovae to the solar system. Titanium isotopic mass fractionation, expressed as δ‧49Ti, was measured by both sample-standard bracketing and double-spiking. Most CAIs are isotopically unfractionated, within a 95% confidence interval of normal, but a few are significantly fractionated, and δ‧49Ti ranges from ∼-4 to ∼+4. Rare earth element patterns were measured in 37 of the CAIs. All CAIs with significant titanium mass fractionation effects have group II and related REE patterns, implying kinetically controlled volatility fractionation during the formation of these CAIs.

  11. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  12. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2006-12-01

    The development of a program to compute probabilistic seismic hazard has been completed, based on a Graphical User Interface (GUI). The main program consists of three parts - the data input process, the probabilistic seismic hazard analysis, and the result output process. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, the seismic zoning map, and the earthquake event catalog. The input procedures of previous programs, based on a text interface, took much time for data preparation, and the data could not be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize input errors as far as possible

  13. Structural characterisation of semiconductors by computer methods of image analysis

    Science.gov (United States)

    Hernández-Fenollosa, M. A.; Cuesta-Frau, D.; Damonte, L. C.; Satorre Aznar, M. A.

    2005-08-01

    Analysis of microscopic images for automatic particle detection and extraction is a field of growing interest in many scientific fields such as biology, medicine and physics. In this paper we present a method to analyze microscopic images of semiconductors in order to obtain, in a non-supervised way, the main characteristics of the sample under test: growing regions, grain sizes, dendrite morphology and homogenization. In particular, nanocrystalline semiconductors with dimensions less than 100 nm represent a relatively new class of materials. Their short-range structures are essentially the same as those of bulk semiconductors, but their optical and electronic properties are dramatically different. The images are obtained by scanning electron microscopy (SEM) and processed by the computer methods presented. Traditionally these tasks have been performed manually, which is time-consuming and subjective, in contrast to our computer analysis. The images acquired are first pre-processed in order to improve the signal-to-noise ratio and therefore the detection rate. Images are filtered by a weighted-median filter, and contrast is enhanced using histogram equalization. Then, images are thresholded using a binarization algorithm in such a way that growing regions are segmented. This segmentation is based on the different grey levels due to the different sample heights of the growing areas. Next, the resulting image is further processed to eliminate holes and spots left by the previous stage, and this image is used to compute the percentage of growing areas. Finally, using pattern recognition techniques (contour following and raster-to-vector transformation), single crystals are extracted to obtain their characteristics.
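
    A compact sketch of the kind of pipeline described above (median filtering, histogram equalization, thresholding, hole and spot removal, and region measurement), written with scikit-image. The function choices, structuring-element size and area thresholds are illustrative assumptions, not the authors' implementation.

    ```python
    # Sketch: non-supervised segmentation of growing regions in an SEM image.
    # Pipeline: median filter -> histogram equalization -> Otsu threshold ->
    # remove small holes/objects -> measure area fraction and grain sizes.
    from skimage import filters, exposure, morphology, measure

    def analyse_growth(image):
        smoothed = filters.median(image, morphology.disk(3))      # noise suppression
        equalized = exposure.equalize_hist(smoothed)              # contrast enhancement
        mask = equalized > filters.threshold_otsu(equalized)      # growing regions
        mask = morphology.remove_small_holes(mask, area_threshold=64)
        mask = morphology.remove_small_objects(mask, min_size=64)
        labels = measure.label(mask)
        areas = [r.area for r in measure.regionprops(labels)]     # grain sizes (pixels)
        coverage = mask.mean()                                    # fraction of growing area
        return coverage, areas
    ```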

  14. Computer-aided analysis of cutting processes for brittle materials

    Science.gov (United States)

    Ogorodnikov, A. I.; Tikhonov, I. N.

    2017-12-01

    This paper is focused on 3D computer simulation of cutting processes for brittle materials and silicon wafers. Computer-aided analysis of wafer scribing and dicing is carried out with the use of the ANSYS CAE (computer-aided engineering) software, and a parametric model of the processes is created by means of the internal ANSYS APDL programming language. Different types of tool tip geometry are analyzed to obtain internal stresses, such as a four-sided pyramid with an included angle of 120° and a tool inclination angle to the normal axis of 15°. The quality of the workpieces after cutting is studied by optical microscopy to verify the FE (finite-element) model. The disruption of the material structure during scribing occurs near the scratch and propagates into the wafer or over its surface at short range. The deformation area along the scratch looks like a ragged band, but the stress width is rather low. The theory of cutting brittle semiconductor and optical materials is developed on the basis of the advanced theory of metal turning. The decrease in stress intensity along the normal from the tip point to the scribe line can be predicted using the developed theory and the verified FE model. The crystal quality and the dimensions of defects are determined by the mechanics of scratching, which depends on the shape of the diamond tip, the scratching direction, the velocity of the cutting tool and the applied force loads. The disruption is a rate-sensitive process and depends on the cutting thickness. The application of numerical techniques, such as FE analysis, to cutting problems enhances understanding and promotes the further development of existing machining technologies.

  15. Development Of The Computer Code For Comparative Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Purwadi, Mohammad Dhandhang

    2001-01-01

    Qualitative and quantitative chemical analysis with Neutron Activation Analysis (NAA) is an important utilization of a nuclear research reactor, and its application and development should be accelerated and promoted to raise the utilization of the reactor. The application of the comparative NAA technique in the GA Siwabessy Multi Purpose Reactor (RSG-GAS) needs special software (not yet commercially available) for analyzing the spectra of multiple elements in one analysis. Previously, the analysis was carried out using a single-spectrum analyzer and comparing each result manually, which degrades the quality of the analysis significantly. To solve the problem, a computer code was designed and developed for comparative NAA. Spectrum analysis in the code is carried out using a non-linear fitting method. Before being analyzed, the spectrum is passed through a numerical filter that improves the signal-to-noise ratio for the deconvolution operation. The software was developed using the G language and named PASAN-K. The developed software was benchmarked against the IAEA spectrum and operated well, with less than 10% deviation.
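
    The non-linear fitting step described above can be illustrated with a generic single-peak fit (a Gaussian photopeak on a linear background) using non-linear least squares; this is a sketch of the general technique, not the PASAN-K implementation, and the channel range and parameters are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        def peak_model(channel, amplitude, centroid, sigma, bg_slope, bg_offset):
            """Gaussian photopeak sitting on a linear background."""
            gauss = amplitude * np.exp(-0.5 * ((channel - centroid) / sigma) ** 2)
            return gauss + bg_slope * channel + bg_offset

        # Synthetic spectrum segment around a single photopeak (counts versus channel).
        channels = np.arange(480, 560)
        counts = np.random.poisson(peak_model(channels, 900.0, 520.0, 3.0, -0.2, 150.0))

        p0 = [counts.max(), channels[np.argmax(counts)], 2.0, 0.0, float(counts.min())]
        popt, pcov = curve_fit(peak_model, channels, counts, p0=p0,
                               sigma=np.sqrt(np.maximum(counts, 1)))
        net_area = popt[0] * abs(popt[2]) * np.sqrt(2.0 * np.pi)    # net peak area
        print(popt, net_area)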

  16. Computational design analysis for deployment of cardiovascular stents

    International Nuclear Information System (INIS)

    Tammareddi, Sriram; Sun Guangyong; Li Qing

    2010-01-01

    Cardiovascular disease has become a major global healthcare problem. As one of the relatively new medical devices, stents offer a minimally-invasive surgical strategy to improve the quality of life for numerous cardiovascular disease patients. One of the key associative issues has been to understand the effect of stent structures on their deployment behaviour. This paper aims to develop a computational model for exploring the biomechanical responses to changes in stent geometrical parameters, namely the strut thickness and cross-link width of the Palmaz-Schatz stent. Explicit 3D dynamic finite element analysis was carried out to explore the sensitivity of deployment performance, such as dog-boning, fore-shortening, and stent deformation over the load cycle, to these geometrical parameters. It has been found that an increase in stent thickness causes a sizeable rise in the load required to deform the stent to its target diameter, whilst reducing maximum dog-boning in the stent. An increase in the cross-link width required no change in the load needed to deform the stent to its target diameter, showed no apparent correlation with dog-boning, but increased fore-shortening. The computational modelling and analysis presented herein proves an effective way to refine or optimise the design of stent structures.

  17. A compendium of computer codes in fault tree analysis

    International Nuclear Information System (INIS)

    Lydell, B.

    1981-03-01

    In the past ten years principles and methods for a unified system reliability and safety analysis have been developed. Fault tree techniques serve as a central feature of unified system analysis, and there exists a specific discipline within system reliability concerned with the theoretical aspects of fault tree evaluation. Ever since the fault tree concept was established, computer codes have been developed for qualitative and quantitative analyses. In particular the presentation of the kinetic tree theory and the PREP-KITT code package has influenced the present use of fault trees and the development of new computer codes. This report is a compilation of some of the better known fault tree codes in use in system reliability. Numerous codes are available and new codes are continuously being developed. The report is designed to address the specific characteristics of each code listed. A review of the theoretical aspects of fault tree evaluation is presented in an introductory chapter, the purpose of which is to give a framework for the validity of the different codes. (Auth.)
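
    As a hedged illustration of the quantitative evaluation such codes automate, the sketch below computes a top-event probability from minimal cut sets, both by the rare-event approximation and by exact inclusion-exclusion for a small tree; the cut sets and basic-event probabilities are hypothetical and the basic events are assumed independent.

        from itertools import combinations

        # Minimal cut sets of a small, hypothetical fault tree and the (assumed independent)
        # failure probabilities of its basic events.
        cut_sets = [{"A", "B"}, {"C"}, {"A", "D"}]
        prob = {"A": 1e-2, "B": 5e-3, "C": 1e-4, "D": 2e-3}

        def event_set_prob(events):
            p = 1.0
            for event in events:
                p *= prob[event]
            return p

        # Rare-event approximation: sum of the cut-set probabilities (an upper bound).
        upper_bound = sum(event_set_prob(cs) for cs in cut_sets)

        # Exact top-event probability by inclusion-exclusion (feasible only for small trees).
        exact = 0.0
        for k in range(1, len(cut_sets) + 1):
            for combo in combinations(cut_sets, k):
                union = set().union(*combo)
                exact += (-1) ** (k + 1) * event_set_prob(union)

        print(upper_bound, exact)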

  18. Computational design analysis for deployment of cardiovascular stents

    Science.gov (United States)

    Tammareddi, Sriram; Sun, Guangyong; Li, Qing

    2010-06-01

    Cardiovascular disease has become a major global healthcare problem. As one of the relatively new medical devices, stents offer a minimally-invasive surgical strategy to improve the quality of life for numerous cardiovascular disease patients. One of the key associative issues has been to understand the effect of stent structures on their deployment behaviour. This paper aims to develop a computational model for exploring the biomechanical responses to changes in stent geometrical parameters, namely the strut thickness and cross-link width of the Palmaz-Schatz stent. Explicit 3D dynamic finite element analysis was carried out to explore the sensitivity of deployment performance, such as dog-boning, fore-shortening, and stent deformation over the load cycle, to these geometrical parameters. It has been found that an increase in stent thickness causes a sizeable rise in the load required to deform the stent to its target diameter, whilst reducing maximum dog-boning in the stent. An increase in the cross-link width required no change in the load needed to deform the stent to its target diameter, showed no apparent correlation with dog-boning, but increased fore-shortening. The computational modelling and analysis presented herein proves an effective way to refine or optimise the design of stent structures.

  19. Compendium of computer codes for the safety analysis of LMFBR's

    International Nuclear Information System (INIS)

    1975-06-01

    A high level of mathematical sophistication is required in the safety analysis of LMFBR's to adequately meet the demands for realism and confidence in all areas of accident consequence evaluation. The numerical solution procedures associated with these analyses are generally so complex and time consuming as to necessitate their programming into computer codes. These computer codes have become extremely powerful tools for safety analysis, combining unique advantages in accuracy, speed and cost. The number, diversity and complexity of LMFBR safety codes in the U. S. has grown rapidly in recent years. It is estimated that over 100 such codes exist in various stages of development throughout the country. It is inevitable that such a large assortment of codes will require rigorous cataloguing and abstracting to aid individuals in identifying what is available. It is the purpose of this compendium to provide such a service through the compilation of code summaries which describe and clarify the status of domestic LMFBR safety codes. (U.S.)

  20. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopic analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are widely used and are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed lightweight, user-friendly software, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points per unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
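
    The kind of metrics AngioTool reports can be approximated from a binary vessel mask with standard image-processing tools; the sketch below (not AngioTool's own implementation) estimates the vessel area fraction, a crude total vessel length from the skeleton, and the branching index as branch points per unit area.

        import numpy as np
        from scipy import ndimage
        from skimage.morphology import skeletonize

        def vascular_metrics(vessel_mask, pixel_size_um=1.0):
            """Vessel area fraction, total vessel length and branching index from a binary mask."""
            mask = vessel_mask.astype(bool)
            area_fraction = mask.mean()                        # vessel area / image area

            skeleton = skeletonize(mask)
            total_length_um = skeleton.sum() * pixel_size_um   # crude length estimate

            # Branch points: skeleton pixels with more than two skeleton neighbours.
            neighbours = ndimage.convolve(skeleton.astype(int), np.ones((3, 3), dtype=int),
                                          mode="constant") - skeleton.astype(int)
            branch_points = skeleton & (neighbours > 2)

            image_area_um2 = mask.size * pixel_size_um ** 2
            branching_index = branch_points.sum() / image_area_um2
            return area_fraction, total_length_um, branching_index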

  1. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), that is, of the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of sup(99m)Tc-Sn-colloid. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis of liver functions. Information was obtained every quarter minute over a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow as the disappearance rate multiplied by the percentage of hepatic uptake, defined as (liver counts)/(total counts of the field). Our method of analysis automatically recorded the disappearance curve and the uptake curve on the basis of the heart and whole-liver counts, respectively, and the computation was carried out in BASIC. This method makes it possible to obtain an image of the initial uptake of sup(99m)Tc-Sn-colloid into the liver with a small dose. (author)
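
    The quoted ratio and the product it enters can be written out in a few lines; the sketch below assumes simple mono-exponential behaviour of the heart curve and does not reproduce the per-minute initial-slope-index calculation of the paper.

        import numpy as np

        def effective_hepatic_blood_flow(heart_counts, liver_counts, total_counts,
                                         frame_interval_min=0.25):
            """Disappearance rate (from the heart curve) times the hepatic uptake fraction."""
            t = np.arange(len(heart_counts)) * frame_interval_min
            # Disappearance rate coefficient from a log-linear fit of the heart time-activity curve.
            k_disappearance = -np.polyfit(t, np.log(heart_counts), 1)[0]
            # Hepatic uptake fraction: (liver counts) / (total counts of the field).
            uptake_fraction = liver_counts[-1] / total_counts[-1]
            return k_disappearance * uptake_fraction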

  2. Computer vision inspection of rice seed quality with discriminant analysis

    Science.gov (United States)

    Cheng, Fang; Ying, Yibin

    2004-10-01

    This study was undertaken to develop computer vision-based rice seed inspection technology for quality control. Color image classification using a discriminant analysis algorithm to identify germinated rice seeds was successfully implemented. The hybrid rice seed cultivars involved were Jinyou402, Shanyou10, Zhongyou207 and Jiayou99. Sixteen morphological features and six color features were extracted from sample images belonging to the training sets. The color feature 'Huebmean' shows the strongest classification ability among all the features. Computed as the area of the seed region divided by the area of the smallest convex polygon that can contain the seed region, the feature 'Solidity' outperforms the other morphological features in recognizing germinated seeds. Combining the two features 'Huebmean' and 'Solidity', discriminant analysis was used to classify normal rice seeds and seeds germinated on the panicle. Results show that the algorithm achieved an overall average accuracy of 98.4% for both normal and germinated seeds in all cultivars. The combination of 'Huebmean' and 'Solidity' proved to be a good indicator of germinated seeds. The simple discriminant algorithm using just two features shows high accuracy and good adaptability.
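
    A two-feature discriminant classification of this kind can be sketched with scikit-learn; the 'Huebmean' and 'Solidity' values below are invented for illustration and are not the study's data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Hypothetical training data: one row per seed, columns = (Huebmean, Solidity).
        X_train = np.array([[0.12, 0.97], [0.10, 0.96], [0.11, 0.98],    # normal seeds
                            [0.25, 0.88], [0.27, 0.85], [0.24, 0.87]])   # germinated seeds
        y_train = np.array([0, 0, 0, 1, 1, 1])    # 0 = normal, 1 = germinated on the panicle

        lda = LinearDiscriminantAnalysis()
        lda.fit(X_train, y_train)

        X_new = np.array([[0.13, 0.95], [0.26, 0.86]])
        print(lda.predict(X_new))           # predicted class labels
        print(lda.predict_proba(X_new))     # class membership probabilities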

  3. Nuclear power reactor analysis, methods, algorithms and computer programs

    International Nuclear Information System (INIS)

    Matausek, M.V

    1981-01-01

    Full text: For a developing country buying its first nuclear power plants from a foreign supplier, regardless of the type and scope of the contract, there is a certain number of activities which have to be performed by local staff and domestic organizations. This particularly applies to the choice of the nuclear fuel cycle strategy and the choice of the type and size of the reactors, to bid parameter specification, bid evaluation and final safety analysis report evaluation, as well as to in-core fuel management activities. In the Nuclear Engineering Department of the Boris Kidric Institute of Nuclear Sciences (NET IBK) continual work is going on, related to the following topics: cross section and resonance integral calculations, spectrum calculations, generation of group constants, lattice and cell problems, criticality and global power distribution search, fuel burnup analysis, in-core fuel management procedures, cost analysis and power plant economics, safety and accident analysis, shielding problems and environmental impact studies, etc. The present paper gives the details of the methods developed and the results achieved, with particular emphasis on the NET IBK computer program package for the needs of planning, construction and operation of nuclear power plants. The main problems encountered so far were related to the small working team, the lack of large and powerful computers, the absence of reliable basic nuclear data and the shortage of experimental and empirical results for testing theoretical models. Some of these difficulties have been overcome thanks to bilateral and multilateral cooperation with developed countries, mostly through the IAEA. It is the author's opinion, however, that mutual cooperation of developing countries, having similar problems and similar goals, could lead to significant results. Some activities of this kind are suggested and discussed. (author)

  4. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow, Rutgers University/Georgia Institute of Technology, Michael Schatz, Georgia Institute of Technology, William Kalies, Florida Atlantic University, Thomas Wanner,George Mason University

    2010-05-19

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure

  5. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow; Michael Schatz; William Kalies; Thomas Wanner

    2010-05-24

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure

  6. Computer-Aided analysis of human esophageal peristalsis. I. Technical description and comparison with manual analysis.

    Science.gov (United States)

    Castell, D O; Dubois, A; Davis, C R; Cordova, C M; Norman, D O

    1984-01-01

    Manual and computer analyses of esophageal peristaltic activity induced by swallows of 5 ml water were compared in 6 healthy subjects under basal conditions and following i.v. injection of 4 pharmacological agents: edrophonium (E, 0.08 mg/kg), atropine (A, 0.6 mg), pentagastrin (PG, 0.6 mcg/kg), and glucagon (GL, 1 mcg). Esophageal manometry was performed using a low-compliance perfusion system and recorded on paper for standard manual analysis. The signal was concurrently taped on an analog recorder for subsequent digitization and analysis on a PDP-11 computer using a locally developed program. With both methods we determined the wave amplitude, duration, average upward slope (dP/dT), and velocity of wave progression. In addition, the computer allowed calculation of the area under each wave and the maximum upward slope (Max dP/dT). We found no significant difference between the two methods for any of the parameters measured by both. Wave amplitude was significantly increased by E and significantly decreased by A. Average upward slope was decreased and velocity was significantly increased only by A. Computer-calculated wave area and Max dP/dT were significantly changed by both E and A. PG and GL had no effect on any of the measured parameters of the peristaltic wave. Esophageal peristalsis can be analyzed using a computer-aided method, providing rapid and objective measurement of the classical parameters and access to more in-depth analysis.
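
    The classical wave parameters, together with the two computer-only quantities (wave area and Max dP/dT), can be extracted from a digitized pressure trace roughly as follows; this is a sketch under the assumption of a single wave rising above a fixed baseline, not the PDP-11 program itself.

        import numpy as np

        def wave_parameters(pressure_mmHg, sample_rate_hz, baseline_mmHg=0.0):
            """Amplitude, duration, mean and maximum dP/dT, and area of one peristaltic wave."""
            t = np.arange(len(pressure_mmHg)) / sample_rate_hz
            above = pressure_mmHg > baseline_mmHg

            amplitude = pressure_mmHg.max() - baseline_mmHg
            duration = above.sum() / sample_rate_hz             # time spent above baseline

            dpdt = np.gradient(pressure_mmHg, t)
            onset, peak = np.argmax(above), np.argmax(pressure_mmHg)  # assumes the peak follows the onset
            mean_upslope = (pressure_mmHg[peak] - pressure_mmHg[onset]) / (t[peak] - t[onset])
            max_upslope = dpdt.max()                            # Max dP/dT

            area = np.trapz(np.clip(pressure_mmHg - baseline_mmHg, 0.0, None), t)
            return amplitude, duration, mean_upslope, max_upslope, area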

  7. Computer Vision and Computer Graphics Analysis of Paintings and Drawings: An Introduction to the Literature

    Science.gov (United States)

    Stork, David G.

    In the past few years, a number of scholars trained in computer vision, pattern recognition, image processing, computer graphics, and art history have developed rigorous computer methods for addressing an increasing number of problems in the history of art. In some cases, these computer methods are more accurate than even highly trained connoisseurs, art historians and artists. Computer graphics models of artists’ studios and subjects allow scholars to explore ‘‘what if’’ scenarios and determine artists’ studio praxis. Rigorous computer ray-tracing software sheds light on claims that some artists employed optical tools. Computer methods will not replace traditional art historical methods of connoisseurship but enhance and extend them. As such, for these computer methods to be useful to the art community, they must continue to be refined through application to a variety of significant art historical problems.

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been instrumental in site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  9. Computational Analysis of the G-III Laminar Flow Glove

    Science.gov (United States)

    Malik, Mujeeb R.; Liao, Wei; Lee-Rausch, Elizabeth M.; Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan

    2011-01-01

    Under NASA's Environmentally Responsible Aviation Project, flight experiments are planned with the primary objective of demonstrating the Discrete Roughness Elements (DRE) technology for passive laminar flow control at chord Reynolds numbers relevant to transport aircraft. In this paper, we present a preliminary computational assessment of the Gulfstream-III (G-III) aircraft wing-glove designed to attain natural laminar flow for the leading-edge sweep angle of 34.6deg. Analysis for a flight Mach number of 0.75 shows that it should be possible to achieve natural laminar flow for twice the transition Reynolds number ever achieved at this sweep angle. However, the wing-glove needs to be redesigned to effectively demonstrate passive laminar flow control using DREs. As a by-product of the computational assessment, effect of surface curvature on stationary crossflow disturbances is found to be strongly stabilizing for the current design, and it is suggested that convex surface curvature could be used as a control parameter for natural laminar flow design, provided transition occurs via stationary crossflow disturbances.

  10. Shell stability analysis in a computer aided engineering (CAE) environment

    Science.gov (United States)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde is described. The purpose of this project is to make the accumulated theoretical, numerical and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling sensitive structures. With this open ended, hierarchical, interactive computer code the user can access from his workstation successively programs of increasing complexity. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from behind the workstation with one of the current generation 2-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.

  11. A computer system for the analysis of integrated circuit reliability

    Science.gov (United States)

    Mauri, P.

    1989-12-01

    The formulation of a total reliability assessment of integrated circuits involves an increasing amount of knowledge and data, and hence requires increasing computerized assistance. To this end, an information system has been designed and implemented. Following engineering practice, the key features of the system are (1) the collection of different types of data, e.g. electrical parameter measurements and qualitative descriptions of the mode and mechanism of failure, and (2) the implementation of procedures drawn both from commonly applied methods, e.g. statistical ones, and from newer approaches, such as the formalization of cause-effect chains as studied for artificial intelligence applications. The system architecture has been designed so as to allow direct user maintenance, and hence a quick updating of reliability knowledge and information. Furthermore, its modularity eases the implementation of new procedures. Computer support improves the quality of data analysis and allows for the application of new methods and models.

  12. Data analysis using the Gnu R system for statistical computation

    Energy Technology Data Exchange (ETDEWEB)

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  13. Computational Fluid Dynamics Analysis of an Evaporative Cooling System

    Directory of Open Access Journals (Sweden)

    Kapilan N.

    2016-11-01

    Full Text Available The use of chlorofluorocarbon-based refrigerants in air-conditioning systems contributes to global warming and climate change. Climate change is expected to present a number of challenges for the built environment, and the evaporative cooling system is one of the simplest and most environmentally friendly cooling systems. The evaporative cooling system is widely used in summer, in both rural and urban areas of India, for human comfort. In an evaporative cooling system, the addition of water into air reduces the temperature of the air, as the energy needed to evaporate the water is taken from the air. Computational fluid dynamics (CFD), a numerical analysis technique, was used to analyse the evaporative cooling system, and the CFD results match the experimental results.
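
    Before (or alongside) a CFD analysis, the basic statement that adding water cools the air can be checked with a zero-dimensional estimate based on the saturation-efficiency relation; the pad effectiveness value below is an assumed typical figure, not a result of the study.

        def evaporative_outlet_temperature(t_dry_bulb_c, t_wet_bulb_c, saturation_efficiency=0.8):
            """Direct evaporative cooler outlet temperature from the saturation-efficiency relation."""
            # T_out = T_db - eta * (T_db - T_wb); eta = 0.8 is an assumed typical pad effectiveness.
            return t_dry_bulb_c - saturation_efficiency * (t_dry_bulb_c - t_wet_bulb_c)

        # Example: 40 degC dry-bulb air with a 24 degC wet-bulb temperature.
        print(evaporative_outlet_temperature(40.0, 24.0))   # about 27.2 degC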

  14. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of society, as we are faced with streams of data coming from numerous sensors, data feeds, and recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume aims to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  15. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations, such as simplified query languages, out-of-date information or arbitrary sorting of results. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web page content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of the Web page content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.

  16. Computer and Internet Addiction: Analysis and Classification of Approaches

    Directory of Open Access Journals (Sweden)

    Zaretskaya O.V.

    2017-08-01

    Full Text Available A theoretical analysis of modern research on the problem of computer and Internet addiction is carried out. The main features of the different approaches are outlined, and an attempt is made to systematize the research conducted and to classify the scientific approaches to the problem of Internet addiction. The author distinguishes nosological, cognitive-behavioral, socio-psychological and dialectical approaches. She justifies the need for an approach that corresponds to the essence, goals and tasks of social psychology when researching the problem of Internet addiction, and dependent behavior in general. In the opinion of the author, this dialectical approach integrates the experience of research within the framework of the socio-psychological approach and focuses on the observed inconsistencies in the phenomenon of Internet addiction – the compensatory nature of Internet activity, when people who turn to the Internet are in a dysfunctional life situation.

  17. Computer-aided photometric analysis of dynamic digital bioluminescent images

    Science.gov (United States)

    Gorski, Zbigniew; Bembnista, T.; Floryszak-Wieczorek, J.; Domanski, Marek; Slawinski, Janusz

    2003-04-01

    The paper deals with the photometric and morphologic analysis of bioluminescent images obtained by registering light radiated directly from plant objects. The registration of images from ultra-weak light sources by the single photon counting (SPC) technique is the subject of this work. The radiation is registered using a 16-bit charge coupled device (CCD) camera "Night Owl" together with WinLight EG&G Berthold software. Additional application-specific software has been developed to deal with objects that change during the exposure time. The advantages of the developed set of easily configurable tools, named FCT, for computer-aided photometric and morphologic analysis of numerous series of quantitatively imperfect chemiluminescent images are described. Instructions on how to use these tools are given and exemplified with several algorithms from the image-transformation library. Using the proposed FCT set, automatic photometric and morphologic analysis reveals the information hidden within series of chemiluminescent images reflecting defensive processes in poinsettia (Euphorbia pulcherrima Willd) leaves affected by the pathogenic fungus Botrytis cinerea.

  18. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whiteman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-11-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of 'twicing' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of the plasma traces has been processed with this technique
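
    Tukey's 'twicing' mentioned above is simple to sketch: smooth the signal, smooth the residuals, and add the two results; the moving-average smoother below is an assumption, since the report does not specify the smoother used.

        import numpy as np

        def moving_average(signal, window=5):
            kernel = np.ones(window) / window
            return np.convolve(signal, kernel, mode="same")

        def twice_smooth(signal, smoother=moving_average):
            """Tukey's 'twicing': smooth, then smooth the residuals and add them back."""
            first_pass = smoother(signal)
            residual = signal - first_pass
            return first_pass + smoother(residual)

        # Example: a noisy centroid trace with one spike, as might come from a streak record.
        trace = np.sin(np.linspace(0.0, 3.0, 200)) + 0.05 * np.random.randn(200)
        trace[100] += 1.0
        smoothed = twice_smooth(trace)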

  19. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whitman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-01-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of 'twicing' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of sixteen plasma traces has been processed using this technique

  20. Computer codes for the analysis of flask impact problems

    International Nuclear Information System (INIS)

    Neilson, A.J.

    1984-09-01

    This review identifies typical features of the design of transportation flasks and considers some of the analytical tools required for the analysis of impact events. Because of the complexity of the physical problem, it is unlikely that a single code will adequately deal with all the aspects of the impact incident. Candidate codes are identified on the basis of current understanding of their strengths and limitations. It is concluded that the HONDO-II, DYNA3D and ABAQUS codes, which are already mounted on UKAEA computers, will be suitable tools for use in the analysis of experiments conducted in the proposed AEEW programme and of general flask impact problems. Initial attention should be directed at the DYNA3D and ABAQUS codes, with HONDO-II being reserved for situations where the three-dimensional elements of DYNA3D may provide uneconomic simulations in planar or axisymmetric geometries. Attention is drawn to the importance of access to suitable mesh generators to create the nodal coordinate and element topology data required by these structural analysis codes. (author)

  1. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    2017-07-01

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool to help power engineers improve their work efficiency by fast information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.

  2. Summary of research in applied mathematics, numerical analysis, and computer sciences

    Science.gov (United States)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  3. Computational identification and analysis of single-nucleotide ...

    Indian Academy of Sciences (India)

    Department of Biotechnology and Bioinformatics, Jaypee University of Information Technology (JUIT), Waknaghat, Teh Kandaghat, Solan 173 234, India; School of Computer Science and Information Technology, Devi Ahilya Vishwavidyalaya (DAVV), Indore 452 013, India; Computational Biology Group, Abhyudaya ...

  4. COMPUTING

    CERN Document Server

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  5. A computer program for planimetric analysis of digitized images

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O; Homøe, P

    1992-01-01

    Planimetrical measurements are made to calculate the area of an entity. By digitizing the entity the planimetrical measurements may be done by computer. This computer program was developed in conjunction with a research project involving measurement of the pneumatized cell system of the temporal...... computer, a digitizer tablet and a printer....

  6. Reliability of Computer Analysis of Electrocardiograms (ECG) of ...

    African Journals Online (AJOL)

    Background: Computer programmes have been introduced to electrocardiography (ECG) with most physicians in Africa depending on computer interpretation of ECG. This study was undertaken to evaluate the reliability of computer interpretation of the 12-Lead ECG in the Black race. Methodology: Using the SCHILLER ...

  7. Computational Method for Global Sensitivity Analysis of Reactor Neutronic Parameters

    Directory of Open Access Journals (Sweden)

    Bolade A. Adetula

    2012-01-01

    Full Text Available The variance-based global sensitivity analysis technique is robust, has a wide range of applicability, and provides accurate sensitivity information for most models. However, it requires input variables to be statistically independent. A modification to this technique that allows one to deal with input variables that are blockwise correlated and normally distributed is presented. The focus of this study is the application of the modified global sensitivity analysis technique to calculations of reactor parameters that are dependent on groupwise neutron cross-sections. The main effort in this work is in establishing a method for a practical numerical calculation of the global sensitivity indices. The implementation of the method involves the calculation of multidimensional integrals, which can be prohibitively expensive to compute. Numerical techniques specifically suited to the evaluation of multidimensional integrals, namely, Monte Carlo and sparse grids methods, are used, and their efficiency is compared. The method is illustrated and tested on a two-group cross-section dependent problem. In all the cases considered, the results obtained with sparse grids achieved much better accuracy while using a significantly smaller number of samples. This aspect is addressed in a ministudy, and a preliminary explanation of the results obtained is given.
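
    The underlying variance-based first-order index for independent inputs can be estimated with a standard pick-and-freeze Monte Carlo scheme, as sketched below; the paper's extension to blockwise-correlated inputs and its sparse-grid quadrature are not reproduced here, and the test model is an arbitrary additive function.

        import numpy as np

        def first_order_sobol(model, n_inputs, n_samples=100_000, seed=0):
            """First-order Sobol indices for independent U(0,1) inputs (pick-and-freeze estimator)."""
            rng = np.random.default_rng(seed)
            A = rng.random((n_samples, n_inputs))
            B = rng.random((n_samples, n_inputs))
            yA, yB = model(A), model(B)
            var_y = np.var(np.concatenate([yA, yB]))

            indices = np.empty(n_inputs)
            for i in range(n_inputs):
                AB_i = A.copy()
                AB_i[:, i] = B[:, i]                 # freeze column i from the second matrix
                indices[i] = np.mean(yB * (model(AB_i) - yA)) / var_y
            return indices

        # Additive test model: the exact indices are proportional to the squared coefficients.
        model = lambda x: 4.0 * x[:, 0] + 2.0 * x[:, 1] + 1.0 * x[:, 2]
        print(first_order_sobol(model, 3))           # roughly [0.76, 0.19, 0.05]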

  8. Computer Assisted Data Analysis in the Dye Dilution Technique for Plasma Volume Measurement.

    Science.gov (United States)

    Bishop, Marvin; Robinson, Gerald D.

    1981-01-01

    Describes a method for undergraduate physiology students to measure plasma volume by the dye dilution technique, in which a computer is used to interpret data. Includes the computer program for the data analysis. (CS)
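
    A typical dye-dilution computation of the kind such a program performs is sketched below: the post-mixing concentrations are back-extrapolated log-linearly to the time of injection and the plasma volume is the dose divided by that extrapolated concentration; the dose and sample values are hypothetical.

        import numpy as np

        def plasma_volume_ml(dose_mg, sample_times_min, concentrations_mg_per_ml):
            """Dye-dilution plasma volume via log-linear back-extrapolation to injection time."""
            # Fit ln(C) = ln(C0) - k*t to the post-mixing samples (assumed mono-exponential decay).
            slope, intercept = np.polyfit(sample_times_min, np.log(concentrations_mg_per_ml), 1)
            c0 = np.exp(intercept)                 # extrapolated concentration at t = 0
            return dose_mg / c0

        # Hypothetical data: 25 mg of dye injected, plasma samples at 10, 20, 30 and 40 min.
        times = np.array([10.0, 20.0, 30.0, 40.0])
        conc = np.array([8.6e-3, 8.2e-3, 7.9e-3, 7.5e-3])    # mg/ml
        print(plasma_volume_ml(25.0, times, conc))           # roughly 2800 ml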

  9. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  10. RADTRAN 5 - A computer code for transportation risk analysis

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, F.L.

    1993-01-01

    The RADTRAN 5 computer code has been developed to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI standard FORTRAN 77; the code contains significant advances in the methodology first pioneered with the LINK option of RADTRAN 4. A major application of the LINK methodology is route-specific analysis. Another application is comparisons of attributes along the same route segments. Nonradiological risk factors have been incorporated to allow users to estimate nonradiological fatalities and injuries that might occur during the transportation event(s) being analyzed. These fatalities include prompt accidental fatalities from mechanical causes. Values of these risk factors for the United States have been made available in the code as optional defaults. Several new health effects models have been published in the wake of the Hiroshima-Nagasaki dosimetry reassessment, and this has emphasized the need for flexibility in the RADTRAN approach to health-effects calculations. Therefore, the basic set of health-effects conversion equations in RADTRAN has been made user-definable. All parameter values can be changed by the user, but a complete set of default values is available for both the new International Commission on Radiation Protection model (ICRP Publication 60) and the recent model of the U.S. National Research Council's Committee on the Biological Effects of Radiation (BEIR V). The meteorological input data tables have been modified to permit optional entry of maximum downwind distances for each dose isopleth. The expected dose to an individual in each isodose area is also calculated and printed automatically. Examples are given that illustrate the power and flexibility of the RADTRAN 5 computer code. (J.P.N.)

  11. Genome Assembly and Computational Analysis Pipelines for Bacterial Pathogens

    KAUST Repository

    Rangkuti, Farania Gama Ardhina

    2011-06-01

    Pathogens lie behind the deadliest pandemics in history. To date, the AIDS pandemic has resulted in more than 25 million fatal cases, while tuberculosis and malaria annually claim more than 2 million lives. Comparative genomic analyses are needed to gain insights into the molecular mechanisms of pathogens, but the abundance of biological data dictates that such studies cannot be performed without the assistance of computational approaches. This explains the significant need for computational pipelines for genome assembly and analyses. The aim of this research is to develop such pipelines. This work utilizes various bioinformatics approaches to analyze the high-throughput genomic sequence data that has been obtained from several strains of bacterial pathogens. A pipeline has been compiled for quality control for sequencing and assembly, and several protocols have been developed to detect contaminations. Visualizations of genomic data have been generated in various formats, in addition to alignment, homology detection and sequence variant detection. We have also implemented a metaheuristic algorithm that significantly improves bacterial genome assemblies compared to other known methods. Experiments on Mycobacterium tuberculosis H37Rv data showed that our method resulted in improvement of the N50 value of up to 9697% while consistently maintaining high accuracy, covering around 98% of the published reference genome. Other improvement efforts were also implemented, consisting of iterative local assemblies and iterative correction of contiguated bases. Our result expedites the genomic analysis of virulence genes up to single base pair resolution. It is also applicable to virtually every pathogenic microorganism, propelling further research in the control of and protection from pathogen-associated diseases.
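
    The N50 statistic quoted above is straightforward to compute from a list of contig lengths, as in the sketch below (the contig lengths shown are hypothetical).

        def n50(contig_lengths):
            """N50: the length L such that contigs of length >= L cover at least half the assembly."""
            lengths = sorted(contig_lengths, reverse=True)
            half_total = sum(lengths) / 2.0
            running = 0
            for length in lengths:
                running += length
                if running >= half_total:
                    return length

        # Hypothetical assembly of seven contigs (lengths in base pairs).
        print(n50([1_200_000, 800_000, 500_000, 300_000, 150_000, 90_000, 60_000]))   # 800000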

  12. Basic design of parallel computational program for probabilistic structural analysis

    International Nuclear Information System (INIS)

    Kaji, Yoshiyuki; Arai, Taketoshi; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of the 'development of a damage evaluation method for brittle structural materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we examine computational methods for a massively parallel computation system, coupled with a material strength theory based on microscopic fracture mechanics for latent cracks and a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews the probabilistic structural mechanics theory, the basic formulae and the parallel computation programming methods related to the principal items in the basic design of the computational mechanics program. (author)

  13. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Science.gov (United States)

    Bonham, Kevin S; Stefan, Melanie I

    2017-10-01

    While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.

  14. Trident: scalable compute archives: workflows, visualization, and analysis

    Science.gov (United States)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Kotulla, Ralf; Henschel, Robert; Harbeck, Daniel

    2016-08-01

    The Astronomy scientific community has embraced Big Data processing challenges, e.g. associated with time-domain astronomy, and come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era requires new and equally efficient tools: modern user interfaces for searching, identifying and viewing data online without direct access to the data; tracking of data provenance; searching, plotting and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal user overhead or expertise even for novice computing users. The Trident project at Indiana University offers a comprehensive web and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic about handling/visualizing their domain's data products and about executing their pipelines and application work flows. Trident's microservices architecture is made up of light-weight services connected by a REST API and/or a message bus; web interface elements are built using the NodeJS, AngularJS, and HighCharts JavaScript libraries, among others, while backend services are written in NodeJS, PHP/Zend, and Python. The software suite currently consists of (1) a simple work flow execution framework to integrate, deploy, and execute pipelines and applications, (2) a progress service to monitor work flows and sub

  15. Clinical diagnosis and computer analysis of headache symptoms.

    OpenAIRE

    Drummond, P D; Lance, J W

    1984-01-01

    The headache histories obtained from clinical interviews of 600 patients were analysed by computer to see whether patients could be separated systematically into clinical categories and to see whether sets of symptoms commonly reported together differed in distribution among the categories. The computer classification procedure assigned 537 patients to the same category as their clinical diagnosis, the majority of discrepancies between clinical and computer classifications involving common mi...

  16. Applied and computational harmonic analysis on graphs and networks

    Science.gov (United States)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
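
    The classical interpretation reviewed in the first part of the abstract - Laplacian eigenvalues as frequencies and eigenvectors as a graph Fourier basis - can be sketched in a few lines; this illustrates that interpretation on an unweighted path graph and not the authors' multiscale dictionaries.

        import numpy as np

        def graph_fourier_transform(weight_matrix, signal):
            """Expand a graph signal in the eigenvectors of the (unnormalized) graph Laplacian."""
            degrees = weight_matrix.sum(axis=1)
            laplacian = np.diag(degrees) - weight_matrix
            eigenvalues, eigenvectors = np.linalg.eigh(laplacian)   # 'frequencies' and basis vectors
            coefficients = eigenvectors.T @ signal                  # graph Fourier coefficients
            return eigenvalues, eigenvectors, coefficients

        # Example: an unweighted path graph on 5 vertices carrying a smooth signal.
        W = np.zeros((5, 5))
        for i in range(4):
            W[i, i + 1] = W[i + 1, i] = 1.0
        f = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
        eigvals, U, f_hat = graph_fourier_transform(W, f)
        print(eigvals)       # Laplacian eigenvalues, interpreted as graph frequencies
        print(U @ f_hat)     # inverse transform recovers the original signal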

  17. Comparison of two three-dimensional cephalometric analysis computer software.

    Science.gov (United States)

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analyses are attracting growing interest in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT(®) imaging system from patients' regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Data from before and after orthodontic treatment were analyzed using a t-test. Reliability testing using the intraclass correlation coefficient showed stronger reliability for InVivoDental5.0 (0.83-0.98) than for 3DCeph™ (0.51-0.90). A paired t-test comparison showed no statistically significant difference between the measurements made with the two packages. InVivoDental5.0 measurements are more reproducible, and the software is more user-friendly, than 3DCeph™. There is no statistical difference between the two packages in linear or angular measurements, but 3DCeph™ is more time-consuming in performing three-dimensional analysis than InVivoDental5.0.

  18. Automatic analysis of gamma spectra using a desk computer

    International Nuclear Information System (INIS)

    Rocca, H.C.

    1976-10-01

    A code for the analysis of gamma spectra obtained with a Ge(Li) detector was developed for use with a desk computer (Hewlett-Packard Model 9810 A). The process is performed in a totally automatic way: data are conveniently smoothed and the background is generated by a convolutive equation. A calibration of the equipment with well-known standard sources provides the data needed to fit, by least squares, a third-degree equation relating energy to peak position. Criteria are given for determining whether a given group of values constitutes a peak and whether it is a double line. All peaks are fitted to a Gaussian curve and, if necessary, decomposed into their components. Data entry is by punched tape, ASCII code. An alphanumeric printer provides (a) the position of the peak and its energy, (b) its resolution if it is larger than expected, and (c) the area of the peak with its statistical error determined by the method of Wasson. As an option, the complete spectrum with the determined background can be plotted. (author)
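
    Two of the processing steps described above, the third-degree least-squares energy calibration and the Gaussian fit of a single peak, can be sketched as follows (illustrative channel/energy values and a synthetic peak, not the original HP 9810 A code).

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical calibration points: peak positions (channels) of known standard
        # sources and their reference energies in keV (illustrative values only).
        channels = np.array([121.0, 344.5, 778.9, 964.1, 1408.0])
        energies = np.array([121.8, 344.3, 778.9, 964.0, 1408.0])

        # Third-degree energy calibration fitted by least squares.
        calib = np.polyfit(channels, energies, deg=3)
        print("E(500) =", round(np.polyval(calib, 500.0), 2), "keV")

        # Gaussian fit of a single peak sitting on a locally flat background.
        def gaussian(x, area, centroid, sigma, bkg):
            return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-(x - centroid) ** 2 / (2 * sigma ** 2)) + bkg

        x = np.arange(480.0, 521.0)
        counts = gaussian(x, area=5000.0, centroid=500.0, sigma=2.5, bkg=40.0)
        counts = np.random.default_rng(0).poisson(counts).astype(float)   # counting noise

        popt, _ = curve_fit(gaussian, x, counts, p0=(4000.0, 499.0, 2.0, 30.0))
        print("fitted area, centroid, sigma, background:", np.round(popt, 2))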

  19. Recent Developments in Complex Analysis and Computer Algebra

    CERN Document Server

    Kajiwara, Joji; Xu, Yongzhi

    1999-01-01

    This volume consists of papers presented in the special sessions on "Complex and Numerical Analysis", "Value Distribution Theory and Complex Domains", and "Use of Symbolic Computation in Mathematics Education" of the ISAAC'97 Congress held at the University of Delaware, during June 2-7, 1997. The ISAAC Congress coincided with a U.S.-Japan Seminar also held at the University of Delaware. The latter was supported by the National Science Foundation through Grant INT-9603029 and the Japan Society for the Promotion of Science through Grant MTCS-134. It was natural that the participants of both meetings should interact and consequently several persons attending the Congress also presented papers in the Seminar. The success of the ISAAC Congress and the U.S.-Japan Seminar has led to the ISAAC'99 Congress being held in Fukuoka, Japan during August 1999. Many of the same participants will return to this Seminar. Indeed, it appears that the spirit of the U.S.-Japan Seminar will be continued every second year as part of...

  20. COMPUTATIONAL ANALYSIS OF BACKWARD-FACING STEP FLOW

    Directory of Open Access Journals (Sweden)

    Erhan PULAT

    2001-01-01

    Full Text Available In this study, backward-facing step flows, which are encountered in electronic systems cooling, heat exchanger design, and gas turbine cooling, are investigated computationally. Steady, incompressible, two-dimensional air flow is analyzed. The inlet velocity is assumed uniform and is obtained from a parabolic profile by using the maximum velocity. The analysis investigates the effects of channel expansion ratio and Reynolds number on reattachment length. In addition, the pressure distribution along the channel is obtained, and the flow is analyzed for Reynolds numbers of 50 and 150 and channel expansion ratios of 1.5 and 2. The governing equations are solved with the Galerkin finite element method of the ANSYS-FLOTRAN code. The results are compared with solutions of the lattice BGK method, a relatively new method in fluid dynamics, and with other numerical and experimental results. It is concluded that the reattachment length increases with increasing Reynolds number and, at the same Reynolds number, decreases with increasing channel expansion ratio.

  1. A computer language for reducing activation analysis data

    International Nuclear Information System (INIS)

    Friedman, M.H.; Tanner, J.T.

    1978-01-01

    A program, written in FORTRAN, which defines a language for reducing activation analysis data is described. An attempt was made to optimize the choice of commands and their definitions so as to concisely express what should be done, make the language natural to use and easy to learn, arrange a system of checks to guard against communication errors, and make the language inclusive. Communications are effected through commands, and these can be given in almost any order. Consistency checks are performed and diagnostic messages are printed automatically to guard against the incorrect use of commands. Default options on the commands allow instructions to be expressed concisely while providing the capability to specify details of the data reduction process. The program has been implemented on a UNIVAC 1108 computer. A complete description of the commands, the algorithms used, and the internal consistency checks is given elsewhere. The applications of the program and the methods for obtaining data automatically have already been described. (T.G.)

  2. Computational analysis on plug-in hybrid electric motorcycle chassis

    Science.gov (United States)

    Teoh, S. J.; Bakar, R. A.; Gan, L. M.

    2013-12-01

    The plug-in hybrid electric motorcycle (PHEM) is an alternative that promotes sustainability through lower emissions. However, the overall PHEM system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis applying the concept of a Chopper is analysed for application in a PHEM. The three-dimensional (3D) chassis model is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. In addition, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement and factor of safety (FOS). The data are used to identify the critical points, thus indicating whether the chassis design is acceptable or needs to be redesigned/modified to meet the required strength. Critical points are the locations of highest stress, which might cause the chassis to fail; for a motorcycle chassis they occur at the joints at the triple tree and the rear absorber bracket. In conclusion, the computational analysis predicts the stress distribution and provides a guideline for developing a safe prototype chassis.

  3. Computational Modelling and Movement Analysis of Hip Joint with Muscles

    Science.gov (United States)

    Siswanto, W. A.; Yoon, C. C.; Salleh, S. Md.; Ngali, M. Z.; Yusup, Eliza M.

    2017-01-01

    In this study, the hip joint and its main muscles are modelled with finite elements. The parts included in the model are the hip joint, hemipelvis, gluteus maximus, quadratus femoris and gemellus inferior. The materials used in the model are isotropic elastic, Mooney-Rivlin and neo-Hookean. The resultant hip forces of normal gait and stair climbing are applied to the hip joint model, and the displacement, stress and strain responses of the muscles are recorded. The FEBio non-linear solver for biomechanics is employed to run the simulation of the hip joint model with muscles. The contact interfaces used in the model are sliding contact and tied contact. The analysis shows that the gluteus maximus has the maximum displacement, stress and strain in stair climbing, while the quadratus femoris and gemellus inferior have the maximum displacement and strain in normal gait but the maximum stress in stair climbing. In addition, the computational model of the hip joint with muscles is produced as a research and investigation platform and can be used as a visualization platform for the hip joint.

  4. A fast reactor transient analysis methodology for personal computers

    International Nuclear Information System (INIS)

    Ott, K.O.

    1993-01-01

    A simplified model for liquid-metal-cooled reactor (LMR) transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All 30 differential balance equations are solved implicitly in terms of convolution integrals. The prompt jump approximation is applied, as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes a new form, the quadratic dynamics equation. In this integral formulation, the initial value problem of typical LMR transients can be solved with large time steps (initially 1 s, later up to 256 s). This makes transient problems amenable to treatment on a personal computer. The resulting mathematical model forms the basis for the GW-BASIC LMR transient calculation (LTC) program. The LTC program has also been converted to QuickBASIC. The running time for a 10-h overpower transient is then ∼40 to 10 s, depending on the hardware version (286, 386, or 486 with math coprocessors)
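
    The prompt jump approximation underlying this formulation can be illustrated with a minimal sketch (made-up reactor parameters and reactivity history, and a simple semi-implicit precursor update rather than the paper's convolution-integral scheme): as long as the net reactivity stays well below prompt critical, the power follows algebraically from the delayed-neutron precursors.

        import numpy as np

        # Illustrative delayed-neutron data and generation time (not the LTC values).
        beta_i = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])   # group fractions
        lam_i = np.array([0.0125, 0.0308, 0.114, 0.307, 1.19, 3.19])          # decay constants (1/s)
        beta, Lambda = beta_i.sum(), 4.0e-7                                   # total fraction, generation time (s)

        def rho(t):
            # Hypothetical net reactivity: a small insertion decaying under feedback.
            return 0.3 * beta * np.exp(-t / 50.0)

        n = 1.0                                 # initial relative power
        C = beta_i * n / (Lambda * lam_i)       # equilibrium precursor concentrations

        dt, t_end = 1.0, 600.0
        for step in range(int(t_end / dt)):
            t = (step + 1) * dt
            n = Lambda * np.dot(lam_i, C) / (beta - rho(t))           # prompt-jump power
            C = (C + dt * beta_i * n / Lambda) / (1.0 + lam_i * dt)   # implicit precursor update
            if step % 150 == 149:
                print(f"t = {t:5.0f} s   relative power = {n:.3f}")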

  5. Computational fluid dynamic analysis for independent floating water treatment device

    Science.gov (United States)

    Zawawi, M. H.; Swee, M. G.; Zainal, N. S.; Zahari, N. M.; Kamarudin, M. A.; Ramli, M. Z.

    2017-09-01

    This project is to design and develop a 3D Independent Floating Water Treatment Device using 3D CAD software. The device is designed to treat water for better water quality and to improve the water flow of lakes. A prototype was manufactured to study the water treatment efficiency of the device. Computational fluid dynamics (CFD) analysis was used to assess the efficiency of the Independent Floating Water Treatment Device by simulating and modelling the water flow, pressure and velocity. According to the results, the maximum velocity magnitude was around 1 m/s. The velocity contour showed that the device has high velocity at the pipe outlet, and the velocity decreases with distance from the pipe outlet. The velocity measurement gave 1.05 m/s. The pressure magnitude was between 1426 Pa and 1429 Pa. The laboratory results based on water parameters proved that the water movement and flow direction of the Independent Floating Water Treatment Device enable efficient pollutant removal. The vector plot, velocity contour, water flow path lines, water flow streamlines and pressure contour were successfully modelled.

  6. Computer-assisted qualitative analysis (Análisis cualitativo asistido por computadora)

    Directory of Open Access Journals (Sweden)

    César A. Cisneros Puebla

    2003-01-01

    Full Text Available The aims of this essay are, on the one hand, to present an approximation to the Hispano-American experience with Computer-Assisted Qualitative Data Analysis (CAQDAS) by grouping, as a systematization exercise, the works carried out by several colleagues from related disciplines; although it attempts to be exhaustive and thorough, this exercise, like any attempt at systematizing experiences, has notable absences and omissions. On the other hand, the central aim of this first approximation is to introduce some theoretical reflections on the role played by CAQDAS in the development of qualitative research, building on that systematization and with particular emphasis on data generation.

  7. Methods and computer codes for probabilistic sensitivity and uncertainty analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1985-01-01

    This paper describes the methods and applications experience with two computer codes that are now available from the National Energy Software Center at Argonne National Laboratory. The purpose of the SCREEN code is to identify a group of most important input variables of a code that has many (tens, hundreds) input variables with uncertainties, and do this without relying on judgment or exhaustive sensitivity studies. Purpose of the PROSA-2 code is to propagate uncertainties and calculate the distributions of interesting output variable(s) of a safety analysis code using response surface techniques, based on the same runs used for screening. Several applications are discussed, but the codes are generic, not tailored to any specific safety application code. They are compatible in terms of input/output requirements but also independent of each other, e.g., PROSA-2 can be used without first using SCREEN if a set of important input variables has first been selected by other methods. Also, although SCREEN can select cases to be run (by random sampling), a user can select cases by other methods if he so prefers, and still use the rest of SCREEN for identifying important input variables
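
    A toy version of the screening idea (not the SCREEN code itself) is sketched below: sample the uncertain inputs at random, evaluate a surrogate "safety code", and rank the inputs by the strength of their rank correlation with the output. The model function and all numbers are hypothetical.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n_runs, n_inputs = 200, 10
        X = rng.uniform(0.0, 1.0, size=(n_runs, n_inputs))   # sampled uncertain inputs

        def surrogate_code(x):
            # Hypothetical response: only inputs 0, 3 and 7 actually matter.
            return 4.0 * x[0] + 2.0 * x[3] ** 2 + np.sin(3.0 * x[7]) + 0.05 * rng.normal()

        y = np.array([surrogate_code(x) for x in X])

        # Rank inputs by the absolute Spearman rank correlation with the output.
        importance = []
        for j in range(n_inputs):
            r, _ = stats.spearmanr(X[:, j], y)
            importance.append(abs(r))
        print("inputs ranked by importance:", np.argsort(importance)[::-1])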

  8. Computer-aided pulmonary image analysis in small animal models

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J. [Center for Infectious Disease Imaging (CIDI), Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Bagci, Ulas, E-mail: ulasbagci@gmail.com [Center for Research in Computer Vision (CRCV), University of Central Florida (UCF), Orlando, Florida 32816 (United States); Kramer-Marek, Gabriela [The Institute of Cancer Research, London SW7 3RP (United Kingdom); Luna, Brian [Microfluidic Laboratory Automation, University of California-Irvine, Irvine, California 92697-2715 (United States); Kubler, Andre [Department of Medicine, Imperial College London, London SW7 2AZ (United Kingdom); Dey, Bappaditya; Jain, Sanjay [Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Foster, Brent [Department of Biomedical Engineering, University of California-Davis, Davis, California 95817 (United States); Papadakis, Georgios Z. [Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Camp, Jeremy V. [Department of Microbiology and Immunology, University of Louisville, Louisville, Kentucky 40202 (United States); Jonsson, Colleen B. [National Institute for Mathematical and Biological Synthesis, University of Tennessee, Knoxville, Tennessee 37996 (United States); Bishai, William R. [Howard Hughes Medical Institute, Chevy Chase, Maryland 20815 and Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Udupa, Jayaram K. [Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

    2015-07-15

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.
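
    The expected-lung-volume check described above can be sketched in a few lines (illustrative volumes and an assumed 25% deficit threshold, not the authors' data or code): a regression between approximated rib cage volume and total lung capacity predicts the expected volume, and a large shortfall of the initial segmentation triggers the abnormal-pattern detector.

        import numpy as np

        # Healthy-animal reference data (illustrative volumes in cm^3).
        rib_cage_vol = np.array([38.0, 42.5, 47.1, 51.0, 55.8, 60.2])
        lung_capacity = np.array([11.2, 12.9, 14.3, 15.6, 17.0, 18.4])

        # Regression between approximated rib cage volume and total lung capacity.
        slope, intercept = np.polyfit(rib_cage_vol, lung_capacity, deg=1)

        # New scan: rib cage volume and the volume of the initial lung segmentation.
        new_rib_vol, segmented_vol = 50.0, 9.1
        expected = slope * new_rib_vol + intercept
        deficit = (expected - segmented_vol) / expected
        print(f"expected {expected:.1f} cm^3, segmented {segmented_vol:.1f} cm^3, deficit {deficit:.0%}")
        if deficit > 0.25:   # hypothetical threshold
            print("large deficit -> invoke abnormal-pattern detection")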

  9. MMA, A Computer Code for Multi-Model Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.
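
    For readers unfamiliar with the default criteria, the sketch below shows how AIC, AICc and BIC can be computed for least-squares-calibrated models and turned into approximate posterior model probabilities; the sums of squared residuals and parameter counts are hypothetical, and these are only the textbook formulas, not the MMA code.

        import numpy as np

        def criteria(ssr, n_obs, n_params):
            """AIC, AICc and BIC for a least-squares model with Gaussian errors."""
            k = n_params + 1                                   # +1 for the error variance
            log_like = -0.5 * n_obs * (np.log(2.0 * np.pi * ssr / n_obs) + 1.0)
            aic = -2.0 * log_like + 2.0 * k
            aicc = aic + 2.0 * k * (k + 1) / (n_obs - k - 1)
            bic = -2.0 * log_like + k * np.log(n_obs)
            return aic, aicc, bic

        # Hypothetical calibration results: (sum of squared weighted residuals, parameters).
        models = {"model_A": (12.4, 3), "model_B": (10.9, 5), "model_C": (10.7, 8)}
        n_obs = 40

        aicc = {name: criteria(ssr, n_obs, p)[1] for name, (ssr, p) in models.items()}
        weights = {name: np.exp(-0.5 * (a - min(aicc.values()))) for name, a in aicc.items()}
        total = sum(weights.values())
        for name in models:
            print(f"{name}: AICc = {aicc[name]:.2f}, posterior probability ~ {weights[name] / total:.2f}")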

  10. Reliability analysis framework for computer-assisted medical decision systems

    International Nuclear Information System (INIS)

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-01-01

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
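
    The query-specific reliability idea can be sketched as follows (synthetic features, labels and CAD decisions; not the authors' implementation): the reliability of a CAD answer is estimated as the fraction of the k known cases nearest to the query in feature space on which the CAD system was correct.

        import numpy as np

        def local_reliability(query, features, labels, cad_decisions, k=15):
            """Fraction of the k nearest known cases on which the CAD system was correct."""
            distances = np.linalg.norm(features - query, axis=1)
            nearest = np.argsort(distances)[:k]
            return float(np.mean(cad_decisions[nearest] == labels[nearest]))

        # Hypothetical database: 8 morphological features per ROI, ground truth labels,
        # and the CAD system's stored decisions on those same ROIs (~85% accurate).
        rng = np.random.default_rng(0)
        features = rng.normal(size=(1337, 8))
        labels = rng.integers(0, 2, size=1337)
        cad_decisions = np.where(rng.random(1337) < 0.85, labels, 1 - labels)

        query_case = rng.normal(size=8)
        print("estimated reliability:", local_reliability(query_case, features, labels, cad_decisions))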

  11. Thermal maturity patterns in the Ordovician and Devonian of Pennsylvania using conodont color alteration index (CAI) and vitrinite reflectance (%Ro)

    Science.gov (United States)

    Repetski, J.E.; Ryder, R.T.; Harper, J.A.; Trippi, M.H.

    2006-01-01

    This new series of maps enhances previous thermal maturity maps in Pennsylvania by establishing: 1) new subsurface CAI data points for the Ordovician and Devonian and 2) new %Ro and Rock Eval subsurface data points for Middle and Upper Devonian black shale units. Thermal maturity values for the Ordovician and Devonian strata are of major interest because they contain the source rocks for most of the oil and natural gas resources in the basin. Thermal maturity patterns of the Middle Ordovician Trenton Group are evaluated here because they closely approximate those of the overlying Ordovician Utica Shale that is believed to be the source rock for the regional oil and gas accumulation in Lower Silurian sandstones and for natural gas fields in fractured dolomite reservoirs of the Ordovician Black River-Trenton Limestones. Improved CAI-based thermal maturity maps of the Ordovician are important to identify areas of optimum gas generation from the Utica Shale and to provide constraints for interpreting the origin of oil and gas in the Lower Silurian regional accumulation and Ordovician Black River-Trenton fields. Thermal maturity maps of the Devonian will better constrain burial history-petroleum generation models of the Utica Shale, as well as place limitations on the origin of regional oil and gas accumulations in Upper Devonian sandstone and Middle to Upper Devonian black shale.

  12. Thermal maturity patterns (CAI and %Ro) in the Ordovician and Devonian rocks of the Appalachian basin in New York State

    Science.gov (United States)

    Weary, David J.; Ryder, Robert T.; Nyahay, Richard

    2000-01-01

    The objective of this study is to enhance existing thermal maturity maps in New York State by establishing: 1) new subsurface CAI data points for the Ordovician and Devonian and 2) new %Ro and Rock Eval subsurface data points for Middle and Upper Devonian black shale units. The thermal maturity of the Ordovician and Devonian rocks is of major interest because they contain the source for most of the unconventional natural gas resources in the basin. Thermal maturity patterns of the Middle Ordovician Trenton Group are evaluated here because they closely approximate those of the overlying Ordovician Utica Shale that is believed to be the source rock for the regional oil and gas accumulation in Lower Silurian sandstones (Jenden and others, 1993; Ryder and others, 1998). Improved CAI-based thermal maturity maps of the Ordovician are important to identify areas of optimum gas generation from the Utica Shale and to provide constraints for interpreting the origin of oil and gas in the Lower Silurian regional accumulation, in particular, its basin-centered part (Ryder, 1998). Thermal maturity maps of the Devonian will better constrain burial history-petroleum generation models of the Utica Shale, as well as place limitations on the origin of regional oil and gas accumulation in Upper Devonian sandstone and Middle to Upper Devonian black shale.

  13. Thermal maturity patterns (CAI and %Ro) in the Ordovician and Devonian rocks of the Appalachian basin in Pennsylvania

    Science.gov (United States)

    Repetski, John E.; Ryder, Robert T.; Harper, John A.; Trippi, Michael H.

    2002-01-01

    The objective of this study is to enhance existing thermal maturity maps in Pennsylvania by establishing: 1) new subsurface CAI data points for the Ordovician and Devonian and 2) new %Ro and Rock Eval subsurface data points for Middle and Upper Devonian black shale units. Thermal maturity values for the Ordovician and Devonian strata are of major interest because they contain the source rocks for most of the oil and natural gas resources in the basin. Thermal maturity patterns of the Middle Ordovician Trenton Group are evaluated here because they closely approximate those of the overlying Ordovician Utica Shale that is believed to be the source rock for the regional oil and gas accumulation in Lower Silurian sandstones (Ryder and others, 1998) and for natural gas fields in fractured dolomite reservoirs of the Ordovician Black River-Trenton Limestones. Improved CAI-based thermal maturity maps of the Ordovician are important to identify areas of optimum gas generation from the Utica Shale and to provide constraints for interpreting the origin of oil and gas in the Lower Silurian regional accumulation and Ordovician Black River-Trenton fields. Thermal maturity maps of the Devonian will better constrain burial history-petroleum generation models of the Utica Shale, as well as place limitations on the origin of regional oil and gas accumulations in Upper Devonian sandstone and Middle to Upper Devonian black shale.

  14. Thermal maturity patterns (CAI and %Ro) in the Ordovician and Devonian rocks of the Appalachian basin in West Virginia

    Science.gov (United States)

    Repetski, John E.; Ryder, Robert T.; Avary, Katharine Lee; Trippi, Michael H.

    2005-01-01

    The objective of this study is to enhance existing thermal maturity maps in West Virginia by establishing: 1) new subsurface CAI data points for the Ordovician and Devonian and 2) new %Ro and Rock Eval subsurface data points for Middle and Upper Devonian black shale units. Thermal maturity values for the Ordovician and Devonian strata are of major interest because they contain the source rocks for most of the oil and natural gas resources in the basin. Thermal maturity patterns of the Middle Ordovician Trenton Limestone are evaluated here because they closely approximate those of the overlying Ordovician Utica Shale that is believed to be the source rock for the regional oil and gas accumulation in Lower Silurian sandstones (Ryder and others, 1998) and for natural gas fields in fractured dolomite reservoirs of the Ordovician Black River-Trenton Limestones. Improved CAI-based thermal maturity maps of the Ordovician are important to identify areas of optimum gas generation from the Utica Shale and to provide constraints for interpreting the origin of oil and gas in the Lower Silurian regional accumulation and Ordovician Black River-Trenton fields. Thermal maturity maps of the Devonian will better constrain burial history-petroleum generation models of the Utica Shale, as well as place limitations on the origin of regional oil and gas accumulations in Upper Devonian sandstone and Middle to Upper Devonian black shale.

  15. The magazine Cais between protagonism and assistentialism: a critical discourse analysis

    Directory of Open Access Journals (Sweden)

    Viviane de Melo Resende

    2012-10-01

    Full Text Available As part of the results of an integrated project whose scope is to investigate, through discourse analyses, the practices involved in the production and distribution of five Portuguese-language publications focused on homelessness, this article examines, on the basis of Critical Discourse Analysis, the magazine Cais, published in Lisbon. Configured as a street paper, the magazine is sold on the street by people who are homeless or at risk, to whom 70% of the sale price of each copy reverts. More than a means of communication and dissemination of social problems, this type of press is believed to allow the configuration of different positions and relations, and may thereby change the experience of exclusion. In this article, taking excerpts from an interview with its editor as data, I explore to what extent people experiencing homelessness participate in the production of the magazine Cais and in the representation of that same situation.

  16. Computer-Assisted Instruction in Reading and Language Arts.

    Science.gov (United States)

    Caster, Tonja Root

    A review was conducted of 16 research studies evaluating the effectiveness of computer assisted instruction (CAI) in teaching reading and language arts in the elementary school. The studies were of what K. A. Hall has termed "interactive instruction," which includes drill and practice as well as tutoring. Of the studies reviewed, 13 used at least…

  17. Optimizing Computer Assisted Instruction By Applying Principles of Learning Theory.

    Science.gov (United States)

    Edwards, Thomas O.

    The development of learning theory and its application to computer-assisted instruction (CAI) are described. Among the early theoretical constructs thought to be important are E. L. Thorndike's concept of connectionism, Neal Miller's theory of motivation, and B. F. Skinner's theory of operant conditioning. Early devices incorporating those…

  18. The Effectiveness of a Computer-Assisted Math Learning Program

    Science.gov (United States)

    De Witte, K.; Haelermans, C.; Rogge, N.

    2015-01-01

    Computer-assisted instruction (CAI) programs are considered as a way to improve learning outcomes of students. However, little is known on the schools who implement such programs as well as on the effectiveness of similar information and communication technology programs. We provide a literature review that pays special attention to the existing…

  19. Computers and Instructional Design: Component Display Theory in Transition.

    Science.gov (United States)

    Wilson, Brent G.

    Component display theory (CDT) is used as a working example in this examination of the relationship between instructional design theory and computer assisted instruction (CAI) models. Two basic approaches to instructional design--the analytic and the holistic methods--are reviewed, and four elements of CDT are described: (1) content types,…

  20. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache – CEA, France)

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  1. Computational modeling and analysis of the hydrodynamics of human swimming

    Science.gov (United States)

    von Loebbecke, Alfred

    Computational modeling and simulations are used to investigate the hydrodynamics of competitive human swimming. The simulations employ an immersed boundary (IB) solver that allows us to simulate viscous, incompressible, unsteady flow past complex, moving/deforming three-dimensional bodies on stationary Cartesian grids. This study focuses on the hydrodynamics of the "dolphin kick". Three female and two male Olympic level swimmers are used to develop kinematically accurate models of this stroke for the simulations. A simulation of a dolphin undergoing its natural swimming motion is also presented for comparison. CFD enables the calculation of flow variables throughout the domain and over the swimmer's body surface during the entire kick cycle. The feet are responsible for all thrust generation in the dolphin kick. Moreover, it is found that the down-kick (ventral position) produces more thrust than the up-kick. A quantity of interest to the swimming community is the drag of a swimmer in motion (active drag). Accurate estimates of this quantity have been difficult to obtain in experiments but are easily calculated with CFD simulations. Propulsive efficiencies of the human swimmers are found to be in the range of 11% to 30%. The dolphin simulation case has a much higher efficiency of 55%. Investigation of vortex structures in the wake indicate that the down-kick can produce a vortex ring with a jet of accelerated fluid flowing through its center. This vortex ring and the accompanying jet are the primary thrust generating mechanisms in the human dolphin kick. In an attempt to understand the propulsive mechanisms of surface strokes, we have also conducted a computational analysis of two different styles of arm-pulls in the backstroke and the front crawl. These simulations involve only the arm and no air-water interface is included. Two of the four strokes are specifically designed to take advantage of lift-based propulsion by undergoing lateral motions of the hand
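
    As a small illustration of how a propulsive (Froude-type) efficiency can be estimated from CFD output (synthetic thrust and power time series, not the study's data), the sketch below takes the ratio of cycle-averaged useful thrust power to cycle-averaged total mechanical power.

        import numpy as np

        t = np.linspace(0.0, 1.0, 200)                                  # one kick cycle (s)
        swim_speed = 1.8                                                # assumed mean forward speed (m/s)
        thrust = 80.0 * np.clip(np.sin(2.0 * np.pi * t), 0.0, None)    # thrust mostly on the down-kick (N)
        total_power = 250.0 + 150.0 * np.sin(2.0 * np.pi * t) ** 2     # total mechanical power (W)

        useful = np.mean(thrust * swim_speed)     # cycle-averaged thrust power (W)
        expended = np.mean(total_power)           # cycle-averaged total power (W)
        print(f"propulsive efficiency ~ {useful / expended:.2f}")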

  2. Integrating aerodynamic surface modeling for computational fluid dynamics with computer aided structural analysis, design, and manufacturing

    Science.gov (United States)

    Thorp, Scott A.

    1992-01-01

    This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.

  3. A Design of Computer Aided Instructions (CAI) for Undirected Graphs in the Discrete Math Tutorial (DMT). Part 1.

    Science.gov (United States)

    1990-06-01

    void); static void cont2(void); static void cont3(void); static void cont4(void); static void confirm_graph_exit(void); static void normal_exit... /* attach [PageDown] to the cont3 function */ setonkey(0x100, cont3, 0); LINE_WIDTH = 3; cls(1); drawRect(-309, 210, 310, -200, 11); drawLine(-100, -80...34, 28); write_horz_str(-180, -98, "(-)", 28); cont3(); static void cont3(void) /* attach [PageUp] to the cont2 function */ setonkey(0x4900, cont2, 0

  4. Computer Models for IRIS Control System Transient Analysis

    International Nuclear Information System (INIS)

    Gary D Storrick; Bojan Petrovic; Luca Oriani

    2007-01-01

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled 'Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor' focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design--such as the lack of a detailed secondary system or I and C system designs--makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I and C development process. Section

  5. Computational Analysis of Dual Radius Circulation Control Airfoils

    Science.gov (United States)

    Lee-Rausch, E. M.; Vatsa, V. N.; Rumsey, C. L.

    2006-01-01

    The goal of the work is to use multiple codes and multiple configurations to provide an assessment of the capability of RANS solvers to predict circulation control dual radius airfoil performance and also to identify key issues associated with the computational predictions of these configurations that can result in discrepancies in the predicted solutions. Solutions were obtained for the Georgia Tech Research Institute (GTRI) dual radius circulation control airfoil and the General Aviation Circulation Control (GACC) dual radius airfoil. For the GTRI-DR airfoil, two-dimensional structured and unstructured grid computations predicted the experimental trend in sectional lift variation with blowing coefficient very well. Good code to code comparisons between the chordwise surface pressure coefficients and the solution streamtraces also indicated that the detailed flow characteristics were matched between the computations. For the GACC-DR airfoil, two-dimensional structured and unstructured grid computations predicted the sectional lift and chordwise pressure distributions accurately at the no blowing condition. However at a moderate blowing coefficient, although the code to code variation was small, the differences between the computations and experiment were significant. Computations were made to investigate the sensitivity of the sectional lift and pressure distributions to some of the experimental and computational parameters, but none of these could entirely account for the differences in the experimental and computational results. Thus, CFD may indeed be adequate as a prediction tool for dual radius CC flows, but limited and difficult to obtain two-dimensional experimental data prevents a confident assessment at this time.

  6. Computational identification and analysis of single-nucleotide ...

    Indian Academy of Sciences (India)

    School of Computer Science and Information Technology, Devi Ahilya Vishwavidyalaya (DAVV), Indore 452 013, India; Computational Biology ... and breeding, as genes of scientific and agronomic importance can be isolated solely on ... indels (insertion/deletion) has led to a revolution in their use as molecular markers ...

  7. From handwriting analysis to pen-computer applications

    NARCIS (Netherlands)

    Schomaker, L

    1998-01-01

    In this paper, pen computing, i.e. the use of computers and applications in which the pen is the main input device, will be described from four different viewpoints. Firstly a brief overview of the hardware developments in pen systems is given, leading to the conclusion that the technological

  8. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  9. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, very large, discrete, and semi-structured or unstructured data have gone far beyond the scope of what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical way to analyze massive data, which can effectively solve the problem that traditional data mining methods cannot adapt to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology for data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
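
    The core counting step of association rule mining maps naturally onto the MapReduce pattern. The toy sketch below (in-process Python with hypothetical transactions; a real deployment would distribute the map and reduce phases across a cluster) emits item pairs in a map phase and aggregates their support counts in a reduce phase.

        from collections import Counter
        from itertools import combinations

        # Hypothetical shopping-basket transactions.
        transactions = [
            {"milk", "bread", "butter"},
            {"bread", "butter"},
            {"milk", "bread"},
            {"milk", "butter"},
            {"milk", "bread", "butter"},
        ]

        def map_phase(transaction):
            # Emit (item_pair, 1) for every item pair in one transaction.
            return [(pair, 1) for pair in combinations(sorted(transaction), 2)]

        def reduce_phase(mapped):
            counts = Counter()
            for pairs in mapped:
                for pair, one in pairs:
                    counts[pair] += one
            return counts

        pair_counts = reduce_phase(map(map_phase, transactions))
        min_support = 3
        print({pair: c for pair, c in pair_counts.items() if c >= min_support})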

  10. Analysis of high-tech methods of illegal remote computer data access

    OpenAIRE

    Polyakov, V. V.; Slobodyan, S. М.

    2007-01-01

    An analysis of high-tech methods of committing crimes in the sphere of computer information has been performed. The crimes considered were committed from remote computers. The virtual traces left when such methods are used are identified, and specific proposals for the investigation and prevention of this type of computer intrusion are developed.

  11. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read...)

  12. MMA, A Computer Code for Multi-Model Analysis

    Science.gov (United States)

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will

  13. Computer assisted sound analysis of arteriovenous fistula in hemodialysis patients.

    Science.gov (United States)

    Malindretos, Pavlos; Liaskos, Christos; Bamidis, Panagiotis; Chryssogonidis, Ioannis; Lasaridis, Anastasios; Nikolaidis, Pavlos

    2014-02-01

    The purpose of this study was to reveal the unique sound characteristics of the bruit produced by arteriovenous fistulae (AVF), using a computerized method. An electronic stethoscope (20 Hz to 20 000 Hz sensitivity) was used, connected to a portable laptop computer. Forty prevalent hemodialysis patients participated in the study. All measurements were made with patients resting in supine position, prior to the initiation of the mid-week dialysis session. The standard color Doppler technique was used to estimate blood flow. Clinical examination revealed the surface where the perceived bruit was most intense, and the recording took place at a sample rate of 22 000 Hz in WAV lossless format. The Fast Fourier Transform (FFT) mathematical algorithm was used for the sound analysis. This algorithm is particularly useful in revealing the periodicity of sound data as well as in mapping its frequency behavior and its strength. Produced frequencies were divided into 40 frequency intervals, 250 Hz apart, so that the results would be easier to plot and comprehend. The mean age of the patients was 63.5 ± 14 years; the median time on dialysis was 39.6 months (min 1 month, max. 200 months). The mean blood flow was 857.7 ± 448.3 ml/min. The mean sound frequency was approximately 5 500 Hz ± 4 000 Hz and the median, which also expresses the major peak of the sound data, was 750 Hz, varying from 250 Hz to 10 000 Hz. A possible limitation of the study is the relatively small number of participants.
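
    The frequency analysis described above can be sketched as follows (a synthetic signal with a dominant component near 750 Hz, not patient data): take the FFT of a bruit sampled at 22 000 Hz, group the spectrum into 250 Hz wide intervals, and report the strongest band.

        import numpy as np

        fs = 22000                                    # sampling rate (Hz)
        t = np.arange(0, 2.0, 1.0 / fs)               # two seconds of signal
        rng = np.random.default_rng(0)
        signal = (np.sin(2 * np.pi * 750 * t)         # dominant component near 750 Hz
                  + 0.3 * np.sin(2 * np.pi * 2500 * t)
                  + 0.2 * rng.normal(size=t.size))

        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)

        # Group spectral amplitude into 250 Hz wide intervals and report the strongest one.
        edges = np.arange(0.0, freqs.max() + 250.0, 250.0)
        band = np.digitize(freqs, edges)
        power = np.array([spectrum[band == i].sum() for i in range(1, len(edges))])
        strongest = int(np.argmax(power))
        print(f"strongest band: {edges[strongest]:.0f}-{edges[strongest + 1]:.0f} Hz")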

  14. Computational identification and analysis of novel sugarcane microRNAs

    Directory of Open Access Journals (Sweden)

    Thiebaut Flávia

    2012-07-01

    Full Text Available Abstract Background MicroRNA regulation of gene expression plays a key role in development and in the response to biotic and abiotic stresses. Deep sequencing analyses accelerate the process of small RNA discovery in many plants and expand our understanding of miRNA-regulated processes. We therefore undertook small RNA sequencing of sugarcane miRNAs in order to understand their complexity and to explore their role in sugarcane biology. Results A bioinformatics search was carried out to discover novel miRNAs that can be regulated in sugarcane plants submitted to drought and salt stresses, and under pathogen infection. By means of the presence of miRNA precursors in the related sorghum genome, we identified 623 candidate new mature miRNAs in sugarcane. Of these, 44 were classified as high confidence miRNAs. The biological function of the new miRNA candidates was assessed by analyzing their putative targets. The set of bona fide sugarcane miRNAs includes those likely targeting serine/threonine kinases, Myb and zinc finger proteins. Additionally, a MADS-box transcription factor and an RPP2B protein, which act in development and disease resistance processes, could be regulated by cleavage (21-nt species and DNA methylation (24-nt species, respectively. Conclusions A large-scale investigation of sRNA in sugarcane using a computational approach has identified a substantial number of new miRNAs and provides detailed genotype-tissue-culture miRNA expression profiles. Comparative analysis between monocots was valuable to clarify aspects of the conservation of miRNAs and their targets in a plant whose genome has not yet been sequenced. Our findings contribute to knowledge of miRNA roles in regulatory pathways in the complex, polyploid sugarcane genome.

  15. Effect of Computer-Based Video Games on Children: An Experimental Study

    Science.gov (United States)

    Chuang, Tsung-Yen; Chen, Wei-Fan

    2009-01-01

    This experimental study investigated whether computer-based video games facilitate children's cognitive learning. In comparison to traditional computer-assisted instruction (CAI), this study explored the impact of the varied types of instructional delivery strategies on children's learning achievement. One major research null hypothesis was…

  16. Impact of Computer-Based Instruction on Attitudes of Students and Instructors: A Review. Final Report.

    Science.gov (United States)

    King, Anne Truscott

    To determine whether contact with computer-assisted instruction (CAI) leads to feelings of "depersonalization" and "dehumanization" a review was conducted of investigations to explore attitudes toward various modes of computer-based instruction before, during, or after exposure. Evaluation of pertinent factors which influenced attitudes was made…

  17. Multimedia Image Technology and Computer Aided Manufacturing Engineering Analysis

    Science.gov (United States)

    Nan, Song

    2018-03-01

    Since the reform and opening up, with the continuous development of science and technology in China, more and more advanced technologies have emerged under the trend of diversification. Multimedia image technology, for example, has had a significant and positive impact on computer-aided manufacturing engineering in China, both in the functions it provides and in how those functions are applied. This paper therefore starts from the concept of multimedia image technology and analyzes its application in computer-aided manufacturing engineering.

  18. Automatic behaviour analysis system for honeybees using computer vision

    DEFF Research Database (Denmark)

    Tu, Gang Jun; Hansen, Mikkel Kragh; Kryger, Per

    2016-01-01

    ...low-cost embedded computer with very limited computational resources compared to an ordinary PC. The system succeeds in counting honeybees, identifying their position and measuring their in-and-out activity. Our algorithm uses a background subtraction method to segment the images. After the segmentation stage ... demonstrate that this system can be used as a tool to detect the behaviour of honeybees and assess their state at the beehive entrance. In addition, the computation times show that the Raspberry Pi is a viable solution for such a real-time video processing system....
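
    A minimal background-subtraction sketch in the spirit of the system described above (synthetic frames rather than beehive video, and a plain NumPy/SciPy pipeline rather than the authors' implementation) segments moving objects by thresholding the difference to a background image and counts the connected components.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)
        background = rng.normal(120.0, 2.0, size=(120, 160))   # static entrance scene
        frame = background.copy()
        frame[40:48, 30:38] -= 60.0                             # two dark "bees"
        frame[70:78, 100:108] -= 60.0

        # Background subtraction: threshold the absolute difference to the background model.
        foreground = np.abs(frame - background) > 25.0

        # Label connected components to count objects and locate their centroids.
        labels, n_objects = ndimage.label(foreground)
        centroids = ndimage.center_of_mass(foreground, labels, range(1, n_objects + 1))
        print(f"detected {n_objects} objects at {np.round(centroids, 1)}")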

  19. A computer program for planimetric analysis of digitized images

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O; Homøe, P

    1992-01-01

    Planimetrical measurements are made to calculate the area of an entity. By digitizing the entity, the planimetrical measurements may be done by computer. This computer program was developed in conjunction with a research project involving measurement of the pneumatized cell system of the temporal bones as seen on X-rays. By placing the X-rays on a digitizer tablet and tracing the outline of the cell system, the area was calculated by the program. The calculated data and traced images could be stored and printed. The program is written in BASIC; the necessary hardware is an IBM-compatible personal computer, a digitizer tablet and a printer.

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  1. CAI in a School for the Deaf: Expected Results and a Serendipity or Two.

    Science.gov (United States)

    Fricke, James E.

    In September 1975, the Computer Curriculum Corporation's computer assisted instruction program was instituted at the Scranton State School for the Deaf in Scranton, Pennsylvania. A minicomputer and 20 teletype terminals were installed. Drill and practice programs in elementary level math, reading and language arts were initiated. Teachers'…

  2. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. The new cloud technologies also come with new challenges, and one such is the contextualization of computing resources with regard to requirements of the user and his experiment. In particular on Google's new cloud platform Google Compute Engine (GCE) upload of user's virtual machine images is not possible. This precludes application of ready to use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  3. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative terms, is gradually increasing because of the effects of human errors on system safety. HRA requires a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analysts. This problem makes the results of the task analysis inconsistent and unreliable. To remedy this problem, KAERI developed the structural information analysis (SIA), which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed for the purpose of supporting the performance of HRA using the SIA method. Additionally, through applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analysts perform the analysis more easily and consistently. If more analyses are performed and more data are accumulated in the CASIA's database, HRA analysts can freely share and smoothly spread their analysis experience, and thereby the quality of the HRA analysis will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  4. Discrete calculus applied analysis on graphs for computational science

    CERN Document Server

    Grady, Leo J

    2010-01-01

    This unique text brings together into a single framework current research in the three areas of discrete calculus, complex networks, and algorithmic content extraction. Many example applications from several fields of computational science are provided.

  5. Cost/Benefit Analysis of Leasing Versus Purchasing Computers

    National Research Council Canada - National Science Library

    Arceneaux, Alan

    1997-01-01

    .... In constructing this model, several factors were considered, including: The purchase cost of computer equipment, annual lease payments, depreciation costs, the opportunity cost of purchasing, tax revenue implications and various leasing terms...

  6. Sensitivity Analysis and Error Control for Computational Aeroelasticity, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposal is the development of a next-generation computational aeroelasticity code, suitable for real-world complex geometries, and...

  7. Export Control of High Performance Computing: Analysis and Alternative Strategies

    National Research Council Canada - National Science Library

    Holland, Charles

    2001-01-01

    High performance computing has historically played an important role in the ability of the United States to develop and deploy a wide range of national security capabilities, such as stealth aircraft...

  8. Cloud computing for genomic data analysis and collaboration.

    Science.gov (United States)

    Langmead, Ben; Nellore, Abhinav

    2018-04-01

    Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.

  9. Automated computation of autonomous spectral submanifolds for nonlinear modal analysis

    Science.gov (United States)

    Ponsioen, Sten; Pedergnana, Tiemo; Haller, George

    2018-04-01

    We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.

  10. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS TM , to demonstrate compliance with Section III, ``Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  11. Computational analysis of difenoconazole interaction with soil chitinases

    International Nuclear Information System (INIS)

    Vlǎdoiu, D L; Filimon, M N; Ostafe, V; Isvoran, A

    2015-01-01

    This study focusses on the investigation of the potential binding of the fungicide difenoconazole to soil chitinases using a computational approach. Computational characterization of the substrate binding sites of Serratia marcescens and Bacillus cereus chitinases using Fpocket tool reflects the role of hydrophobic residues for the substrate binding and the high local hydrophobic density of both sites. Molecular docking study reveals that difenoconazole is able to bind to Serratia marcescens and Bacillus cereus chitinases active sites, the binding energies being comparable

  12. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science considers conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or a particular domain based on ISI published JCR (Journal Citation Report). Although this data covers most of important journals, it lacks computer science conference and ...

  13. Analysis of Sci-Hub downloads of computer science papers

    Directory of Open Access Journals (Sweden)

    Andročec Darko

    2017-07-01

    Full Text Available The scientific knowledge is disseminated by research papers. Most of the research literature is copyrighted by publishers and available only through paywalls. Recently, some websites offer most of the recent content for free. One of them is the controversial website Sci-Hub that enables access to more than 47 million pirated research papers. In April 2016, Science Magazine published an article on Sci-Hub activity over the period of six months and publicly released the Sci-Hub’s server log data. The mentioned paper aggregates the view that relies on all downloads and for all fields of study, but these findings might be hiding interesting patterns within computer science. The mentioned Sci-Hub log data was used in this paper to analyse downloads of computer science papers based on DBLP’s list of computer science publications. The top downloads of computer science papers were analysed, together with the geographical location of Sci-Hub users, the most downloaded publishers, types of papers downloaded, and downloads of computer science papers per publication year. The results of this research can be used to improve legal access to the most relevant scientific repositories or journals for the computer science field.

  14. Computer Analysis of Air Pollution from Highways, Streets, and Complex Interchanges

    Science.gov (United States)

    1974-03-01

    A detailed computer analysis of air quality for a complex highway interchange was prepared, using an in-house version of the Environmental Protection Agency's Gaussian Highway Line Source Model. This analysis showed that the levels of air pollution n...

  15. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

    I employed bibliometric and historical methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  16. Analysis of cerebral infarction pattern in computed tomography images of patients with internal carotid artery stenosis

    NARCIS (Netherlands)

    Jongen, Cynthia; Nederkoorn, Paul J.; Niessen, Wiro J.; Pluim, Josien P. W.

    2004-01-01

    An unbiased and quantitative analysis of lesion patterns in patient groups is described and applied to the analysis of infarction patterns. One hundred forty-two computed tomographic images of patients with ischemic stroke were registered to an average computed tomographic brain image, which was

  17. Computer-Aided Interval Change Analysis of Microcalcifications on Mammograms for Breast Cancer Detection

    Science.gov (United States)

    2006-07-01

    Computer-Aided Interval Change Analysis of Microcalcifications on Mammograms for Breast Cancer Detection. Principal Investigator: Lubomir Hadjiiski, Ph.D. Grant number: DAMD17-... ...CAD (p=0.04). Subject terms: breast cancer, computer-aided diagnosis, screening, classification, image analysis.

  18. Nondestructive analysis of urinary calculi using micro computed tomography

    Directory of Open Access Journals (Sweden)

    Lingeman James E

    2004-12-01

    Full Text Available Abstract Background Micro computed tomography (micro CT) has been shown to provide exceptionally high quality imaging of the fine structural detail within urinary calculi. We tested the idea that micro CT might also be used to identify the mineral composition of urinary stones non-destructively. Methods Micro CT x-ray attenuation values were measured for mineral that was positively identified by infrared microspectroscopy (FT-IR). To do this, human urinary stones were sectioned with a diamond wire saw. The cut surface was explored by FT-IR and regions of pure mineral were evaluated by micro CT to correlate x-ray attenuation values with mineral content. Additionally, intact stones were imaged with micro CT to visualize internal morphology and map the distribution of specific mineral components in 3-D. Results Micro CT images taken just beneath the cut surface of urinary stones showed excellent resolution of structural detail that could be correlated with structure visible in the optical image mode of FT-IR. Regions of pure mineral were not difficult to find by FT-IR for most stones and such regions could be localized on micro CT images of the cut surface. This was not true, however, for two brushite stones tested; in these, brushite was closely intermixed with calcium oxalate. Micro CT x-ray attenuation values were collected for six minerals that could be found in regions that appeared to be pure, including uric acid (3515 – 4995 micro CT attenuation units, AU), struvite (7242 – 7969 AU), cystine (8619 – 9921 AU), calcium oxalate dihydrate (13815 – 15797 AU), calcium oxalate monohydrate (16297 – 18449 AU), and hydroxyapatite (21144 – 23121 AU). These AU values did not overlap. Analysis of intact stones showed excellent resolution of structural detail and could discriminate multiple mineral types within heterogeneous stones. Conclusions Micro CT gives excellent structural detail of urinary stones, and these results demonstrate the feasibility
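
    Because the attenuation ranges reported above do not overlap, a region of pure mineral can be assigned to a composition by a simple table lookup. The sketch below encodes exactly those published ranges; treating values that fall between ranges as unclassified is an assumption of this illustration, not a claim from the paper.

        # Attenuation ranges (micro CT attenuation units, AU) quoted in the abstract.
        MINERAL_RANGES = [
            ("uric acid",                    3515,  4995),
            ("struvite",                     7242,  7969),
            ("cystine",                      8619,  9921),
            ("calcium oxalate dihydrate",   13815, 15797),
            ("calcium oxalate monohydrate", 16297, 18449),
            ("hydroxyapatite",              21144, 23121),
        ]

        def classify_mineral(attenuation):
            """Return the mineral whose published AU range contains the value."""
            for name, low, high in MINERAL_RANGES:
                if low <= attenuation <= high:
                    return name
            return "unclassified"  # between ranges or outside the calibrated span

        print(classify_mineral(8700))    # -> cystine
        print(classify_mineral(12000))   # -> unclassified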

  19. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
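
    Step (4) above amounts to regressing the NDA measurements against the gold-standard (or simulated true) values and reading bias and precision off the fitted model. The following sketch illustrates only that step, with made-up paired data standing in for the simulated waste-container population.

        import numpy as np

        # Hypothetical paired data: true (gold standard) vs. NDA-measured mass in grams.
        truth    = np.array([1.0, 2.5, 4.0, 5.5, 7.0, 8.5, 10.0])
        measured = np.array([1.2, 2.9, 4.3, 6.1, 7.6, 9.4, 10.9])

        # Linear regression of measured on true values gives the bias model...
        slope, intercept = np.polyfit(truth, measured, 1)
        residuals = measured - (slope * truth + intercept)

        # ...and the scatter of the residuals estimates the precision.
        precision = residuals.std(ddof=2)   # ddof=2 accounts for the two fitted parameters
        print(f"bias model: measured = {slope:.3f} * true + {intercept:.3f}")
        print(f"precision (1-sigma about the fit): {precision:.3f} g")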

  20. Cluster Computing For Real Time Seismic Array Analysis.

    Science.gov (United States)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years arrays have been widely used in different fields of seismological research. In particular they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying volcanic microtremor and long-period events, which are critical for getting information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the processing techniques, which are quite time consuming, have limited their potential for this application. In order to favour a direct application of array techniques to continuous volcano monitoring we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by a local seismic source. The cluster is composed of 8 Intel Pentium-III bi-processor PCs working at 550 MHz, and has 4 Gigabytes of RAM memory. It runs under the Linux operating system. The developed analysis software package is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data via the Internet and graphical applications for the continuous display of the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and southeast flanks of this volcano. A real time continuous acquisition system has been simulated by
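
    For readers unfamiliar with the MUSIC step, the toy sketch below (Python/NumPy rather than the authors' Fortran/MPI code, with an assumed array geometry, frequency and noise level) shows the core computation: form the sample covariance of narrowband snapshots, split off the noise subspace, and scan a slowness grid for the peak of the MUSIC spectrum.

        import numpy as np

        # Toy narrowband MUSIC estimate of a plane wave's slowness vector.
        # Array geometry, frequency, noise level and snapshot count are assumptions.
        rng = np.random.default_rng(0)
        coords = np.array([[0.0, 0.0], [60.0, 0.0], [0.0, 60.0], [60.0, 60.0], [30.0, 30.0]])  # m
        freq = 5.0                                 # Hz, analysed narrowband frequency
        s_true = np.array([0.4e-3, 0.2e-3])        # s/m, true slowness vector

        def steering(s):
            # Plane-wave phase delays across the array for slowness vector s.
            return np.exp(-2j * np.pi * freq * coords @ s)

        # Simulate snapshots: one plane wave plus sensor noise.
        n_snap = 200
        wave = rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)
        X = np.outer(steering(s_true), wave)
        X += 0.1 * (rng.standard_normal(X.shape) + 1j * rng.standard_normal(X.shape))

        R = X @ X.conj().T / n_snap                # sample covariance matrix
        _, V = np.linalg.eigh(R)                   # eigenvectors, ascending eigenvalues
        En = V[:, :-1]                             # noise subspace (one source assumed)

        # Scan a slowness grid and take the maximum of the MUSIC spectrum.
        grid = np.linspace(-1e-3, 1e-3, 101)
        best, best_p = None, -1.0
        for sx in grid:
            for sy in grid:
                a = steering(np.array([sx, sy]))
                p = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
                if p > best_p:
                    best, best_p = (sx, sy), p
        print("estimated slowness (s/m):", best)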

  1. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  4. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  5. The Plato IV CAI System: Where Is It Now? Where Can It Go?

    Science.gov (United States)

    Eastwood, Lester F.; Ballard, Richard J.

    1975-01-01

    This article studies each component of the PLATO system--the computer hardware, the courseware, the student terminal, and the distribution network. The discussions of each component point out cost-determining factors. (Author)

  6. Secure distributed genome analysis for GWAS and sequence comparison computation

    Science.gov (United States)

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307
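
    As a toy illustration of the secret-sharing idea mentioned above (not the competition protocol itself), the sketch below additively shares each data owner's allele counts among three computing parties, so that only the pooled minor allele frequency is ever reconstructed; the counts used are invented for the example.

        import random

        PRIME = 2**61 - 1   # all arithmetic is done modulo a large prime

        def share(value, n_parties=3):
            """Split an integer into additive shares that sum to it mod PRIME."""
            parts = [random.randrange(PRIME) for _ in range(n_parties - 1)]
            parts.append((value - sum(parts)) % PRIME)
            return parts

        def reconstruct(parts):
            return sum(parts) % PRIME

        # Two hypothetical data owners hold (minor allele count, total allele count).
        owners = [(37, 200), (55, 300)]
        count_shares, total_shares = [0, 0, 0], [0, 0, 0]
        for minor, total in owners:
            for i, s in enumerate(share(minor)):
                count_shares[i] = (count_shares[i] + s) % PRIME   # party i adds locally
            for i, s in enumerate(share(total)):
                total_shares[i] = (total_shares[i] + s) % PRIME

        # Only the pooled sums are revealed, never an individual owner's counts.
        maf = reconstruct(count_shares) / reconstruct(total_shares)
        print("pooled minor allele frequency:", maf)   # (37 + 55) / (200 + 300) = 0.184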

  7. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
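
    The cost/benefit trade-off described above can be sketched numerically: train a simple probabilistic neural network on increasingly large calibration samples and record both the calibration time and the resulting goodness-of-fit. The example below uses synthetic two-class data and a crude grid search for the kernel width; all data and parameters are invented for illustration and do not reproduce the report's experiments.

        import time
        import numpy as np

        rng = np.random.default_rng(1)

        def pnn_predict(x_train, y_train, x_test, sigma):
            """Probabilistic neural network: Gaussian kernel density per class."""
            classes = np.unique(y_train)
            scores = []
            for c in classes:
                pts = x_train[y_train == c]
                d2 = ((x_test[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
                scores.append(np.exp(-d2 / (2.0 * sigma**2)).mean(axis=1))
            return classes[np.argmax(scores, axis=0)]

        # Synthetic two-class problem standing in for the geographic data.
        X = rng.normal(size=(2000, 2))
        y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=2000) > 0).astype(int)
        X_test, y_test = X[1500:], y[1500:]

        # Cost (calibration time) versus benefit (accuracy) for growing sample sizes.
        for n in (50, 200, 1000):
            x_tr, y_tr, half = X[:n], y[:n], n // 2
            t0 = time.perf_counter()
            best_sigma = max(                      # crude calibration of the kernel width
                np.linspace(0.05, 1.0, 20),
                key=lambda s: (pnn_predict(x_tr[:half], y_tr[:half], x_tr[half:], s)
                               == y_tr[half:]).mean())
            cost = time.perf_counter() - t0
            accuracy = (pnn_predict(x_tr, y_tr, X_test, best_sigma) == y_test).mean()
            print(f"n={n:5d}  calibration time={cost:.3f} s  test accuracy={accuracy:.3f}")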

  8. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well-established classical linear methods. Different strategies for selecting projections (linear combinations) of multivariate images are presented. An exploratory, iterative method for finding interesting projections originating in data analysis is compared to principal components. A method for introducing spatial context...... structural images for heavy minerals based on irregularly sampled geochemical data. This methodology has proven useful in producing images that reflect real geological structures with potential application in mineral exploration. A method for removing laboratory-produced map-sheet patterns in spatial data......

  9. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    Science.gov (United States)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  10. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  11. Heat exchanger performance analysis programs for the personal computer

    International Nuclear Information System (INIS)

    Putman, R.E.

    1992-01-01

    Numerous utility industry heat exchange calculations are repetitive and thus lend themselves to being performed on a Personal Computer. These programs may be regarded as engineering tools which, when put together, can form a Toolbox. However, the practicing Results Engineer in the utility industry desires programs that are not only robust and easy to use but can also be run on both desktop and laptop PCs. The latter also offer the opportunity to take the computer into the plant or control room, and use it there to process test or operating data right on the spot. Most programs evolve through the needs which arise in the course of day-to-day work. This paper describes several of the more useful programs of this type and outlines some of the guidelines to be followed when designing personal computer programs for use by the practicing Results Engineer.

  12. Radiographic test phantom for computed tomographic lung nodule analysis

    International Nuclear Information System (INIS)

    Zerhouni, E.A.

    1987-01-01

    This patent describes a method for evaluating a computed tomograph scan of a nodule in a lung of a human or non-human animal. The method comprises generating a computer tomograph of a transverse section of the animal containing lung and nodule tissue, and generating a second computer tomograph of a test phantom comprising a device which simulates the transverse section of the animal. The tissue simulating portions of the device are constructed of materials having radiographic densities substantially identical to those of the corresponding tissue in the simulated transverse section of the animal and have voids therein which simulate, in size and shape, the lung cavities in the transverse section and which contain a test reference nodule constructed of a material of predetermined radiographic density which simulates in size, shape and position within a lung cavity void of the test phantom the nodule in the transverse section of the animal and comparing the respective tomographs

  13. The analysis of gastric function using computational techniques

    International Nuclear Information System (INIS)

    Young, Paul

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. The results showed that

  14. Identidades de pescadores caiçaras: heroísmo e precariedade em populações tradicionais?

    Directory of Open Access Journals (Sweden)

    Nancy Ramacciotti de Oliveira-Monteiro

    2017-08-01

    Full Text Available From an Eriksonian perspective, psychosocial identity manifests itself in everyday occupational, family and intersocial relations. Changes resulting from ecosystemical interactions involving nature, society and technology take place in peculiar rhythms and ranges, in different areas of contemporary human work. Research shows artisanal fishing is impacted by these changes. In order to study aspects of identity in the psychosocial profiles of artisanal fishermen, three caiçara fishermen from Baixada Santista (SP) were investigated through open interviews, questionnaires and participant observation. Using a qualitative methodology, the results were systematized, indicating both an enhancement of the profession/craft (the hero fisherman who faces major challenges from nature) and attributes of a worker subject to precariousness (harm to health and livelihood, with few resources from formal education and little power to face the emerging challenges of contemporary political interactions). The data were discussed in light of the controversial perspectives that understand artisanal fishing communities as traditional populations, in their coexistence with the modernization processes of urban culture.

  15. Perceptions on hospitality when visiting secluded communities of guaranis, caiçaras e quilombolas in Paraty region

    Directory of Open Access Journals (Sweden)

    Luis Alberto Beares

    2008-10-01

    Full Text Available Tourism in secluded communities puts different cultures in contact with each other and must be handled carefully so as not to cause environmental damage or cultural loss, which might jeopardize local development and create hostile relationships. The proposal of in situ tourism, considering local memory and patrimony as a hospitality potential, was observed during technical visits to three communities located in the Paraty region and surroundings: Guarani, Caiçara (fishermen) and Quilombola (descendants of African slaves). Through field work involving visits to the communities and interviews with locals, information regarding cultural differences and the importance of land occupation in the history of each of the communities was assessed. The common link in the history of these peoples is the struggle for the right of land possession. During visits in which people shared their territory, various forms of hospitality were observed in each community, arising from different cultures and cultural values.

  16. Cais da memória: espaço para vivências afetivas com a cidade de Natal

    OpenAIRE

    Vasconcelos Neto, Francisco Rocha

    2016-01-01

    This master's research seeks to highlight and contribute to the importance of building spaces, with an emphasis on urbanity, through the creation of an architectural project that integrates and redesigns the urban fabric by means of the implantation of a building. The work consists of developing the preliminary architectural design of the Cais da Memória, an architectural complex for affective experiences with the environment of Natal, which would take place through the occupation of, and movement through, the territory of the city by...

  17. ANÁLISE PROBABILÍSTICA DAS REAÇÕES NAS ESTACAS DE UM CAIS DE CONTÊINERES

    OpenAIRE

    Ramos, André Pereira; Lima, João Paulo Silva; Real, Mauro Vasconcellos

    2017-01-01

    Abstract. In this work the Monte Carlo simulation method was applied to the structural model of a container quay with the objective of verifying the probability distributions of the maximum reactions of the structure. Two combinations of acting external loads were presented, both simulating real operating conditions of the port terminal. To the input parameters of the simulations – external actions due to self-weight, live load, mooring forces and equipment – were a...

  18. Verification of structural analysis computer codes in nuclear engineering

    International Nuclear Information System (INIS)

    Zebeljan, Dj.; Cizelj, L.

    1990-01-01

    Sources of potential errors that can arise during the use of finite-element-based computer programs are described in the paper. The magnitude of errors was defined as the acceptance criterion for those programs. Error sources are described as they are treated by the National Agency for Finite Element Methods and Standards (NAFEMS). Specific verification examples are taken from the literature of the Nuclear Regulatory Commission (NRC). An example of verification is given for the PAFEC-FE computer code for seismic response analyses of piping systems by the response spectrum method. (author)

  19. Computer codes for beam dynamics analysis of cyclotronlike accelerators

    Science.gov (United States)

    Smirnov, V.

    2017-12-01

    Computer codes suitable for the study of beam dynamics in cyclotronlike (classical and isochronous cyclotrons, synchrocyclotrons, and fixed field alternating gradient) accelerators are reviewed. Computer modeling of cyclotron segments, such as the central zone, acceleration region, and extraction system is considered. The author does not claim to give a full and detailed description of the methods and algorithms used in the codes. Special attention is paid to the codes already proven and confirmed at the existing accelerating facilities. The description of the programs prepared in the worldwide known accelerator centers is provided. The basic features of the programs available to users and limitations of their applicability are described.

  20. Dietary exposure to aflatoxin B-1, ochratoxin A and fuminisins of adults in Lao Cai province, Viet Nam: A total dietary study approach

    DEFF Research Database (Denmark)

    Bui, Huong Mai; Le Danh Tuyen; Do Huu Tuan

    2016-01-01

    Aflatoxins, fumonisins and ochratoxin A that contaminate various agricultural commodities are considered of significant toxicity and potent human carcinogens. This study took a total dietary study approach and estimated the dietary exposure of these mycotoxins for adults living in Lao Cai province...

  1. Ensiled and dry cassava leaves, and sweet potato vines as a protein source in diets for growing Vietnamese large white Mong Cai pigs

    NARCIS (Netherlands)

    Nguyen, T.H.L.; Le, N.G.; Verstegen, M.W.A.; Hendriks, W.H.

    2010-01-01

    The aim of the present study was to evaluate the effects of replacing 70% of the protein from fish meal by protein from ensiled or dry cassava leaves and sweet potato vines on the performance and carcass characters of growing F1 (Large White × Mong Cai) pigs in Central Vietnam. Twenty-five crossbred

  2. Introduction to Numerical Computation - analysis and Matlab illustrations

    DEFF Research Database (Denmark)

    Elden, Lars; Wittmeyer-Koch, Linde; Nielsen, Hans Bruun

    In a modern programming environment such as MATLAB it is possible, by simple commands, to perform advanced calculations on a personal computer. In order to use such a powerful tool efficiently it is necessary to have an overview of available numerical methods and algorithms and to know about...... are illustrated by examples in MATLAB....

  3. Toward a computer-aided methodology for discourse analysis ...

    African Journals Online (AJOL)


  4. Microscopes and computers combined for analysis of chromosomes

    Science.gov (United States)

    Butler, J. W.; Butler, M. K.; Stroud, A. N.

    1969-01-01

    Scanning machine CHLOE, developed for photographic use, is combined with a digital computer to obtain quantitative and statistically significant data on chromosome shapes, distribution, density, and pairing. CHLOE permits data about a chromosome complement to be acquired two times faster than by manual pairing.

  5. Interval change analysis to improve computer aided detection in mammography.

    NARCIS (Netherlands)

    Timp, S.; Karssemeijer, N.

    2006-01-01

    We are developing computer aided diagnosis (CAD) techniques to study interval changes between two consecutive mammographic screening rounds. We have previously developed methods for the detection of malignant masses based on features extracted from single mammographic views. The goal of the present

  6. Hierarchical nanoreinforced composites: Computational analysis of damage mechanisms

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Pontefisso, Alessandro; Dai, Gaoming

    2016-01-01

    The potential of hierarchical composites with secondary nanoreinforcement is discussed and analysed on the basis of the computational modelling. The concept of nanostructuring of interfaces as an important reserve of the improvement of the composite properties is discussed. The influence of distr...

  7. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets...

  8. An Analysis of Attitudes toward Computer Networks and Internet Addiction.

    Science.gov (United States)

    Tsai, Chin-Chung; Lin, Sunny S. J.

    The purpose of this study was to explore the interplay between young people's attitudes toward computer networks and Internet addiction. After analyzing questionnaire responses of an initial sample of 615 Taiwanese high school students, 78 subjects, viewed as possible Internet addicts, were selected for further explorations. It was found that…

  9. Computer-aided analysis of grain growth in metals

    DEFF Research Database (Denmark)

    Klimanek, P.; May, C.; Richter, H.

    1993-01-01

    Isothermal grain growth in aluminium, copper and alpha-iron was investigated experimentally at elevated temperatures and quantitatively interpreted by computer simulation on the basis of a statistical model described in [4,5,6]. As is demonstrated for the grain growth kinetics, the experimental...

  10. Computational analysis of frictional drag over transverse grooved ...

    African Journals Online (AJOL)


    International Journal of Engineering, Science and Technology, Vol. 3, No. 2, 2011, pp. 110-116. ... dissipation) that is adopted by the standard wall function in computing the budget of the turbulence kinetic energy at wall-neighboring cells [3]. [Figure 1: Schematic diagram of the grooved ...]

  11. Computational Auditory Scene Analysis Based Perceptual and Neural Principles

    National Research Council Canada - National Science Library

    Wang, DeLiang

    2004-01-01

    .... This fundamental process of auditory perception is called auditory scene analysis. Of particular importance in auditory scene analysis is the separation of speech from interfering sounds, or speech segregation...

  12. Development of computer software for pavement life cycle cost analysis.

    Science.gov (United States)

    1988-01-01

    The life cycle cost analysis program (LCCA) is designed to automate and standardize life cycle costing in Virginia. It allows the user to input information necessary for the analysis, and it then completes the calculations and produces a printed copy...

  13. Optical computing for image bandwidth compression: analysis and simulation.

    Science.gov (United States)

    Hunt, B R

    1978-09-15

    Image bandwidth compression is dominated by digital methods for carrying out the required computations. This paper discusses the general problem of using optics to realize the computations in bandwidth compression. A common method of digital bandwidth compression, feedback differential pulse code modulation (DPCM), is reviewed, and the obstacles to making a direct optical analogy to feedback DPCM are discussed. Instead of a direct optical analogy to DPCM, an optical system which captures the essential features of DPCM without optical feedback is introduced. The essential features of this incoherent optical system are encoding of low-frequency information and generation of difference samples which can be coded with a small number of bits. A simulation of this optical system by means of digital image processing is presented, and performance data are also included.
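
    Feedback DPCM, the digital baseline reviewed in the abstract, predicts each sample from previously reconstructed samples and transmits only the coarsely quantized prediction error. A compact one-dimensional sketch of that loop is given below; the previous-sample predictor and the quantizer step size are illustrative choices, not the paper's parameters.

        import numpy as np

        def dpcm_encode_decode(signal, step=4.0):
            """Feedback DPCM along one scan line: quantize the prediction error
            and keep encoder and decoder reconstructions in lockstep."""
            codes = np.zeros(len(signal), dtype=int)
            recon = np.zeros(len(signal))
            prediction = 0.0
            for i, sample in enumerate(signal):
                error = sample - prediction              # prediction error
                codes[i] = int(round(error / step))      # coarse quantization -> few bits
                recon[i] = prediction + codes[i] * step  # decoder's reconstruction
                prediction = recon[i]                    # predict next sample from it
            return codes, recon

        line = np.array([100, 102, 105, 110, 118, 119, 119, 90, 60, 58], dtype=float)
        codes, recon = dpcm_encode_decode(line)
        print("transmitted codes:", codes.tolist())
        print("max reconstruction error:", np.abs(line - recon).max())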

  14. Computational design and analysis of flatback airfoil wind tunnel experiment.

    Energy Technology Data Exchange (ETDEWEB)

    Mayda, Edward A. (University of California, Davis, CA); van Dam, C.P. (University of California, Davis, CA); Chao, David D. (University of California, Davis, CA); Berg, Dale E.

    2008-03-01

    A computational fluid dynamics study of thick wind turbine section shapes in the test section of the UC Davis wind tunnel at a chord Reynolds number of one million is presented. The goals of this study are to validate standard wind tunnel wall corrections for high solid blockage conditions and to reaffirm the favorable effect of a blunt trailing edge or flatback on the performance characteristics of a representative thick airfoil shape prior to building the wind tunnel models and conducting the experiment. The numerical simulations prove the standard wind tunnel corrections to be largely valid for the proposed test of 40% maximum thickness to chord ratio airfoils at a solid blockage ratio of 10%. Comparison of the computed lift characteristics of a sharp trailing edge baseline airfoil and derived flatback airfoils reaffirms the earlier observed trend of reduced sensitivity to surface contamination with increasing trailing edge thickness.

  15. Computational solutions to large-scale data management and analysis.

    Science.gov (United States)

    Schadt, Eric E; Linderman, Michael D; Sorenson, Jon; Lee, Lawrence; Nolan, Garry P

    2010-09-01

    Today we can generate hundreds of gigabases of DNA and RNA sequencing data in a week for less than US$5,000. The astonishing rate of data generation by these low-cost, high-throughput technologies in genomics is being matched by that of other technologies, such as real-time imaging and mass spectrometry-based flow cytometry. Success in the life sciences will depend on our ability to properly interpret the large-scale, high-dimensional data sets that are generated by these technologies, which in turn requires us to adopt advances in informatics. Here we discuss how we can master the different types of computational environments that exist - such as cloud and heterogeneous computing - to successfully tackle our big data problems.

  16. Computer assisted analysis of hand radiographs in infantile hypophosphatasia carriers

    International Nuclear Information System (INIS)

    Chodirker, B.N.; Greenberg, C.R.; Manitoba Univ., Winnipeg, MB; Roy, D.; Cheang, M.; Evans, J.A.; Manitoba Univ., Winnipeg, MB; Manitoba Univ., Winnipeg, MB; Reed, M.H.; Manitoba Univ., Winnipeg, MB

    1991-01-01

    Hand radiographs of 49 carriers of infantile hypophosphatasia and 67 non-carriers were evaluated using two Apple IIe computer programs and an Apple Graphics Tablet. CAMPS was used to determine the bone lengths and calculate the metacarpophalangeal profiles. A newly developed program (ADAM) was used to determine bone density based on the percent cortical area of the second metacarpal. Carriers of infantile hypophosphatasia had significantly less dense bones. (orig.)
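
    Percent cortical area of a tubular bone is commonly computed from midshaft measurements by treating the bone as a cylinder: 100 × (W² − M²)/W², where W is the total (subperiosteal) width and M the medullary width. The snippet below shows that calculation with hypothetical widths; this is the standard tubular-bone approximation, not necessarily the exact procedure used by ADAM.

        def percent_cortical_area(total_width, medullary_width):
            """Percent cortical area of a tubular bone modelled as a cylinder:
            100 * (W^2 - M^2) / W^2, with widths measured at the midshaft."""
            return 100.0 * (total_width**2 - medullary_width**2) / total_width**2

        # Hypothetical midshaft widths (mm) of a second metacarpal.
        print(round(percent_cortical_area(8.0, 4.5), 1))   # -> 68.4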

  17. Computer Science Papers in Web of Science: A Bibliometric Analysis

    OpenAIRE

    Dalibor Fiala; Gabriel Tutoky

    2017-01-01

    In this article we present a bibliometric study of 1.9 million computer science papers published from 1945 to 2014 and indexed in Web of Science. We analyze both the quantity and the impact of these publications according to document types, languages, disciplines, countries, institutions, and publication sources. The most frequent author keywords, cited references, and cited papers as well as the distribution of the number of references and citations per paper and of the age of cited referenc...

  18. Analysis of Computer Experiments with Multiple Noise Sources

    DEFF Research Database (Denmark)

    Dehlendorff, Christian; Kulahci, Murat; Andersen, Klaus Kaae

    2010-01-01

    In this paper we present a modeling framework for analyzing computer models with two types of variations. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled effectively with linear mixed effects models and generalized additive models. Copyright (C) 2009 John Wiley & Sons, Ltd....

  19. Comparison of two three-dimensional cephalometric analysis computer software

    OpenAIRE

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-01-01

    Background: Three-dimensional cephalometric analyses are attracting more attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Materials and Methods: Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (Unive...

  20. Reliability analysis of Airbus A-330 computer flight management system

    OpenAIRE

    Fajmut, Metod

    2010-01-01

    This diploma thesis deals with the digitized, computerized »Fly-by-wire« flight control system and the safety aspects of the computer system of the Airbus A330 aircraft. As with space and military aircraft structures, in commercial airplanes much of the financial contribution is devoted to reliability. Conventional aircraft control systems have relied, and some still rely, on mechanical and hydraulic connections between the controls operated by the pilot and the control surfaces. But newer a...

  1. Computer programs for helicopter high speed flight analysis

    OpenAIRE

    Carmona, Waldo F.

    1983-01-01

    Approved for Public Release, Distribution Unlimited This report gives the user of the HP41-CV handheld programmable calculator or the IBM 3033 computer a blade element method for calculating the total power required in forward, straight and level high-speed flight for an isolated rotor. The computer programs consist of a main program, which calculates the necessary dynamic parameters of the main rotor, and several subroutines, which calculate the power required as well as maximum...

  2. The analysis of one-dimensional reactor kinetics benchmark computations

    International Nuclear Information System (INIS)

    Sidell, J.

    1975-11-01

    During March 1973 the European American Committee on Reactor Physics proposed a series of simple one-dimensional reactor kinetics problems, with the intention of comparing the relative efficiencies of the numerical methods employed in various codes, which are currently in use in many national laboratories. This report reviews the contributions submitted to this benchmark exercise and attempts to assess the relative merits and drawbacks of the various theoretical and computer methods. (author)

  3. Analysis of control room computers at nuclear power plants

    International Nuclear Information System (INIS)

    Leijonhufvud, S.; Lindholm, L.

    1984-03-01

    The following problems are analyzed: - the development of a system - hardware and software - data - the acquisition of the system - operation and service. The findings are: - most reliability problems can be solved by doubling critical units - reliability in software has a quality that can only be created through development - reliability of computer systems in extremely unusual situations cannot be quantified or verified, except possibly for very small and functionally simple systems - to attain the highest possible reliability with such simple systems, these have to: - contain one or very few functions - be functionally simple - be application-transparent, viz. the internal function of the system should be independent of the status of the process - a computer system will compete successfully with other possible systems regarding reliability for the following reasons: - if the function is simple enough for other systems, the computer system would be small - if the functions cannot be realized by other systems - the computer system would complement the human effort - and the man-machine system would be a better solution than no system, possibly better than human function only. (Aa)

  4. ELECTRONIC EVIDENCE IN THE JUDICIAL PROCEEDINGS AND COMPUTER FORENSIC ANALYSIS

    Directory of Open Access Journals (Sweden)

    Marija Boban

    2017-01-01

    Full Text Available Today’s perspective on the information society is characterized by the terminology of modern dictionaries of globalization, including terms such as convergence, digitization (of media, technology and/or telecommunications) and mobility of people or technology. Each word signals progress and development, a positive sign of the rise of the information society. On the other hand, in a virtual environment the traditional evidence used in judicial proceedings, the document on a paper substrate, is becoming electronic evidence, and its management processes and criteria for admissibility are changing relative to traditional evidence. The rapid growth of computer data has created new opportunities, the growth of new forms of computing and of cyber crime, but also new ways of proof in court cases that were unavailable just a few decades ago. The authors of this paper describe new trends in the development of the information society and the emergence of electronic evidence, with emphasis on the impact of the development of computer crime on electronic evidence; the concept, legal regulation and probative value of electronic evidence, and in particular of electronic documents; and the issue of expert analysis of electronic evidence and electronic documents in court proceedings.

  5. Mathematical modellings and computational methods for structural analysis of LMFBR's

    International Nuclear Information System (INIS)

    Liu, W.K.; Lam, D.

    1983-01-01

    In this paper, two aspects of nuclear reactor problems are discussed: modelling techniques and computational methods for large-scale linear and nonlinear analyses of LMFBRs. For nonlinear fluid-structure interaction problems with large deformation, the arbitrary Lagrangian-Eulerian description is applicable. For certain linear fluid-structure interaction problems, the structural response spectrum can be found via the 'added mass' approach. In a sense, the fluid inertia is accounted for by a mass matrix added to the structural mass. The fluid/structural modes of certain fluid-structure problems can be uncoupled to get the reduced added mass. The advantage of this approach is that it can account for the many repeated structures of a nuclear reactor. With regard to nonlinear dynamic problems, the coupled nonlinear fluid-structure equations usually have to be solved by direct time integration. The computation can be very expensive and time consuming for nonlinear problems. Thus, it is desirable to optimize the accuracy and computational effort by using a mixed implicit-explicit time integration method. (orig.)
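
    In equation form, the added-mass idea replaces the coupled fluid-structure problem by a structural equation of motion whose mass matrix is augmented by a fluid contribution. A generic statement of this (notation assumed here, not taken from the paper) is

        (\mathbf{M}_s + \mathbf{M}_a)\,\ddot{\mathbf{u}} + \mathbf{C}\,\dot{\mathbf{u}} + \mathbf{K}\,\mathbf{u} = \mathbf{F}(t),

    where M_s is the structural mass matrix, M_a the added (fluid) mass matrix, C and K the damping and stiffness matrices, u the structural displacements and F(t) the applied loads; the response-spectrum analysis then proceeds on (M_s + M_a) and K as for a dry structure.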

  6. CASKETSS: a computer code system for thermal and structural analysis of nuclear fuel shipping casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1989-02-01

    A computer program CASKETSS has been developed for the purpose of thermal and structural analysis of nuclear fuel shipping casks. CASKETSS means a modular code system for CASK Evaluation code system Thermal and Structural Safety. The main features of CASKETSS are as follows: (1) Thermal and structural analysis computer programs for one-, two- and three-dimensional geometries are contained in the code system. (2) Some of the computer programs in the code system have been programmed to provide near optimal speed on vector processing computers. (3) Data libraries for thermal and structural analysis are provided in the code system. (4) An input data generator is provided in the code system. (5) A graphic computer program is provided in the code system. In the paper, a brief illustration of the calculation method, input data and sample calculations are presented. (author)

  7. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, the Chi-square linearity test, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.
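
    A minimal sketch of the kind of parallel Monte Carlo uncertainty propagation such a framework performs, assuming a stand-in model function and hypothetical input distributions (this is not the PAPIRUS interface): input parameters are sampled, the model is evaluated across worker processes, and the output distribution is summarized.

      import numpy as np
      from multiprocessing import Pool

      def model(params):
          """Stand-in for an engineering simulation code (hypothetical response)."""
          a, b = params
          return a * np.exp(-b)

      def propagate(n_samples=10_000, seed=0, workers=4):
          rng = np.random.default_rng(seed)
          # Assumed input uncertainties: a ~ N(2, 0.1), b ~ N(0.5, 0.05)
          samples = np.column_stack([rng.normal(2.0, 0.1, n_samples),
                                     rng.normal(0.5, 0.05, n_samples)])
          with Pool(workers) as pool:                  # distribute model runs across processes
              outputs = np.array(pool.map(model, samples))
          return outputs.mean(), outputs.std()

      if __name__ == "__main__":
          mean, std = propagate()
          print(f"output mean = {mean:.4f}, std = {std:.4f}")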

  8. A computer program for structural analysis of fuel elements

    International Nuclear Information System (INIS)

    Hayashi, I.M.V.; Perrotta, J.A.

    1988-01-01

    The code ELCOM for the matrix analysis of tubular structures coupled by rigid spacers, typical of PWR fuel elements, is presented. The code ELCOM performs a static structural analysis, in which the displacements and internal forces are obtained for each structure at the joints with the spacers, and the natural frequencies and vibration modes of an equivalent integrated structure are also obtained. The ELCOM results are compared to a PWR fuel element structural analysis reported in a published paper. (author) [pt
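
    A minimal sketch of the static part of such a matrix analysis, with hypothetical stiffness and load values rather than ELCOM's formulation: the assembled stiffness equation K u = f is solved for the joint displacements, and the internal force in each member follows from the relative displacements.

      import numpy as np

      # Hypothetical 3-DOF spring chain standing in for tubes coupled by rigid spacers.
      k = np.array([2.0e5, 1.0e5, 1.0e5])          # member stiffnesses [N/m]
      K = np.array([[k[0] + k[1], -k[1],        0.0  ],
                    [-k[1],        k[1] + k[2], -k[2]],
                    [0.0,         -k[2],         k[2]]])
      f = np.array([0.0, 500.0, 250.0])            # external loads at the joints [N]

      u = np.linalg.solve(K, f)                    # displacements at the spacer joints [m]
      elongation = np.diff(np.concatenate(([0.0], u)))
      forces = k * elongation                      # internal force in each member [N]

      print("displacements [m]:", u)
      print("member forces [N]:", forces)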

  9. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and  considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  10. The Effect of Prior Experience with Computers, Statistical Self-Efficacy, and Computer Anxiety on Students' Achievement in an Introductory Statistics Course: A Partial Least Squares Path Analysis

    Science.gov (United States)

    Abd-El-Fattah, Sabry M.

    2005-01-01

    A Partial Least Squares Path Analysis technique was used to test the effect of students' prior experience with computers, statistical self-efficacy, and computer anxiety on their achievement in an introductory statistics course. Computer Anxiety Rating Scale and Current Statistics Self-Efficacy Scale were administered to a sample of 64 first-year…

  11. GIANT: a computer code for General Interactive ANalysis of Trajectories

    International Nuclear Information System (INIS)

    Jaeger, J.; Lee, M.; Servranckx, R.; Shoaee, H.

    1985-04-01

    Many model-driven diagnostic and correction procedures have been developed at SLAC for the on-line computer controlled operation of SPEAR, PEP, the LINAC, and the Electron Damping Ring. In order to facilitate future applications and enhancements, these procedures are being collected into a single program, GIANT. The program allows interactive diagnosis as well as performance optimization of any beam transport line or circular machine. The test systems for GIANT are those of the SLC project. The organization of this program and some of the recent applications of the procedures will be described in this paper

  12. Routing performance analysis and optimization within a massively parallel computer

    Science.gov (United States)

    Archer, Charles Jens; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen

    2013-04-16

    An apparatus, program product and method optimize the operation of a massively parallel computer system by, in part, receiving actual performance data concerning an application executed by the plurality of interconnected nodes, and analyzing the actual performance data to identify an actual performance pattern. A desired performance pattern may be determined for the application, and an algorithm may be selected from among a plurality of algorithms stored within a memory, the algorithm being configured to achieve the desired performance pattern based on the actual performance data.

  13. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  14. Computational Fluid Dynamic Analysis of the Tesla Valve

    OpenAIRE

    Eidesen, Hans-Kristian; Khawaja, Hassan Abbas

    2016-01-01

    Poster from The International Conference of Multiphysics, held in Zurich by The International Society of Multiphysics, 08.12.16 - 09.12.16. The Serbian-born inventor Nikola Tesla invented the Tesla valve (Tesla's Valvular Conduit) and patented it in 1919. The Tesla valve is unique in the sense that it has no moving parts, yet it can work as a one-way valve. Nikola Tesla invented this device without advanced mathematical models or the help of modern computing power. The original...

  15. A Trend Analysis of Computer Literacy Skills of Preservice Teachers During Six Academic Years.

    Science.gov (United States)

    Sheffield, Caryl J.

    1998-01-01

    Analyzes trends in computer-literacy skills of preservice teachers during the period 1991/92 to 1996/97. A significant linear pattern of increasing means was found in word processing, spreadsheet, hardware, operating system software, and the mouse. Analysis provides a perspective on how increasing access to computers in high school translates into…
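
    A minimal sketch of the kind of linear trend test implied here, using hypothetical yearly means rather than the study's data: the skill means are regressed on the academic year and the slope and its significance are examined.

      from scipy.stats import linregress

      # Hypothetical yearly means (1991/92 through 1996/97) for one skill area.
      years = [1, 2, 3, 4, 5, 6]
      means = [2.1, 2.4, 2.6, 2.9, 3.2, 3.5]

      result = linregress(years, means)            # least-squares linear trend
      print(f"slope = {result.slope:.3f} per year, p-value = {result.pvalue:.4f}")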

  16. Using Computation Curriculum-Based Measurement Probes for Error Pattern Analysis

    Science.gov (United States)

    Dennis, Minyi Shih; Calhoon, Mary Beth; Olson, Christopher L.; Williams, Cara

    2014-01-01

    This article describes how "curriculum-based measurement--computation" (CBM-C) mathematics probes can be used in combination with "error pattern analysis" (EPA) to pinpoint difficulties in basic computation skills for students who struggle with learning mathematics. Both assessment procedures provide ongoing assessment data…

  17. Fluid Centrality: A Social Network Analysis of Social-Technical Relations in Computer-Mediated Communication

    Science.gov (United States)

    Enriquez, Judith Guevarra

    2010-01-01

    In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…
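
    A small sketch of the centrality measures discussed, computed on an invented CMC interaction graph with the networkx library (participants and ties are hypothetical): degree centrality counts a participant's direct ties, while betweenness captures how often a participant bridges the shortest paths between others.

      import networkx as nx

      # Toy CMC interaction graph (hypothetical participants and message ties).
      G = nx.Graph()
      G.add_edges_from([
          ("ana", "ben"), ("ana", "cho"), ("ana", "dee"),
          ("ben", "cho"), ("dee", "eli"), ("eli", "fay"),
      ])

      degree = nx.degree_centrality(G)             # share of possible ties each node holds
      betweenness = nx.betweenness_centrality(G)   # how often a node lies on shortest paths

      for node in G:
          print(f"{node}: degree={degree[node]:.2f}, betweenness={betweenness[node]:.2f}")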

  18. A Meta-Analysis of Effectiveness Studies on Computer Technology-Supported Language Learning

    Science.gov (United States)

    Grgurovic, Maja; Chapelle, Carol A.; Shelley, Mack C.

    2013-01-01

    With the aim of summarizing years of research comparing pedagogies for second/foreign language teaching supported with computer technology and pedagogy not-supported by computer technology, a meta-analysis was conducted of empirical research investigating language outcomes. Thirty-seven studies yielding 52 effect sizes were included, following a…
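
    A minimal sketch of how effect sizes such as those collected here are pooled, with hypothetical per-study values rather than the 52 effect sizes from the meta-analysis: each study is weighted by the inverse of its variance under a fixed-effect model.

      import numpy as np

      # Hypothetical per-study effect sizes (e.g., Hedges' g) and their variances.
      effects = np.array([0.35, 0.10, 0.52, 0.27, 0.44])
      variances = np.array([0.02, 0.05, 0.03, 0.04, 0.02])

      weights = 1.0 / variances                            # inverse-variance weights
      pooled = np.sum(weights * effects) / np.sum(weights) # fixed-effect pooled estimate
      se = np.sqrt(1.0 / np.sum(weights))                  # standard error of the pooled estimate

      print(f"pooled effect = {pooled:.3f}, "
            f"95% CI = [{pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f}]")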

  19. Analysis of Introducing Active Learning Methodologies in a Basic Computer Architecture Course

    Science.gov (United States)

    Arbelaitz, Olatz; José I. Martín; Muguerza, Javier

    2015-01-01

    This paper presents an analysis of introducing active methodologies in the Computer Architecture course taught in the second year of the Computer Engineering Bachelor's degree program at the University of the Basque Country (UPV/EHU), Spain. The paper reports the experience from three academic years, 2011-2012, 2012-2013, and 2013-2014, in which…

  20. Stability Analysis of Learning Algorithms for Ontology Similarity Computation

    Directory of Open Access Journals (Sweden)

    Wei Gao

    2013-01-01

    Full Text Available Ontology, as a useful tool, is widely applied in many areas such as social science, computer science, and medical science. Ontology concept similarity calculation is the key part of the algorithms in these applications. A recent approach is to make use of the similarity between vertices of ontology graphs. Instead of pairwise computations, it is based on a function that maps the vertex set of an ontology graph to real numbers. To obtain such a function, the ranking learning problem plays an important and essential role, in particular the k-partite ranking algorithm, which is suitable for solving some ontology problems. A ranking function is usually used to map the vertices of an ontology graph to numbers and to assign ranks to the vertices through their scores. Such a function can be learned by studying a training sample, which contains a subset of vertices of the ontology graph. A good ranking function means small ranking mistakes and good stability. For ranking algorithms that are in a well-stable state, we study generalization bounds via some concepts of algorithmic stability. We also find that kernel-based ranking algorithms stated as regularization schemes in reproducing kernel Hilbert spaces satisfy stability conditions and have great generalization abilities.
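
    A rough sketch of learning a ranking function on graph vertices, under assumed vertex features and training preferences (not the kernel-based k-partite algorithm analysed in the paper): a linear score is fitted to ordered vertex pairs with a pairwise logistic loss, and the resulting scores induce both a ranking and a simple score-based similarity between vertices.

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.normal(size=(6, 4))                  # hypothetical feature vectors of 6 vertices
      pairs = [(0, 3), (1, 4), (2, 5), (0, 4)]     # training pairs: first vertex ranks above second

      w = np.zeros(4)
      lr = 0.1
      for _ in range(200):                         # gradient descent on the pairwise logistic loss
          for i, j in pairs:
              diff = X[i] - X[j]
              p = 1.0 / (1.0 + np.exp(-(w @ diff)))
              w += lr * (1.0 - p) * diff

      scores = X @ w                               # ranking scores for all vertices
      similarity = -np.abs(scores[:, None] - scores[None, :])  # closer scores = more similar

      print("vertex ranks (best first):", np.argsort(-scores))
      print("score-based similarity matrix:\n", np.round(similarity, 2))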