WorldWideScience

Sample records for analysis cai computer

  1. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    Energy Technology Data Exchange (ETDEWEB)

    Handler, B.H. (Oak Ridge K-25 Site, TN (USA)); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. (Oak Ridge Associated Universities, Inc., TN (USA)); Hunnum, W.H. (North Carolina Univ., Chapel Hill, NC (USA)); Smith, D.L. (Memphis State Univ., TN (USA))

    1990-07-01

    Data Systems Engineering Organization (DSEO) personnel developed a prototype computer-aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model comprises related papers encompassing research on computer-aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevance to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  2. Computers for Your Classroom: CAI and CMI.

    Science.gov (United States)

    Thomas, David B.; Bozeman, William C.

    1981-01-01

    The availability of compact, low-cost computer systems provides a means of assisting classroom teachers in the performance of their duties. Computer-assisted instruction (CAI) and computer-managed instruction (CMI) are two applications of computer technology with which school administrators should become familiar. CAI is a teaching medium in which…

  3. Generative Computer Assisted Instruction: An Application of Artificial Intelligence to CAI.

    Science.gov (United States)

    Koffman, Elliot B.

    Frame-oriented computer-assisted instruction (CAI) systems dominate the field, but these mechanized programmed texts utilize the computational power of the computer to a minimal degree and are difficult to modify. Newer, generative CAI systems, which are supplied with a knowledge of subject matter, can generate their own problems and solutions, can…

  4. Perancangan Perangkat Lunak Media Pembelajaran Menggunakan Computer Assisted Instruction (CAI) untuk Pembelajaran Ilmu Tajwid Berbasis Web (Design of Learning-Media Software Using Computer Assisted Instruction (CAI) for Web-Based Tajwid Learning)

    Directory of Open Access Journals (Sweden)

    Fenny Purwani

    2016-03-01

    Strategic use of Computer Assisted Instruction (CAI) as a learning medium is needed to overcome the problems that arise in the learning process. Learning that is packaged well has a positive impact on advancing human potential. CAI, as a computer-based learning medium, was built to complement and support learning methods that until now have relied only on lectures, information discussions, and demonstrations. The purpose of this study was to design and build an interactive, Web-based CAI learning medium. The result is a CAI design based on a tutorial model, complete with practice questions on the material provided. This CAI design was then used as a medium for learning the science of Tajwid by computer.

  5. Rancangan Perangkat Lunak Computer Assisted Instruction (CAI) untuk Ilmu Tajwid Berbasis Web (Design of Computer Assisted Instruction (CAI) Software for Web-Based Tajwid)

    Directory of Open Access Journals (Sweden)

    Fenny Purwani

    2015-08-01

    The development of information technology and science undoubtedly calls for teaching-learning concepts and mechanisms based on information technology. This development requires qualified human resources and flexible updating of material in step with the development of technology and science, combining religion-based education with technology (IMTAK and IPTEK). Internet technology can be used as a teaching tool, known as Computer Assisted Instruction (CAI). CAI software can serve as a medium or tool for learning Tajwid and can help people learn Tajwid more easily.

  6. Numerical simulation and validation of SI-CAI hybrid combustion in a CAI/HCCI gasoline engine

    Science.gov (United States)

    Wang, Xinyan; Xie, Hui; Xie, Liyan; Zhang, Lianfang; Li, Le; Chen, Tao; Zhao, Hua

    2013-02-01

    SI-CAI hybrid combustion, also known as spark-assisted compression ignition (SACI), is a promising concept to extend the operating range of CAI (Controlled Auto-Ignition) and to achieve a smooth transition between spark ignition (SI) and CAI in the gasoline engine. In this study, an SI-CAI hybrid combustion model (HCM) has been constructed on the basis of the 3-Zones Extended Coherent Flame Model (ECFM3Z). An ignition model is included to initiate the ECFM3Z calculation and induce flame propagation. In order to precisely depict the subsequent auto-ignition of the unburned fuel and air mixture independently of the flame propagation, the tabulated chemistry concept is adopted to describe the auto-ignition chemistry. A methodology for extracting tabulated parameters from chemical kinetics calculations is developed so that both cool flame reactions and the main auto-ignition combustion can be captured under a wide range of thermodynamic conditions. The HCM is then applied in three-dimensional computational fluid dynamics (3-D CFD) engine simulations. The simulation results are compared with experimental data obtained from a single-cylinder VVA engine. Detailed analysis of the simulations demonstrates that the SI-CAI hybrid combustion process is characterised by early flame propagation followed by multi-site auto-ignition around the main flame front, which is consistent with the optical results reported by other researchers. In addition, a systematic study of the in-cylinder conditions reveals the mechanism by which the early flame propagation influences the subsequent auto-ignition.
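    The tabulated-chemistry idea above (precompute auto-ignition behaviour from detailed kinetics once, then interpolate during CFD) can be illustrated with a minimal sketch. The table axes, the toy ignition-delay correlation and all names below are hypothetical stand-ins, not the authors' actual tabulation.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Hypothetical table axes: temperature (K), pressure (bar), equivalence ratio.
        T = np.linspace(650.0, 1100.0, 10)
        p = np.linspace(10.0, 60.0, 6)
        phi = np.linspace(0.3, 1.2, 5)

        # Stand-in for detailed kinetics: an Arrhenius-like toy correlation for
        # ignition delay (ms). A real table would be filled by 0-D reactor runs.
        TT, PP, FF = np.meshgrid(T, p, phi, indexing="ij")
        tau = 1e-4 * PP**-0.7 * FF**-0.5 * np.exp(15000.0 / TT)

        # Build the interpolator once; query it cheaply from every CFD cell.
        tau_table = RegularGridInterpolator((T, p, phi), tau)
        print(tau_table([[900.0, 35.0, 0.8]]))  # interpolated ignition delay, ms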

  7. The Instructional Use of CAI in the Education of the Mentally Retarded.

    Science.gov (United States)

    Winters, John J., Jr.; And Others

    Computer assisted instruction (CAI) studies with the mentally retarded in the United States and Canada reveal that the retarded benefit from CAI in academic and social skills. Their learning is enhanced to the same extent as that of the nonretarded. CAI can be cost-effective, especially with the reduced costs of mini and micro-computers; however,…

  8. CAI: Overcoming Attitude Barriers.

    Science.gov (United States)

    Netusil, Anton J.; Kockler, Lois H.

    During each of two school quarters, approximately 60 college students enrolled in a mathematics course were randomly assigned to an experimental group or a control group. The control group received instruction by the lecture method only; the experimental group received the same instruction, except that six computer-assisted instruction (CAI) units…

  9. An Object-Oriented Architecture for a Web-Based CAI System.

    Science.gov (United States)

    Nakabayashi, Kiyoshi; Hoshide, Takahide; Seshimo, Hitoshi; Fukuhara, Yoshimi

    This paper describes the design and implementation of an object-oriented World Wide Web-based CAI (Computer-Assisted Instruction) system. The goal of the design is to provide a flexible CAI/ITS (Intelligent Tutoring System) framework with full extendibility and reusability, as well as to exploit Web-based software technologies such as JAVA, ASP (a…

  10. CAI多媒體教學軟體之開發模式 Using an Instructional Design Model for Developing a Multimedia CAI Courseware

    Directory of Open Access Journals (Sweden)

    Hsin-Yih Shyu

    1995-09-01

    This article outlines a systematic instructional design model for developing multimedia computer-aided instruction (CAI) courseware. The model presents roles and tasks as the two dimensions necessary in a CAI production team. Four major components (Analysis, Design, Development, and Revision/Evaluation), comprising 25 steps in total, are provided. Eight roles, each with its required skills, are identified. The model can serve as a framework for developing multimedia CAI courseware for educators, instructional designers and CAI industry developers.

  11. The Vibrio cholerae quorum-sensing autoinducer CAI-1: analysis of the biosynthetic enzyme CqsA

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, R.; Bolitho, M.; Higgins, D.; Lu, W.; Ng, W.; Jeffrey, P.; Rabinowitz, J.; Semmelhack, M.; Hughson, F.; Bassler, B.

    2009-01-01

    Vibrio cholerae, the bacterium that causes the disease cholera, controls virulence factor production and biofilm development in response to two extracellular quorum-sensing molecules, called autoinducers. The strongest autoinducer, called CAI-1 (for cholera autoinducer-1), was previously identified as (S)-3-hydroxytridecan-4-one. Biosynthesis of CAI-1 requires the enzyme CqsA. Here, we determine the CqsA reaction mechanism, identify the CqsA substrates as (S)-2-aminobutyrate and decanoyl coenzyme A, and demonstrate that the product of the reaction is 3-aminotridecan-4-one, dubbed amino-CAI-1. CqsA produces amino-CAI-1 by a pyridoxal phosphate-dependent acyl-CoA transferase reaction. Amino-CAI-1 is converted to CAI-1 in a subsequent step via a CqsA-independent mechanism. Consistent with this, we find cells release ≥100 times more CAI-1 than amino-CAI-1. Nonetheless, V. cholerae responds to amino-CAI-1 as well as CAI-1, whereas other CAI-1 variants do not elicit a quorum-sensing response. Thus, both CAI-1 and amino-CAI-1 have potential as lead molecules in the development of an anticholera treatment.

  12. A Design of Computer Aided Instructions (CAI) for Undirected Graphs in the Discrete Math Tutorial (DMT). Part 1.

    Science.gov (United States)

    1990-06-01

    The objective of this thesis research is to create a tutorial for teaching aspects of undirected graphs in discrete math. It is one of the submodules...of the Discrete Math Tutorial (DMT), which is a Computer Aided Instructional (CAI) tool for teaching discrete math to the Naval Academy and the

  13. A Design of Computer Aided Instructions (CAI) for Undirected Graphs in the Discrete Math Tutorial (DMT). Part 2

    Science.gov (United States)

    1990-06-01

    The objective of this thesis research is to create a tutorial for teaching aspects of undirected graphs in discrete math. It is one of the submodules...of the Discrete Math Tutorial (DMT), which is a Computer Aided Instructional (CAI) tool for teaching discrete math to the Naval Academy and the

  14. E-CAI: a novel server to estimate an expected value of Codon Adaptation Index (eCAI)

    Directory of Open Access Journals (Sweden)

    Garcia-Vallvé Santiago

    2008-01-01

    Background: The Codon Adaptation Index (CAI) is a measure of the synonymous codon usage bias for a DNA or RNA sequence. It quantifies the similarity between the synonymous codon usage of a gene and the synonymous codon frequency of a reference set. Extreme values in the nucleotide or in the amino acid composition have a large impact on differential preference for synonymous codons. It is therefore essential to define the limits of the expected value of CAI on the basis of sequence composition in order to properly interpret the CAI and provide statistical support to CAI analyses. Though several freely available programs calculate the CAI for a given DNA sequence, none of them corrects for compositional biases or provides confidence intervals for CAI values. Results: The E-CAI server, available at http://genomes.urv.es/CAIcal/E-CAI, is a web application that calculates an expected value of CAI for a set of query sequences by generating random sequences with G+C and amino acid content similar to those of the input. An executable file, a tutorial, a Frequently Asked Questions (FAQ) section and several examples are also available. To exemplify the use of the E-CAI server, we have analysed the codon adaptation of human mitochondrial genes that encode a subunit of the mitochondrial respiratory chain (excluding those genes that lack a prokaryotic orthologue) and are encoded in the nuclear genome. It is assumed that these genes were transferred from the proto-mitochondrial to the nuclear genome and that their codon usage was then ameliorated. Conclusion: The E-CAI server provides a direct threshold value for discerning whether the differences in CAI are statistically significant or whether they are merely artifacts that arise from internal biases in the G+C composition and/or amino acid composition of the query sequences.
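    As a concrete illustration of what the CAI measures, here is a minimal sketch of the classic calculation (Sharp and Li, 1987): relative adaptiveness w is derived from codon counts in a reference set of highly expressed genes, and the CAI of a query gene is the geometric mean of w over its codons. The two-family codon table and the pseudocount are illustrative simplifications, not the E-CAI implementation.

        import math

        SYNONYMS = {  # two codon families for brevity; a real table covers all of them
            "Phe": ["TTT", "TTC"],
            "Lys": ["AAA", "AAG"],
        }

        def relative_adaptiveness(ref_counts):
            """codon -> w, from codon counts in a reference gene set."""
            w = {}
            for family in SYNONYMS.values():
                best = max(ref_counts.get(c, 0) for c in family) or 1
                for c in family:
                    w[c] = max(ref_counts.get(c, 0), 0.5) / best  # 0.5 = pseudocount
            return w

        def cai(seq, w):
            codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
            logs = [math.log(w[c]) for c in codons if c in w]  # skip unlisted codons
            return math.exp(sum(logs) / len(logs))  # geometric mean

        w = relative_adaptiveness({"TTT": 10, "TTC": 90, "AAA": 80, "AAG": 20})
        print(round(cai("TTTAAATTCAAA", w), 3))  # 0.577 for this 4-codon toy gene

    The expected CAI computed by E-CAI then amounts to re-running cai over many random sequences matched to the query's G+C and amino acid composition.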

  15. Hypertext and three-dimensional computer graphics in an all digital PC-based CAI workstation.

    Science.gov (United States)

    Schwarz, D. L.; Wind, G. G.

    1991-01-01

    In the past several years there has been an enormous increase in the number of computer-assisted instructional (CAI) applications. Many medical educators and physicians have recognized the power and utility of hypertext. Some developers have incorporated simple diagrams, scanned monochrome graphics or still frame photographs from a laser disc or CD-ROM into their hypertext applications. These technologies have greatly increased the role of the microcomputer in education and training. There still remain numerous applications for these tools which are yet to be explored. One of these exciting areas involves the use of three-dimensional computer graphics. An all digital platform increases application portability. PMID:1807767

  16. The Relevance of AI Research to CAI.

    Science.gov (United States)

    Kearsley, Greg P.

    This article provides a tutorial introduction to Artificial Intelligence (AI) research for those involved in Computer Assisted Instruction (CAI). The general theme is that much of the current work in AI, particularly in the areas of natural language understanding systems, rule induction, programming languages, and socratic systems, has important…

  17. Exploring Chondrule and CAI Rims Using Micro- and Nano-Scale Petrological and Compositional Analysis

    Science.gov (United States)

    Cartwright, J. A.; Perez-Huerta, A.; Leitner, J.; Vollmer, C.

    2017-12-01

    As the major components within chondrites, chondrules (mm-sized droplets of quenched silicate melt) and calcium-aluminum-rich inclusions (CAIs, refractory) represent the most abundant and the earliest materials that solidified from the solar nebula. However, the exact formation mechanisms of these clasts, and whether these processes are related, remain unconstrained, despite extensive petrological and compositional study. By taking advantage of recent advances in nano-scale tomographical techniques, we have undertaken a combined micro- and nano-scale study of CAI and chondrule rim morphologies to investigate their formation mechanisms. The target lithologies for this research are Wark-Lovering rims (WLR) and fine-grained rims (FGR), around CAIs and chondrules respectively, present within many chondrites. The FGRs, which are up to 100 µm thick, are of particular interest as recent studies have identified presolar grains within them. These grains predate the formation of our Solar System, suggesting FGR formation under nebular conditions. By contrast, WLRs are 10-20 µm thick, made of different compositional layers, and likely formed by flash-heating shortly after CAI formation, thus recording nebular conditions. A detailed multi-scale study of these respective rims will enable us to better understand their formation histories and determine the potential for commonality between these two phases, despite reports of an observed formation age difference of up to 2-3 Myr. We are using a combination of complementary techniques on our selected target areas: (1) micro-scale characterization using standard microscopic and compositional techniques (SEM-EBSD, EMPA); (2) nano-scale characterization of structures using transmission electron microscopy (TEM) and elemental, isotopic and tomographic analysis with NanoSIMS and atom probe tomography (APT). Preliminary nano-scale APT analysis of FGR morphologies within the Allende carbonaceous chondrite has successfully discerned

  18. A risk management approach to CAIS development

    Science.gov (United States)

    Hart, Hal; Kerner, Judy; Alden, Tony; Belz, Frank; Tadman, Frank

    1986-01-01

    The proposed DoD standard Common APSE Interface Set (CAIS) was developed as a framework set of interfaces that will support the transportability and interoperability of tools in the support environments of the future. While the current CAIS version is a promising start toward fulfilling those goals and current prototypes provide adequate testbeds for investigations in support of completing specifications for a full CAIS, there are many reasons why the proposed CAIS might fail to become a usable product and the foundation of next-generation (1990s) project support environments such as NASA's Space Station software support environment. The most critical threats to the viability and acceptance of the CAIS include performance issues (especially in piggybacked implementations), transportability, and security requirements. To make the situation worse, the solutions to some of these threats appear to be in conflict with the solutions to others.

  19. Coordinated Oxygen Isotopic and Petrologic Studies of CAIs Record Varying Composition of Protosolar

    Science.gov (United States)

    Simon, Justin I.; Matzel, J. E. P.; Simon, S. B.; Weber, P. K.; Grossman, L.; Ross, D. K.; Hutcheon, I. D.

    2012-01-01

    Ca-, Al-rich inclusions (CAIs) record the O-isotope composition of Solar nebular gas from which they grew [1]. High spatial resolution O-isotope measurements afforded by ion microprobe analysis across the rims and margins of CAIs reveal systematic variations in Δ17O and suggest formation from a diversity of nebular environments [2-4]. This heterogeneity has been explained by isotopic mixing between the 16O-rich Solar reservoir [6] and a second 16O-poor reservoir (probably nebular gas) with a "planetary-like" isotopic composition [e.g., 1, 6-7], but the mechanism and location(s) where these events occur within the protoplanetary disk remain uncertain. The orientation of the large and systematic variations in Δ17O reported by [3] for a compact Type A CAI from the Efremovka reduced CV3 chondrite differs dramatically from that reported by [4] for a similar CAI, A37 from the Allende oxidized CV3 chondrite. Both studies conclude that CAIs were exposed to distinct nebular O-isotope reservoirs, implying the transfer of CAIs among different settings within the protoplanetary disk [4]. To further test this hypothesis and the extent of intra-CAI O-isotopic variation, a pristine compact Type A CAI, Ef-1 from Efremovka, and a Type B2 CAI, TS4 from Allende, were studied. Our new results are equally intriguing because, collectively, the O-isotopic zoning patterns in the CAIs indicate a progressive and cyclic record. The results imply that CAIs were commonly exposed to multiple environments of distinct gas during their formation. Numerical models help constrain the conditions and duration of these events.
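    For reference, Δ17O in the abstract above measures the deviation of an oxygen three-isotope composition from the terrestrial fractionation line; the standard linearized definition (a well-known convention, not restated in the abstract) is Δ17O = δ17O − 0.52 × δ18O, so 16O-rich compositions such as CAIs plot at negative Δ17O.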

  20. CAI System with Multi-Media Text Through Web Browser for NC Lathe Programming

    Science.gov (United States)

    Mizugaki, Yoshio; Kikkawa, Koichi; Mizui, Masahiko; Kamijo, Keisuke

    A new Computer Aided Instruction (CAI) system for NC lathe programming has been developed using multimedia texts, including movies, animations, pictures, sound and text, delivered through a Web browser. Although many CAI systems previously developed for NC programming consist of text-based instructions, beginners find it difficult to learn NC programming with them. In the developed CAI system, multimedia texts are adopted to aid users' understanding, and the system is available through a Web browser anytime and anywhere. The error log is also automatically recorded for future reference. According to the NC program coded by a user, the movement of the NC lathe is animated and shown on the monitor screen in front of the user. If the movement causes a collision between a cutting tool and the lathe, a sound and a caution remark are generated. If the user makes repeated mistakes at a certain stage in learning NC, the corresponding suggestion is shown in the form of movies, animations, and so forth. By using the multimedia texts, users' attention is kept concentrated during a training course. In this paper, the configuration of the CAI system is explained, along with the actual procedures by which users learn NC programming. Some beginners tested this CAI system, and their results are illustrated and discussed from the viewpoint of the efficiency and usefulness of the system. A brief conclusion is also given.

  1. Development of an intelligent CAI system for a distributed processing environment

    International Nuclear Information System (INIS)

    Fujii, M.; Sasaki, K.; Ohi, T.; Itoh, T.

    1993-01-01

    In order to operate a nuclear power plant optimally in both normal and abnormal situations, the operators are trained using an operator training simulator in addition to classroom instruction. Individual instruction using a CAI (Computer-Assisted Instruction) system has become popular as a method of learning plant information, such as plant dynamics, operational procedures, plant systems, plant facilities, etc. An outline is given of a proposed network-based intelligent CAI system (ICAI) incorporating multimedia PWR plant dynamics simulation, teaching aids and educational record management, using the following environment: existing standard workstations and graphic workstations with a live video processing function, the TCP/IP protocol of Unix over Ethernet, and the X Window System.

  2. 電腦輔助教學與個別教學結合: 電腦輔助教學課堂應用初探 Computer-Assisted Instruction Under the Management of Individualized Instruction: A Classroom Management Approach of CAI

    Directory of Open Access Journals (Sweden)

    Sunny S. J. Lin

    1988-03-01

    This article first reviews the development of Computer-Assisted Instruction (CAI) in Taiwan. The study describes the training of teachers from different levels of schools to design CAI courseware, and the planning of a CAI courseware bank containing 2,000 supplementary courseware items. A classroom application system for CAI should be carefully established to prevent the abuse of CAI courseware as a complete instructional plan. The study also argues that steering CAI in elementary and secondary education could rely on mastery learning as the instructional plan; in this case, CAI must limit its role to formative testing and remedial material only. In higher education, Keller's Personalized System of Instruction could be an effective classroom management system, with CAI offering study guides and formative tests only. Using these two instructional systems may enhance students' achievement and speed up the learning rate at the same time. Combining individualized instruction and CAI will be one of the most workable approaches in the current classroom. The author sets up an experiment to verify their effectiveness and efficiency in the near future.

  3. Developing the Coach Analysis and Intervention System (CAIS): establishing validity and reliability of a computerised systematic observation instrument.

    Science.gov (United States)

    Cushion, Christopher; Harvey, Stephen; Muir, Bob; Nelson, Lee

    2012-01-01

    We outline the evolution of a computerised systematic observation tool and describe the process for establishing the validity and reliability of this new instrument. The Coach Analysis and Interventions System (CAIS) has 23 primary behaviours related to physical behaviour, feedback/reinforcement, instruction, verbal/non-verbal, questioning and management. The instrument also analyses secondary coach behaviour related to performance states, recipient, timing, content and questioning/silence. The CAIS is a multi-dimensional and multi-level mechanism able to provide detailed and contextualised data about specific coaching behaviours occurring in complex and nuanced coaching interventions and environments that can be applied to both practice sessions and competition.

  4. INAA of CAIs from the Maralinga CK4 chondrite: Effects of parent body thermal metamorphism

    Science.gov (United States)

    Lindstrom, D. J.; Keller, L. P.; Martinez, R. R.

    1993-01-01

    Maralinga is an anomalous CK4 carbonaceous chondrite which contains numerous Ca-, Al-rich inclusions (CAIs), unlike the other members of the CK group. These CAIs are characterized by abundant green hercynitic spinel intergrown with plagioclase and high-Ca clinopyroxene, and a total lack of melilite. Instrumental Neutron Activation Analysis (INAA) was used to further characterize the meteorite, with special focus on the CAIs. High sensitivity INAA was done on eight sample disks about 100-150 microns in diameter obtained from a normal 30 micron thin section with a diamond microcoring device. The CAIs are enriched by 60-70× bulk meteorite values in Zn, suggesting that the substantial exchange of Fe for Mg that made the spinel in the CAIs hercynitic also allowed efficient scavenging of Zn from the rest of the meteorite during parent body thermal metamorphism. Less mobile elements appear to have maintained their initial heterogeneity.

  5. CAIs in Semarkona (LL3.0)

    Science.gov (United States)

    Mishra, R. K.; Simon, J. I.; Ross, D. K.; Marhas, K. K.

    2016-01-01

    Calcium-aluminum-rich inclusions (CAIs) are the first-forming solids of the Solar System. Their observed abundance, mean size, and mineralogy vary quite significantly between different groups of chondrites. These differences may reflect the dynamics and distinct cosmochemical conditions present in the region(s) of the protoplanetary disk from which each type likely accreted. Only about 11 such objects have been found in L and LL types, while another 57 have been found in H-type ordinary chondrites, compared to thousands in carbonaceous chondrites. At issue is whether the rare CAIs contained in ordinary chondrites truly reflect a population distinct from the inclusions commonly found in other chondrite types. Semarkona (LL3.00) (fall, 691 g) is the most pristine chondrite available in our meteorite collections. Here we report the petrography and mineralogy of 3 CAIs from Semarkona.

  6. Particulated articular cartilage: CAIS and DeNovo NT.

    Science.gov (United States)

    Farr, Jack; Cole, Brian J; Sherman, Seth; Karas, Vasili

    2012-03-01

    Cartilage Autograft Implantation System (CAIS; DePuy/Mitek, Raynham, MA) and DeNovo Natural Tissue (NT; ISTO, St. Louis, MO) are novel treatment options for focal articular cartilage defects in the knee. These methods involve the implantation of particulated articular cartilage from either autograft or juvenile allograft donor, respectively. In the laboratory and in animal models, both CAIS and DeNovo NT have demonstrated the ability of the transplanted cartilage cells to "escape" from the extracellular matrix, migrate, multiply, and form a new hyaline-like cartilage tissue matrix that integrates with the surrounding host tissue. In clinical practice, the technique for both CAIS and DeNovo NT is straightforward, requiring only a single surgery to effect cartilage repair. Clinical experience is limited, with short-term studies demonstrating both procedures to be safe, feasible, and effective, with improvements in subjective patient scores, and with magnetic resonance imaging evidence of good defect fill. While these treatment options appear promising, prospective randomized controlled studies are necessary to refine the indications and contraindications for both CAIS and DeNovo NT.

  7. CAI and training system for the emergency operation procedure in the advanced thermal reactor, FUGEN

    International Nuclear Information System (INIS)

    Kozaki, T.; Imanaga, K.; Nakamura, S.; Maeda, K.; Sakurai, N.; Miyamoto, M.

    2003-01-01

    In the Advanced Thermal Reactor (ATR) of the JNC, 'FUGEN', a symptom-based Emergency Operating Procedure (EOP) was introduced in order to operate Fugen more safely, and it became necessary for the plant operators to master the EOP. However, it took a lot of time for the instructor to teach the EOP to operators and to train them. Thus, we have developed a Computer Aided Instruction (CAI) and training system for the EOP, with which the operators can learn the EOP and be trained. This system has two major functions, i.e., CAI and training. In the CAI function, there are three learning courses, namely, the EOP procedure; simulation with guidance and Q&A; and free simulation. In the training function, all of the necessary control instruments (indicators, switches, annunciators and so forth) and physics models for EOP training are simulated so that the trainees can be trained on all of the EOPs. In addition, 50 kinds of malfunction models are installed in order to perform appropriate accident scenarios for the EOP. The training covers the range from AOOs (Anticipated Operational Occurrences) to beyond-DBAs (Design Basis Accidents). This system is built on three personal computers connected by a network; one is intended for the instructor and the other two for the trainees. The EOP is composed of eight guidelines, such as 'Reactor Control' and 'Depressurization and Cooling', and operation screens corresponding to the guidelines are provided. According to a trial, we estimate that the efficiency of learning and training would be improved by about 30% for the trainee and about 75% for the instructor relative to actual learning and training. (author)

  8. The Impact of Different Support Vectors on GOSAT-2 CAI-2 L2 Cloud Discrimination

    Directory of Open Access Journals (Sweden)

    Yu Oishi

    2017-11-01

    Greenhouse gases Observing SATellite-2 (GOSAT-2) will be launched in fiscal year 2018. GOSAT-2 will be equipped with two sensors: the Thermal and Near-infrared Sensor for Carbon Observation-Fourier Transform Spectrometer 2 (TANSO-FTS-2) and the TANSO-Cloud and Aerosol Imager 2 (TANSO-CAI-2). CAI-2 is a push-broom imaging sensor that has forward- and backward-looking bands to observe the optical properties of aerosols and clouds and to monitor the status of urban air pollution and transboundary air pollution over oceans, such as PM2.5 (particles less than 2.5 micrometers in diameter). CAI-2 has important applications for cloud discrimination in each viewing direction. The Cloud and Aerosol Unbiased Decision Intellectual Algorithm (CLAUDIA1), which applies sequential threshold tests to features, is used for GOSAT CAI L2 cloud-flag processing. If CLAUDIA1 is used with CAI-2, it is necessary to optimize the thresholds in accordance with CAI-2. However, CLAUDIA3, which uses support vector machines (SVM), a supervised pattern-recognition method, was developed, and we therefore applied CLAUDIA3 to GOSAT-2 CAI-2 L2 cloud-discrimination processing. CLAUDIA3 can automatically find the optimized boundary between clear and cloudy areas. Improvements in CLAUDIA3 using CAI (CLAUDIA3-CAI) continue to be made. In this study, we examined the impact of various support vectors (SV) on GOSAT-2 CAI-2 L2 cloud discrimination by analyzing (1) the impact of the choice of different time periods for the training data and (2) the impact of different generation procedures for SV on the cloud-discrimination efficiency. To generate SV for CLAUDIA3-CAI from MODIS data, there are two stages at which features corresponding to CAI bands can be extracted. One procedure is equivalent to generating SV using CAI data. The other generates SV for MODIS cloud discrimination first, and then extracts the decision function, thresholds, and SV corresponding to CAI bands. Our results indicated the following
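    A minimal sketch of the SVM-based clear/cloudy discrimination idea, using scikit-learn. The band features, class means and training labels below are synthetic placeholders, not the actual CLAUDIA3 feature set.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # Synthetic training pixels: reflectance-like features per pixel,
        # loosely mimicking visible/near-IR bands (stand-ins for CAI-2 bands).
        n = 500
        clear = rng.normal(loc=[0.08, 0.10, 0.12], scale=0.03, size=(n, 3))
        cloudy = rng.normal(loc=[0.45, 0.50, 0.55], scale=0.10, size=(n, 3))
        X = np.vstack([clear, cloudy])
        y = np.hstack([np.zeros(n), np.ones(n)])  # 0 = clear, 1 = cloudy

        # An RBF-kernel SVM learns the clear/cloud boundary; its support vectors
        # are the training pixels that define that boundary (the "SV" studied above).
        clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
        print("support vectors per class:", clf.n_support_)

        # Classify new pixels
        print(clf.predict([[0.05, 0.09, 0.11], [0.50, 0.55, 0.60]]))  # -> [0. 1.]

    Changing the training set (for example, drawing it from different time periods) changes which pixels become support vectors, which is exactly the sensitivity the study above investigates.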

  9. Oxygen isotope variations at the margin of a CAI records circulation within the solar nebula.

    Science.gov (United States)

    Simon, Justin I; Hutcheon, Ian D; Simon, Steven B; Matzel, Jennifer E P; Ramon, Erick C; Weber, Peter K; Grossman, Lawrence; DePaolo, Donald J

    2011-03-04

    Micrometer-scale analyses of a calcium-, aluminum-rich inclusion (CAI) and the characteristic mineral bands mantling the CAI reveal that the outer parts of this primitive object have a large range of oxygen isotope compositions. The variations are systematic; the relative abundance of (16)O first decreases toward the CAI margin, approaching a planetary-like isotopic composition, then shifts to extremely (16)O-rich compositions through the surrounding rim. The variability implies that CAIs probably formed from several oxygen reservoirs. The observations support early and short-lived fluctuations of the environment in which CAIs formed, either because of transport of the CAIs themselves to distinct regions of the solar nebula or because of varying gas composition near the proto-Sun.

  10. Computer based training for nuclear operations personnel: From concept to reality

    International Nuclear Information System (INIS)

    Widen, W.C.; Klemm, R.W.

    1986-01-01

    Computer Based Training (CBT) can be subdivided into two categories: Computer Aided Instruction (CAI), or the actual presentation of learning material; and Computer Managed Instruction (CMI), the tracking, recording, and documenting of instruction and student progress. Both CAI and CMI can be attractive to the student and to the training department. A brief overview of CAI and CMI benefits is given in this paper

  11. Effectiveness of Using Computer-Assisted Supplementary Instruction for Teaching the Mole Concept

    Science.gov (United States)

    Yalçinalp, Serpil; Geban, Ömer; Özkan, Ilker

    This study examined the effect of computer-assisted instruction (CAI), used as a problem-solving supplement to classroom instruction, on students' understanding of chemical formulas and the mole concept, their attitudes toward chemistry subjects, and their attitudes toward CAI. The objective was to assess the effectiveness of CAI relative to recitation hours when both teaching methods were used as a supplement to traditional chemistry instruction. We randomly selected two classes in a secondary school, and each teaching strategy was randomly assigned to one class. The experimental group received supplementary instruction delivered via CAI, while the control group received similar instruction through recitation hours. The data were analyzed using two-way analysis of variance and t-tests. Students who used CAI accompanied by lectures scored significantly higher than those who attended recitation hours, in terms of school subject achievement in chemistry and attitudes toward chemistry subjects. In addition, there was a significant improvement in the attitudes of students in the experimental group toward the use of computers in a chemistry course. There was no significant difference between the performance of females and males in each treatment group.
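    As a sketch of the kind of group comparison reported above (CAI class versus recitation class on an achievement score), a two-sample t-test looks like this; the scores below are fabricated for illustration, not the study's data.

        from scipy import stats

        # Hypothetical achievement scores for the two classes
        cai_group = [78, 85, 91, 74, 88, 82, 95, 79, 86, 90]
        recitation_group = [72, 80, 75, 68, 83, 77, 74, 81, 70, 76]

        t, p = stats.ttest_ind(cai_group, recitation_group)
        print(f"t = {t:.2f}, p = {p:.4f}")  # p < .05 -> significant group difference

    The study's two-way analysis of variance adds a second factor (e.g., treatment × gender); in Python that model could be fitted with statsmodels' anova_lm.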

  12. A Model Driven Question-Answering System for a CAI Environment. Final Report (July 1970 to May 1972).

    Science.gov (United States)

    Brown, John S.; And Others

    A question answering system which permits a computer-assisted instruction (CAI) student greater initiative in the variety of questions he can ask is described. A method is presented to represent the dynamic processes of a subject matter area by augmented finite state automata, which permits efficient inferencing about dynamic processes and…
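    The "augmented finite state automata" idea (representing a dynamic process as states plus event-driven transitions, so the system can answer a student's hypothetical questions by simulation) can be sketched as below. The circuit example, state names and events are invented for illustration; the report's actual representation is richer.

        # Toy automaton for a dynamic process: a simple electrical circuit.
        TRANSITIONS = {
            # (state, event) -> next state
            ("off", "close_switch"): "on",
            ("on", "open_switch"): "off",
            ("on", "overload"): "blown_fuse",
            ("blown_fuse", "replace_fuse"): "off",
        }

        def simulate(state, events):
            """Answer a 'what happens if...' question by running the model."""
            for e in events:
                state = TRANSITIONS.get((state, e), state)  # unknown event: no change
            return state

        # Student asks: "What if I close the switch and then overload the circuit?"
        print(simulate("off", ["close_switch", "overload"]))  # -> blown_fuse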

  13. A multielement isotopic study of refractory FUN and F CAIs: Mass-dependent and mass-independent isotope effects

    Science.gov (United States)

    Kööp, Levke; Nakashima, Daisuke; Heck, Philipp R.; Kita, Noriko T.; Tenner, Travis J.; Krot, Alexander N.; Nagashima, Kazuhide; Park, Changkun; Davis, Andrew M.

    2018-01-01

    Calcium-aluminum-rich inclusions (CAIs) are the oldest dated objects that formed inside the Solar System. Among these are rare, enigmatic objects with large mass-dependent fractionation effects (F CAIs), which sometimes also have large nucleosynthetic anomalies and a low initial abundance of the short-lived radionuclide 26Al (FUN CAIs). We have studied seven refractory hibonite-rich CAIs and one grossite-rich CAI from the Murchison (CM2) meteorite for their oxygen, calcium, and titanium isotopic compositions. The 26Al-26Mg system was also studied in seven of these CAIs. We found mass-dependent heavy isotope enrichment in all measured elements, but never simultaneously in the same CAI. The data are hard to reconcile with a single-stage melt evaporation origin and may require reintroduction or reequilibration of magnesium, oxygen and titanium after evaporation for some of the studied CAIs. The initial 26Al/27Al ratios inferred from model isochrons span a range from <1 × 10⁻⁶ to canonical (∼5 × 10⁻⁵). The CAIs show a mutual exclusivity relationship between inferred incorporation of live 26Al and the presence of resolvable anomalies in 48Ca and 50Ti. Furthermore, a relationship exists between 26Al incorporation and Δ17O in the hibonite-rich CAIs (i.e., 26Al-free CAIs have resolved variations in Δ17O, while CAIs with resolved 26Mg excesses have Δ17O values close to -23‰). Only the grossite-rich CAI has a relatively enhanced Δ17O value (∼-17‰) in spite of a near-canonical 26Al/27Al. We interpret these data as indicating that fractionated hibonite-rich CAIs formed over an extended time period and sampled multiple stages in the isotopic evolution of the solar nebula, including: (1) an 26Al-poor nebula with large positive and negative anomalies in 48Ca and 50Ti and variable Δ17O; (2) a stage of 26Al-admixture, during which anomalies in 48Ca and 50Ti had been largely diluted and a Δ17O value of ∼-23‰ had been achieved in the CAI formation region; and (3
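    For context, a "model isochron" ties the measured excess of radiogenic 26Mg to the Al/Mg ratio of a phase, assuming a common initial Mg-isotope composition: (26Mg/24Mg)_measured = (26Mg/24Mg)_initial + (26Al/27Al)₀ × (27Al/24Mg), so the inferred initial ratio (26Al/27Al)₀ is the slope of the 26Mg excess against 27Al/24Mg. (This is the standard Al-Mg systematics, stated here for readers; it is not spelled out in the abstract.)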

  14. Cognitive Assessment Interview (CAI): Validity as a co-primary measure of cognition across phases of schizophrenia.

    Science.gov (United States)

    Ventura, Joseph; Subotnik, Kenneth L; Ered, Arielle; Hellemann, Gerhard S; Nuechterlein, Keith H

    2016-04-01

    Progress has been made in developing interview-based measures for the assessment of cognitive functioning, such as the Cognitive Assessment Interview (CAI), as co-primary measures that complement objective neurocognitive assessments and measures of daily functioning. However, a few questions remain, including whether the relationships with objective cognitive measures and daily functioning are high enough to justify the CAI as a co-primary measure, and whether patient-only assessments are valid. Participants were first-episode schizophrenia patients (n=60) and demographically similar healthy controls (n=35), and chronic schizophrenia patients (n=38) and demographically similar healthy controls (n=19). Participants were assessed at baseline with an interview-based measure of cognitive functioning (CAI), a test of objective cognitive functioning, functional capacity, and role functioning, and the first-episode patients were assessed again 6 months later (n=28). CAI ratings were correlated with objective cognitive functioning, functional capacity, and functional outcomes in first-episode schizophrenia patients at magnitudes similar to those in chronic patients. Comparisons of first-episode and chronic patients with healthy controls indicated that the CAI sensitively detected deficits in schizophrenia. The relationships of CAI Patient-Only ratings with objective cognitive functioning, functional capacity, and daily functioning were comparable to those of CAI Rater scores that included informant information. These results confirm in an independent sample the relationship of CAI ratings with objectively measured cognition, functional capacity, and role functioning. Comparison of schizophrenia patients with healthy controls further validates the CAI as a co-primary measure of cognitive deficits. Also, CAI change scores were strongly related to objective cognitive change, indicating sensitivity to change.

  15. Computer-Assisted Instruction: Authoring Languages. ERIC Digest.

    Science.gov (United States)

    Reeves, Thomas C.

    One of the most perplexing tasks in producing computer-assisted instruction (CAI) is the authoring process. Authoring is generally defined as the process of turning the flowcharts, control algorithms, format sheets, and other documentation of a CAI program's design into computer code that will operationalize the simulation on the delivery system.…

  16. Research on the Use of Computer-Assisted Instruction.

    Science.gov (United States)

    Craft, C. O.

    1982-01-01

    Reviews recent research studies related to computer assisted instruction (CAI). The studies concerned program effectiveness, teaching of psychomotor skills, tool availability, and factors affecting the adoption of CAI. (CT)

  17. NWA10758: A New CV3 Chondrite Bearing a Giant CAI with Hibonite-Rich Wark-Lovering Rim

    Science.gov (United States)

    Ross, D. K.; Simon, J. I.; Zolensky, M.

    2017-01-01

    Northwest Africa (NWA) 10758 is a newly identified carbonaceous chondrite that is a Bali-like oxidized CV3. The large Ca-Al rich inclusion (CAI) in this sample is approx. 2.4 x 1.4 cm. The CAI is transitional in composition between type A and type B, with interior mineralogy dominated by melilite, plus less abundant spinel and Al-Ti rich diopside, and only very minor anorthite (Fig. 1A). This CAI is largely free of secondary alteration in the exposed section we examined, with almost no nepheline, sodalite or Ca-Fe silicates. The Wark-Lovering (WL) rim on this CAI is dominated by hibonite, with lower abundances of spinel and perovskite, and with hibonite locally overlain by melilite plus perovskite (as in Fig. 1B). Note that the example shown in 1B is exceptional. Around most of the CAI, hibonite + spinel + perovskite form the WL rim, without overlying melilite. The WL rim can be unusually thick, ranging from approx. 20 microns up to approx. 150 microns. A well-developed, stratified accretionary rim infills embayments of the CAI, and thins over protuberances in the convoluted CAI surface.

  18. Structural basis of Na+-independent and cooperative substrate/product antiport in CaiT

    NARCIS (Netherlands)

    Schulze, Sabrina; Köster, Stefan; Geldmacher, Ulrike; Terwisscha van Scheltinga, Anke C.; Kühlbrandt, Werner

    2010-01-01

    Transport of solutes across biological membranes is performed by specialized secondary transport proteins in the lipid bilayer, and is essential for life. Here we report the structures of the sodium-independent carnitine/butyrobetaine antiporter CaiT from Proteus mirabilis (PmCaiT) at 2.3-Å and from

  19. Ca-Fe and Alkali-Halide Alteration of an Allende Type B CAI: Aqueous Alteration in Nebular or Asteroidal Settings

    Science.gov (United States)

    Ross, D. K.; Simon, J. I.; Simon, S. B.; Grossman, L.

    2012-01-01

    Ca-Fe and alkali-halide alteration of CAIs is often attributed to aqueous alteration by fluids circulating on asteroidal parent bodies after the various chondritic components have been assembled, although debate continues about the roles of asteroidal vs. nebular modification processes [1-7]. Here we report detailed observations of alteration products in a large Type B2 CAI, TS4 from Allende, one of the oxidized subgroup of CV3s, and propose a speculative model for aqueous alteration of CAIs in a nebular setting. Ca-Fe alteration in this CAI consists predominantly of end-member hedenbergite, end-member andradite, and compositionally variable, magnesian high-Ca pyroxene. These phases are strongly concentrated in an unusual "nodule" enclosed within the interior of the CAI (Fig. 1). The Ca-, Fe-rich nodule superficially resembles a clast that pre-dated and was engulfed by the CAI, but closer inspection shows that relic spinel grains are enclosed in the nodule, and corroded CAI primary phases interfinger with the Fe-rich phases at the nodule's margins. This CAI also contains abundant sodalite and nepheline (alkali-halide) alteration that occurs around the rims of the CAI but also penetrates more deeply into it. The two types of alteration (Ca-Fe and alkali-halide) are adjacent, and very fine-grained Fe-rich phases are associated with sodalite-rich regions. Both types of alteration appear to be replacive; if so, this would require substantial introduction of Fe, transport of elements (Ti, Al and Mg) out of the nodule, and introduction of Na and Cl into the alkali-halide-rich zones. Parts of the CAI have been extensively metasomatized.

  20. Sexual life and sexual wellness in individuals with complete androgen insensitivity syndrome (CAIS) and Mayer-Rokitansky-Küster-Hauser Syndrome (MRKHS).

    Science.gov (United States)

    Fliegner, Maike; Krupp, Kerstin; Brunner, Franziska; Rall, Katharina; Brucker, Sara Y; Briken, Peer; Richter-Appelt, Hertha

    2014-03-01

    Sexual wellness depends on a person's physical and psychological constitution. Complete Androgen Insensitivity Syndrome (CAIS) and Mayer-Rokitansky-Küster-Hauser Syndrome (MRKHS) can compromise sexual well-being. Aims: To compare sexual well-being in CAIS and MRKHS using multiple measures; to assess sexual problems and perceived distress; to gain insight into participants' feelings of inadequacy in social and sexual situations, level of self-esteem and depression; to determine how these psychological factors relate to sexual (dys)function; and to uncover what participants see as the source of their sexual problems. Methods: Data were collected using a paper-and-pencil questionnaire. Eleven individuals with CAIS and 49 with MRKHS with/without neovagina treatment were included. Rates of sexual dysfunctions, overall sexual function, feelings of inadequacy in social and sexual situations, self-esteem and depression scores were calculated. Categorizations were used to identify critical cases. Correlations between psychological variables and sexual function were computed. Sexually active subjects were compared with sexually non-active participants. A qualitative content analysis was carried out to explore causes of sexual problems. Main outcome measures: An extended list of sexual problems based on the Diagnostic and Statistical Manual of Mental Disorders, 4th ed., text revision, of the American Psychiatric Association, and related distress; the Female Sexual Function Index (FSFI); the German Questionnaire on Feelings of Inadequacy in Social and Sexual Situations (FUSS social scale, FUSS sexual scale); the Rosenberg Self-Esteem Scale (RSE); the Brief Symptom Inventory (BSI) subscale depression; and an open question on alleged causes of sexual problems. Results: The results point to a far-reaching lack of sexual confidence and sexual satisfaction in CAIS. In MRKHS, apprehension in sexual situations is a source of distress, but sexual problems seem to be more focused on issues of vaginal functioning. MRKHS women report being satisfied with their

  1. An ion microprobe study of CAIs from CO3 meteorites [Abstract only]

    Science.gov (United States)

    Russell, S. S.; Greenwood, R. C.; Fahey, A. J.; Huss, G. R.; Wasserburg, G. J.

    1994-01-01

    When attempting to interpret the history of Ca-, Al-rich inclusions (CAIs), it is often difficult to distinguish between primary features inherited from the nebula and those produced during secondary processing on the parent body. We have undertaken a systematic study of CAIs from 10 CO chondrites, believed to represent a metamorphic sequence, with the goal of distinguishing primary and secondary features. ALHA 77307 (3.0), Colony (3.0), Kainsaz (3.1), Felix (3.2), ALH 82101 (3.3), Ornans (3.3), Lance (3.4), ALHA 77003 (3.5), Warrenton (3.6), and Isna (3.7) were examined by scanning electron microscopy (SEM) and optical microscopy. We have identified 141 CAIs within these samples, and studied in detail the petrology of 34 inclusions. The primary phases in the lower petrologic types are spinel, melilite, and hibonite. Perovskite, FeS, ilmenite, anorthite, kirschsteinite, and metallic Fe are present as minor phases. Melilite becomes less abundant in higher petrologic types and was not detected in chondrites of type 3.5 and above, confirming previous reports that this mineral easily breaks down during heating. Iron, an element that would not be expected to condense at high temperatures, has a lower abundance in spinel from low-petrologic-type meteorites than in those of higher grade, and CaTiO3 is replaced by FeTiO3 in meteorites of higher petrologic type. The abundance of CAIs is similar in each meteorite. Eight inclusions have been analyzed by ion probe; the results are summarized. The results obtained to date show that CAIs in CO meteorites, like those from other meteorite classes, contain Mg* and that Mg in some inclusions has been redistributed.

  2. Gender Role, Gender Identity and Sexual Orientation in CAIS ("XY-Women") Compared With Subfertile and Infertile 46,XX Women.

    Science.gov (United States)

    Brunner, Franziska; Fliegner, Maike; Krupp, Kerstin; Rall, Katharina; Brucker, Sara; Richter-Appelt, Hertha

    2016-01-01

    The perception of gender development of individuals with complete androgen insensitivity syndrome (CAIS) as unambiguously female has recently been challenged in both qualitative data and case reports of male gender identity. The aim of the mixed-method study presented here was to examine the self-perception of CAIS individuals regarding different aspects of gender and to identify commonalities and differences in comparison with subfertile and infertile XX-chromosomal women with diagnoses of Mayer-Rokitansky-Küster-Hauser syndrome (MRKHS) and polycystic ovary syndrome (PCOS). The study sample comprised 11 participants with CAIS, 49 with MRKHS, and 55 with PCOS. Gender identity was assessed by means of a multidimensional instrument, which showed significant differences between the CAIS group and the XX-chromosomal women. Other-than-female gender roles and neither-female-nor-male sexes/genders were reported only by individuals with CAIS. The percentage with a not exclusively androphile sexual orientation was exceptionally high in the CAIS group compared to the prevalence in "normative" women and the clinical groups. The findings support the assumption made by Meyer-Bahlburg (2010) that gender outcome in people with CAIS is more variable than generally stated. Parents and professionals should thus be open to courses of gender development other than typically female in individuals with CAIS.

  3. Adaptation of an aerosol retrieval algorithm using multi-wavelength and multi-pixel information of satellites (MWPM) to GOSAT/TANSO-CAI

    Science.gov (United States)

    Hashimoto, M.; Takenaka, H.; Higurashi, A.; Nakajima, T.

    2017-12-01

    Aerosol in the atmosphere is an important constituent for determining the earth's radiation budget, so accurate aerosol retrievals from satellites are useful. We have developed a satellite remote sensing algorithm to retrieve aerosol optical properties using multi-wavelength and multi-pixel information from satellite imagers (MWPM). The method simultaneously derives aerosol optical properties, such as aerosol optical thickness (AOT), single scattering albedo (SSA) and aerosol size information, by using spatial differences in wavelengths (multi-wavelength) and surface reflectances (multi-pixel). The method is useful for aerosol retrieval over spatially heterogeneous surfaces, such as urban regions. In this algorithm, the inversion combines an optimal method with a smoothing constraint on the state vector. Furthermore, the method has been combined with direct radiative transfer calculation (RTM), numerically solved at each iteration step of the non-linear inverse problem, without using a look-up table (LUT), under several constraints. However, this takes too much computation time. To accelerate the calculation, we replaced the RTM with an accelerated RTM solver learned by a neural network-based method, EXAM (Takenaka et al., 2011), trained on the RSTAR code; the calculation time was thereby shortened to about one thousandth. We applied MWPM combined with EXAM to GOSAT/TANSO-CAI (Cloud and Aerosol Imager). CAI is a supplementary sensor to TANSO-FTS, dedicated to measuring cloud and aerosol properties. CAI has four bands, 380, 674, 870 and 1600 nm, and observes at 500 m resolution for bands 1-3 and 1.5 km for band 4. Retrieved parameters are aerosol optical properties, such as the aerosol optical thickness (AOT) of fine- and coarse-mode particles at a wavelength of 500 nm, the volume soot fraction in fine-mode particles, and the ground surface albedo at each observed wavelength, obtained by combining a minimum reflectance method with that of Fukuda et al. (2013). We will show
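    The acceleration trick described above (replacing a slow radiative transfer solver with a fast learned emulator) can be sketched as follows. The "RTM" here is a cheap synthetic stand-in function, and the network size and inputs are arbitrary choices, not those of EXAM/RSTAR.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)

        def slow_rtm(aot, ssa, albedo):
            """Stand-in for an expensive radiative transfer calculation:
            maps (AOT, SSA, surface albedo) to a top-of-atmosphere reflectance."""
            return albedo * np.exp(-aot) + ssa * aot * (1.0 - 0.5 * albedo)

        # Sample the input space and label it with the slow model (done offline, once).
        X = rng.uniform([0.0, 0.7, 0.0], [2.0, 1.0, 0.5], size=(20000, 3))
        y = slow_rtm(X[:, 0], X[:, 1], X[:, 2])

        # Train a small MLP emulator; at retrieval time each iteration of the
        # inversion calls this fast surrogate instead of the full solver.
        emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500).fit(X, y)

        test = np.array([[0.3, 0.9, 0.1]])
        print(slow_rtm(*test[0]), emulator.predict(test)[0])  # should roughly agree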

  4. Numerical Investigation Into Effect of Fuel Injection Timing on CAI/HCCI Combustion in a Four-Stroke GDI Engine

    Science.gov (United States)

    Cao, Li; Zhao, Hua; Jiang, Xi; Kalian, Navin

    2006-02-01

    Controlled Auto-Ignition (CAI) combustion, also known as Homogeneous Charge Compression Ignition (HCCI), was achieved by trapping residuals with early exhaust valve closure in conjunction with direct injection. Multi-cycle 3D engine simulations have been carried out for a parametric study of four different injection timings, in order to better understand the effects of injection timing on in-cylinder mixing and CAI combustion. The full engine cycle simulation, including the complete gas exchange and combustion processes, was carried out over several cycles in order to obtain a stable cycle for analysis. The combustion models used in the present study are the Shell auto-ignition model and the characteristic-time combustion model, which were modified to take the high level of EGR into consideration. A liquid-sheet breakup spray model was used for the droplet breakup processes. The analyses show that injection timing plays an important role in the in-cylinder air/fuel mixing and mixture temperature, which in turn affect CAI combustion and engine performance.

  6. The Cognitive Assessment Interview (CAI): development and validation of an empirically derived, brief interview-based measure of cognition.

    Science.gov (United States)

    Ventura, Joseph; Reise, Steven P; Keefe, Richard S E; Baade, Lyle E; Gold, James M; Green, Michael F; Kern, Robert S; Mesholam-Gately, Raquelle; Nuechterlein, Keith H; Seidman, Larry J; Bilder, Robert M

    2010-08-01

    Practical, reliable "real world" measures of cognition are needed to supplement neurocognitive performance data to evaluate possible efficacy of new drugs targeting cognitive deficits associated with schizophrenia. Because interview-based measures of cognition offer one possible approach, data from the MATRICS initiative (n=176) were used to examine the psychometric properties of the Schizophrenia Cognition Rating Scale (SCoRS) and the Clinical Global Impression of Cognition in Schizophrenia (CGI-CogS). We used classical test theory methods and item response theory to derive the 10-item Cognitive Assessment Interview (CAI) from the SCoRS and CGI-CogS ("parent instruments"). Sources of information for CAI ratings included the patient and an informant. Validity analyses examined the relationship between the CAI and objective measures of cognitive functioning, intermediate measures of cognition, and functional outcome. Rater scores from the newly derived CAI (10 items) correlated highly (r=.87) with those from the combined set of the SCoRS and CGI-CogS (41 items). Both the patient (r=.82) and the informant (r=.95) data were highly correlated with the rater's score. The CAI was modestly correlated with objectively measured neurocognition (r=-.32), functional capacity (r=-.44), and functional outcome (r=-.32), which was comparable to the parent instruments. The CAI allows for expert judgment in evaluating a patient's cognitive functioning and was modestly correlated with neurocognitive functioning, functional capacity, and functional outcome. The CAI is a brief, repeatable, and potentially valuable tool for rating cognition in schizophrenia patients who are participating in clinical trials. Copyright 2010 Elsevier B.V. All rights reserved.
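
    The abstract reports deriving the 10-item CAI from the 41 parent items using classical test theory and item response theory. As a rough, hypothetical illustration of one classical-test-theory screening step (corrected item-total correlations on synthetic ratings; this is not the MATRICS analysis itself):

    ```python
    # Illustrative item screening: keep the items with the highest corrected
    # item-total correlations, one crude analogue of shortening a 41-item
    # parent instrument to a 10-item scale. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    n_subjects, n_items = 176, 41
    ability = rng.normal(size=n_subjects)              # latent trait
    loadings = rng.uniform(0.3, 0.9, size=n_items)     # item discriminations
    ratings = ability[:, None] * loadings + rng.normal(scale=0.7,
                                                       size=(n_subjects, n_items))

    def corrected_item_total(r):
        """Correlation of each item with the total score excluding that item."""
        total = r.sum(axis=1)
        return np.array([np.corrcoef(r[:, j], total - r[:, j])[0, 1]
                         for j in range(r.shape[1])])

    keep = np.argsort(corrected_item_total(ratings))[::-1][:10]
    print("retained items:", sorted(keep.tolist()))
    ```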

  7. Intelligent Computer-Assisted Instruction: A Review and Assessment of ICAI Research and Its Potential for Education.

    Science.gov (United States)

    Dede, Christopher J.; And Others

    The first of five sections in this report places intelligent computer-assisted instruction (ICAI) in its historical context through discussions of traditional computer-assisted instruction (CAI) linear and branching programs; TICCIT and PLATO IV, two CAI demonstration projects funded by the National Science Foundation; generative programs, the…

  8. Computer-Assisted Mathematics Instruction for Students with Specific Learning Disability: A Review of the Literature

    Science.gov (United States)

    Stultz, Sherry L.

    2017-01-01

    This review was conducted to evaluate the current body of scholarly research regarding the use of computer-assisted instruction (CAI) to teach mathematics to students with specific learning disability (SLD). For many years, computers have been utilized for educational purposes. However, the effectiveness of CAI for teaching mathematics to this specific…

  9. An Evaluation of Computer-Aided Instruction in an Introductory Biostatistics Course.

    Science.gov (United States)

    Forsythe, Alan B.; Freed, James R.

    1979-01-01

    Evaluates the effectiveness of computer assisted instruction for teaching biostatistics to first year students at the UCLA School of Dentistry. Results do not demonstrate the superiority of CAI but do suggest that CAI compares favorably to conventional lecture and programed instruction methods. (RAO)

  10. Computer-assisted instruction: a library service for the community teaching hospital.

    Science.gov (United States)

    McCorkel, J; Cook, V

    1986-04-01

    This paper reports on five years of experience with computer-assisted instruction (CAI) at Winthrop-University Hospital, a major affiliate of the SUNY at Stony Brook School of Medicine. It compares CAI programs available from Ohio State University and Massachusetts General Hospital (accessed by telephone and modem), and software packages purchased from the Health Sciences Consortium (MED-CAPS) and Scientific American (DISCOTEST). The comparison documents one library's experience of the cost of these programs and the use made of them by medical students, house staff, and attending physicians. It describes the space allocated for necessary equipment, as well as the marketing of CAI. Finally, in view of the decision of the National Board of Medical Examiners to administer the Part III examination on computer (the so-called CBX) starting in 1988, the paper speculates on the future importance of CAI in the community teaching hospital.

  11. A Braça da Rede, uma Técnica Caiçara de Medir

    Directory of Open Access Journals (Sweden)

    Gilberto Chieus Jr.

    2009-08-01

    Full Text Available This article describes how the caiçaras of the city of Ubatuba, on the northern coast of São Paulo State, measure their fishing nets. Before analyzing their measuring technique, we briefly discuss caiçara culture and its transformations. We then present some historical moments in the construction of the metre, show how the caiçaras measure their nets, and examine the problems that occurred in Brazil during the adoption of the decimal metric system, as well as the resistance of certain communities that use other standards to take their measurements, ignoring the current metric system because of their cultural context. The entire discussion is framed in a historical perspective of Ethnomathematics.

  12. Thermal and chemical evolution in the early solar system as recorded by FUN CAIs: Part I - Petrology, mineral chemistry, and isotopic composition of Allende FUN CAI CMS-1

    Science.gov (United States)

    Williams, C. D.; Ushikubo, T.; Bullock, E. S.; Janney, P. E.; Hines, R. R.; Kita, N. T.; Hervig, R. L.; MacPherson, G. J.; Mendybaev, R. A.; Richter, F. M.; Wadhwa, M.

    2017-03-01

    Detailed petrologic, geochemical and isotopic analyses of a new FUN CAI from the Allende CV3 meteorite (designated CMS-1) indicate that it formed by extensive melting and evaporation of primitive precursor material(s). The precursor material(s) condensed in a 16O-rich region (δ17O and δ18O ∼ -49‰) of the inner solar nebula dominated by gas of solar composition at total pressures of about 10-3 to 10-6 bar. Subsequent melting of the precursor material(s) was accompanied by evaporative loss of magnesium, silicon and oxygen, resulting in large mass-dependent isotope fractionations in these elements (δ25Mg = 30.71-39.26‰, δ29Si = 14.98-16.65‰, and δ18O = -41.57 to -15.50‰). This evaporative loss resulted in a bulk composition similar to that of compact Type A and Type B CAIs, but very distinct from the composition of the original precursor condensate(s). Kinetic fractionation factors and the measured mass-dependent fractionation of silicon and magnesium in CMS-1 suggest that ∼80% of the silicon and ∼85% of the magnesium were lost from its precursor material(s) through evaporative processes. These results suggest that the precursor material(s) of normal and FUN CAIs condensed in similar environments, but subsequently evolved under vastly different conditions such as total gas pressure. The chemical and isotopic differences between normal and FUN CAIs could be explained by sorting of early solar system materials into distinct physical and chemical regimes, in conjunction with discrete heating events, within the protoplanetary disk.
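
    The quoted ~80-85% evaporative losses follow from Rayleigh fractionation arithmetic. A back-of-envelope Python check, assuming the ideal kinetic fractionation factor (square root of the isotope mass ratio) rather than the experimentally determined factors the study actually uses:

    ```python
    # Back-of-envelope Rayleigh check of the quoted Mg loss, assuming the
    # ideal kinetic fractionation factor alpha = sqrt(m_24Mg / m_25Mg);
    # a consistency sketch only, not the paper's calculation.
    import math

    alpha = math.sqrt(23.985 / 24.986)   # ideal 25Mg/24Mg evaporation factor
    delta_25Mg = 35.0                    # permil, mid-range of 30.71-39.26
    # Rayleigh residue: 1 + delta/1000 = f**(alpha - 1), f = fraction remaining
    f = math.exp(math.log(1.0 + delta_25Mg / 1000.0) / (alpha - 1.0))
    print(f"fraction of Mg lost ~ {1.0 - f:.0%}")   # ~82%, near the ~85% quoted
    ```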

  13. Computer-Assisted Instruction: A Case Study of Two Charter Schools

    Science.gov (United States)

    Keengwe, Jared; Hussein, Farhan

    2013-01-01

    The purpose of this study was to examine the relationship in achievement gap between English language learners (ELLs) utilizing computer-assisted instruction (CAI) in the classroom, and ELLs relying solely on traditional classroom instruction. The study findings showed that students using CAI to supplement traditional lectures performed better…

  14. The computer aided education and training system for accident management

    International Nuclear Information System (INIS)

    Yoneyama, Mitsuru; Kubota, Ryuji; Fujiwara, Tadashi; Sakuma, Hitoshi

    1999-01-01

    The education and training system for Accident Management was developed by the Japanese BWR group and Hitachi Ltd. It is composed of two systems: a computer aided instruction (CAI) education system and an education and training system with computer simulations. Both systems are designed to be executed on personal computers. The outlines of the CAI education system and of the education and training system with simulator are reported below. These systems provide plant operators and technical support center staff with effective education and training for accident management. (author)

  15. OXYGEN ISOTOPIC COMPOSITIONS OF THE ALLENDE TYPE C CAIs: EVIDENCE FOR ISOTOPIC EXCHANGE DURING NEBULAR MELTING AND ASTEROIDAL THERMAL METAMORPHISM

    Energy Technology Data Exchange (ETDEWEB)

    Krot, A N; Chaussidon, M; Yurimoto, H; Sakamoto, N; Nagashima, K; Hutcheon, I D; MacPherson, G J

    2008-02-21

    Based on the mineralogy and petrography, coarse-grained, igneous, anorthite-rich (Type C) calcium-aluminum-rich inclusions (CAIs) in the CV3 carbonaceous chondrite Allende have recently been divided into three groups: (i) CAIs with melilite and Al,Ti-diopside of massive and lacy textures (coarse grains with numerous rounded inclusions of anorthite) in a fine-grained anorthite groundmass (6-1-72, 100, 160), (ii) CAI CG5 with massive melilite, Al,Ti-diopside and anorthite, and (iii) CAIs associated with chondrule material: either containing chondrule fragments in their peripheries (ABC, TS26) or surrounded by chondrule-like, igneous rims (93) (Krot et al., 2007a,b). Here, we report in situ oxygen isotopic measurements of primary (melilite, spinel, Al,Ti-diopside, anorthite) and secondary (grossular, monticellite, forsterite) minerals in these CAIs. Spinel (Δ17O = -25‰ to -20‰), massive and lacy Al,Ti-diopside (Δ17O = -20‰ to -5‰) and fine-grained anorthite (Δ17O = -15‰ to -2‰) in 100, 160 and 6-1-72 are 16O-enriched relative to spinel and coarse-grained Al,Ti-diopside and anorthite in ABC, 93 and TS26 (Δ17O ranges from -20‰ to -15‰, from -15‰ to -5‰, and from -5‰ to 0‰, respectively). In 6-1-72, massive and lacy Al,Ti-diopside grains are 16O-depleted (Δ17O ∼ -13‰) relative to spinel (Δ17O = -23‰). Melilite is the most 16O-depleted mineral in all Allende Type C CAIs. In CAI 100, melilite and secondary grossular, monticellite and forsterite (minerals replacing melilite) are similarly 16O-depleted, whereas grossular in CAI 160 is 16O-enriched (Δ17O = -10‰ to -6‰) relative to melilite (Δ17O = -5‰ to -3‰). We infer

  16. Stable Magnesium Isotope Variation in Melilite Mantle of Allende Type B1 CAI EK 459-5-1

    Science.gov (United States)

    Kerekgyarto, A. G.; Jeffcoat, C. R.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.

    2014-01-01

    Ca-Al-rich inclusions (CAIs) are the earliest formed crystalline material in our solar system and they record early Solar System processes. Here we present petrographic and delta Mg-25 data of melilite mantles in a Type B1 CAI that records early solar nebular processes.

  17. An experimental study of fuel injection strategies in CAI gasoline engine

    Energy Technology Data Exchange (ETDEWEB)

    Hunicz, J.; Kordos, P. [Department of Combustion Engines and Transport, Lublin University of Technology, Nadbystrzycka 36, 20-618 Lublin (Poland)

    2011-01-15

    Combustion of gasoline in a direct injection controlled auto-ignition (CAI) single-cylinder research engine was studied. CAI operation was achieved with the use of the negative valve overlap (NVO) technique and internal exhaust gas re-circulation (EGR). Experiments were performed with single injection and split injection, where some amount of fuel was injected close to top dead centre (TDC) during the NVO interval and the second injection was applied with variable timing. Additionally, combustion at variable fuel-rail pressure was examined. The investigation showed that when fuel was injected into the recompressed exhaust, fuel reforming took place. This process was identified via an analysis of the exhaust-fuel mixture composition after the NVO interval. It was found that for a single fuel injection in the NVO phase, its advance determined the heat release rate and auto-ignition timing, and had a strong influence on NOx emission. However, delaying the single injection to the intake stroke resulted in deterioration of cycle-to-cycle variability. Application of split injection showed benefits of this strategy versus single injection. Examination of different fuel mass split ratios and variable second injection timing resulted in further optimisation of mixture formation. With an equal share of the fuel mass injected in the first injection during NVO and in the second injection at the beginning of compression, the lowest emission level and an improvement in cyclic variability were observed. (author)

  18. Computer-Assisted Instruction to Teach DOS Commands: A Pilot Study.

    Science.gov (United States)

    McWeeney, Mark G.

    1992-01-01

    Describes a computer-assisted instruction (CAI) program used to teach DOS commands. Pretest and posttest results for 65 graduate students using the program are reported, and it is concluded that the CAI program significantly aided the students. Sample screen displays for the program and several questions from the pre/posttest are included. (nine…

  19. The Use of Modular Computer-Based Lessons in a Modification of the Classical Introductory Course in Organic Chemistry.

    Science.gov (United States)

    Stotter, Philip L.; Culp, George H.

    An experimental course in organic chemistry utilized computer-assisted instructional (CAI) techniques. The CAI lessons provided tutorial drill and practice and simulated experiments and reactions. The Conversational Language for Instruction and Computing was used, along with a CDC 6400-6600 system; students scheduled and completed the lessons at…

  20. Multiple Nebular Gas Reservoirs Recorded by Oxygen Isotope Variation in a Spinel-rich CAI in CO3 MIL 090019

    Science.gov (United States)

    Simon, J. I.; Simon, S. B.; Nguyen, A. N.; Ross, D. K.; Messenger, S.

    2017-01-01

    We conducted NanoSIMS O-isotopic imaging of a primitive spinel-rich CAI spherule (27-2) from the MIL 090019 CO3 chondrite. Inclusions such as 27-2 are proposed to record inner nebula processes during an epoch of rapid solar nebula evolution. Mineralogical and textural analyses suggest that this CAI formed by high temperature reactions, partial melting, and condensation. This CAI exhibits radial O-isotopic heterogeneity among multiple occurrences of the same mineral, reflecting interactions with distinct nebular O-isotopic reservoirs.

  1. Computer Assisted Instruction

    Science.gov (United States)

    Higgins, Paul

    1976-01-01

    Methodology for developing a computer assisted instruction (CAI) lesson (scripting, programing, and testing) is reviewed. A project done by Informatics Education Ltd. (IEL) for the Department of National Defense (DND) is used as an example. (JT)

  2. Computer aided information system for a PWR

    International Nuclear Information System (INIS)

    Vaidian, T.A.; Karmakar, G.; Rajagopal, R.; Shankar, V.; Patil, R.K.

    1994-01-01

    The computer aided information system (CAIS) is designed with a view to improving the performance of the operator. CAIS assists the plant operator in an advisory and support role, thereby reducing the workload level and potential human errors. The CAIS as explained here has been designed for a PWR of type KLT-40 used in Floating Nuclear Power Stations (FNPS). However, the underlying philosophy evolved in designing the CAIS can be suitably adapted for other types of nuclear power plants too (BWR, PHWR). Operator information is divided into three broad categories: a) continuously available information, b) automatically available information and c) on-demand information. Two touch screens are provided on the main control panel. One is earmarked for continuously available information and the other is dedicated to automatically available information. Both screens can be used at the operator's discretion for on-demand information. The automatically available information screen overrides the on-demand information screens. In addition to the above, CAIS has the features of event sequence recording, disturbance recording and information documentation. The CAIS design ensures that the operator is not overburdened with excess and unnecessary information, but at the same time adequate and well-formatted information is available. (author). 5 refs., 4 figs

  3. The enhancement of students’ mathematical representation in junior high school using cognitive apprenticeship instruction (CAI)

    Science.gov (United States)

    Yusepa, B. G. P.; Kusumah, Y. S.; Kartasasmita, B. G.

    2018-03-01

    This study aims to gain an in-depth understanding of the enhancement of students' mathematical representation. It is an experimental study with a pretest-posttest control group design. The subjects were eighth-grade students from two junior high schools in Bandung, one high-level and one middle-level. In each school, two parallel groups were chosen as a control group and an experimental group. The experimental group was given cognitive apprenticeship instruction (CAI) treatment while the control group was given conventional instruction. The results show that the enhancement of mathematical representation among students who received CAI was better than among those taught conventionally, as observed overall and in terms of mathematical prior knowledge (MPK) and school level. It can be concluded that CAI can be used as a good alternative learning model to enhance students' mathematical representation.

  4. An Evaluation of the Cognitive and Affective Performance of an Integrated Set of CAI Materials in the Principles of Macroeconomics. Studies in Economic Education, No. 4.

    Science.gov (United States)

    Daellenbach, Lawrence A.; And Others

    The purpose of this study was to determine the effect of computer assisted instruction (CAI) on the cognitive and affective development of college students enrolled in a principles of macroeconomics course. The hypotheses of the experiment were stated as follows: In relation to the traditional principles course, the experimental treatment will…

  5. CAIS/ACSI 2001: Beyond the Web: Technologies, Knowledge and People.

    Science.gov (United States)

    Canadian Journal of Information and Library Science, 2000

    2000-01-01

    Presents abstracts of papers presented at the 29th Annual Conference of the Canadian Association for Information Science (CAIS) held in Quebec on May 27-29, 2001. Topics include: professional development; librarian/library roles; information technology uses; virtual libraries; information seeking behavior; literacy; information retrieval;…

  6. A Comparison of Computer-Assisted Instruction and Tutorials in Hematology and Oncology.

    Science.gov (United States)

    Garrett, T. J.; And Others

    1987-01-01

    A study comparing the effectiveness of computer-assisted instruction (CAI) and small group instruction found no significant difference in medical student achievement in oncology but higher achievement through small-group instruction in hematology. Students did not view CAI as more effective, but saw it as a supplement to traditional methods. (MSE)

  7. The effects of computer-assisted instruction on the mathematics performance and classroom behavior of children with ADHD.

    Science.gov (United States)

    Mautone, Jennifer A; DuPaul, George J; Jitendra, Asha K

    2005-08-01

    The present study examines the effects of computer-assisted instruction (CAI) on the mathematics performance and classroom behavior of three second- through fourth-grade students with ADHD. A controlled case study is used to evaluate the effects of the computer software on participants' mathematics performance and on-task behavior. Participants' mathematics achievement improved and their on-task behavior increased during the CAI sessions relative to independent seatwork conditions. In addition, students and teachers considered CAI to be an acceptable intervention for some students with ADHD who are having difficulty with mathematics. Implications of these results for practice and research are discussed.

  8. Secondary School Students' Attitudes towards Mathematics Computer--Assisted Instruction Environment in Kenya

    Science.gov (United States)

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.

    2012-01-01

    This paper reports the results of research conducted in six classes (Form IV) with 205 students with a sample of 94 respondents. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) effectiveness of CAI in Mathematics instruction. The results indicated…

  9. The Effectiveness of Computer-Assisted Instruction to Teach Physical Examination to Students and Trainees in the Health Sciences Professions: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott

    2017-01-01

    To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach, a systematic review and meta-analysis of studies published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses included the DerSimonian-Laird random-effects model. In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: -2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: -5.30 to 6.01). The evidence does not suggest a strong consistent preference for either CAI or F2F instruction for teaching students/trainees PE. Further research is needed to identify conditions under which knowledge and skill acquisition outcomes favor one mode of instruction over the other.
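
    The DerSimonian-Laird model named above pools per-study effects with a moment-based estimate of the between-study variance. A self-contained Python sketch using hypothetical mean differences (not the data of this review):

    ```python
    # Sketch of DerSimonian-Laird random-effects pooling; illustrative only.
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Pool study effects (e.g., knowledge-score mean differences) with
        the DL between-study variance tau^2; returns pooled effect, 95% CI."""
        w = 1.0 / variances                      # fixed-effect weights
        fixed = np.sum(w * effects) / np.sum(w)
        q = np.sum(w * (effects - fixed) ** 2)   # Cochran's Q
        df = len(effects) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)            # DL estimate of tau^2
        w_star = 1.0 / (variances + tau2)        # random-effects weights
        pooled = np.sum(w_star * effects) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

    md = np.array([4.0, 8.5, -1.2, 10.3, 3.9])   # hypothetical study MDs
    var = np.array([6.2, 9.0, 4.8, 12.5, 7.1])   # hypothetical variances
    print(dersimonian_laird(md, var))
    ```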

  10. Developing Computer-Assisted Instruction Multimedia For Educational Technology Course of Coastal Area Students

    Science.gov (United States)

    Idris, Husni; Nurhayati, Nurhayati; Satriani, Satriani

    2018-05-01

    This research aims to a) identify instructional software (interactive multimedia CDs) by developing Computer-Assisted Instruction (CAI) multimedia that is eligible for use in the instruction of the Educational Technology course; b) analyze the role of instructional software (interactive multimedia CDs) in the Educational Technology course through the development of Computer-Assisted Instruction (CAI) multimedia to improve the quality of education and instructional activities. This is Research and Development (R&D). It employed the descriptive procedural model of development, which outlines the steps to be taken to develop a product, in this case instructional multimedia. The number of subjects of the research trial or respondents for each stage was 20 people. To maintain development quality, an expert in materials outside the materials under study, an expert in materials who is also an Educational Technology lecturer, a small group of 3 students, a medium-sized group of 10 students, and 20 students participating in the field testing took part in this research. Data collection instruments were then developed in two stages, namely: a) developing the instruments; and b) trying out the instruments. Data on students' responses were collected using questionnaires and analyzed using descriptive statistics with percentage and categorization techniques. Based on the results of data analysis, the Computer-Assisted Instruction (CAI) multimedia developed and tried out among students during the preliminary field testing falls into the "Good" category, with the aspects of instruction, materials, and media falling into the "Good" category. Subsequently, results of the main field testing among students also suggest that it falls into the "Good" category, with the aspects of instruction, materials, and media falling into the "Good" category. Similarly, results of the operational field testing among students also suggest that it falls into the

  11. Student Study Choices in the Principles of Economics: A Case Study of Computer Usage

    OpenAIRE

    Grimes, Paul W.; Sanderson, Patricia L.; Ching, Geok H.

    1996-01-01

    Principles of Economics students at Mississippi State University were provided the opportunity to use computer assisted instruction (CAI) as a supplemental study activity. Students were free to choose the extent of their computer work. Throughout the course, weekly surveys were conducted to monitor the time each student spent with their textbook, computerized tutorials, workbook, class notes, and study groups. The surveys indicated that only a minority of the students actively pursued CAI....

  12. Computer-Assisted, Programmed Text, and Lecture Modes of Instruction in Three Medical Training Courses: Comparative Evaluation. Final Report.

    Science.gov (United States)

    Deignan, Gerard M.; And Others

    This report contains a comparative analysis of the differential effectiveness of computer-assisted instruction (CAI), programmed instructional text (PIT), and lecture methods of instruction in three medical courses--Medical Laboratory, Radiology, and Dental. The summative evaluation includes (1) multiple regression analyses conducted to predict…

  13. Consumption of fa cai Nostoc soup: a potential for BMAA exposure from Nostoc cyanobacteria in China?

    Science.gov (United States)

    Roney, Britton R; Renhui, Li; Banack, Sandra Anne; Murch, Susan; Honegger, Rosmarie; Cox, Paul Alan

    2009-01-01

    Grown in arid regions of western China, the cyanobacterium Nostoc flagelliforme--called fa cai in Mandarin and fat choy in Cantonese--is wild-harvested and used to make a soup consumed during New Year's celebrations. High prices, up to $125 USD/kg, led to overharvesting in Inner Mongolia, Ningxia, Gansu, Qinghai, and Xinjiang. Degradation of arid ecosystems, desertification, and conflicts between Nostoc harvesters and Mongol herdsmen concerned the Chinese environmental authorities, leading to a government ban on Nostoc commerce. This ban stimulated increased marketing of a substitute made from starch. We analysed samples purchased throughout China as well as in Chinese markets in the United States and the United Kingdom. Some were counterfeits consisting of dyed starch noodles. A few samples from California contained Nostoc flagelliforme but were adulterated with starch noodles. Other samples, including those from the United Kingdom, consisted of pure Nostoc flagelliforme. A recent survey of markets in Chengdu showed no real Nostoc flagelliforme being marketed. Real and artificial fa cai differ in the presence of beta-N-methylamino-L-alanine (BMAA). Given its status as a high-priced luxury food, the government ban on collection and marketing, and the replacement of real fa cai with starch substitutes consumed only on special occasions, it is anticipated that dietary exposure to BMAA from fa cai will be reduced in the future in China.

  14. Hunting and use of terrestrial fauna used by Caiçaras from the Atlantic Forest coast (Brazil

    Directory of Open Access Journals (Sweden)

    Alves Rômulo RN

    2009-11-01

    Full Text Available Abstract Background The Brazilian Atlantic Forest is considered one of the hotspots for conservation, comprising remnants of rain forest along the eastern Brazilian coast. Its native inhabitants on the Southeastern coast include the Caiçaras (descendants of Amerindians and European colonizers), with a deep knowledge of the natural resources used for their livelihood. Methods We studied the use of terrestrial fauna in three Caiçara communities, through open-ended interviews with 116 native residents. Data were checked through systematic observations and collection of zoological material. Results The Caiçaras depend on terrestrial fauna especially for food and medicine. The main species used are Didelphis spp., Dasyprocta azarae, Dasypus novemcinctus, and small birds (several species of Turdidae). Contrasting with the high dependency on terrestrial fauna resources of native Amazonians, the Caiçaras do not show a constant dependency on these resources. Nevertheless, the occasional hunting of native animals represents a complementary source of animal protein. Conclusion Indigenous or local knowledge of native resources is important in order to promote local development in a sustainable way, and can help to conserve biodiversity, particularly if the resource is sporadically used and not commercially exploited.

  15. CAD/CAM/CAI Application for High-Precision Machining of Internal Combustion Engine Pistons

    Directory of Open Access Journals (Sweden)

    V. V. Postnov

    2014-07-01

    Full Text Available CAD/CAM/CAI application solutions for the machining of internal combustion engine pistons were analyzed. A low-volume production technology for internal combustion engine pistons was proposed. A fixture for a CNC turning center was designed.

  16. Using the Computer to Improve Basic Skills.

    Science.gov (United States)

    Bozeman, William; Hierstein, William J.

    These presentations offer information on the benefits of using computer-assisted instruction (CAI) for remedial education. First, William J. Hierstein offers a summary of the Computer Assisted Basic Skills Project conducted by Southeastern Community College at the Iowa State Penitentiary. Hierstein provides background on the funding for the…

  17. Students' perceptions of a multimedia computer-aided instruction ...

    African Journals Online (AJOL)

    Objective. To develop an interactive multimedia-based computer-aided instruction (CAI) programme, to determine its educational worth and efficacy in a multicultural academic environment and to evaluate its usage by students with differing levels of computer literacy. Design. A prospective descriptive study evaluating ...

  18. Changes in flavour and microbial diversity during natural fermentation of suan-cai, a traditional food made in Northeast China.

    Science.gov (United States)

    Wu, Rina; Yu, Meiling; Liu, Xiaoyu; Meng, Lingshuai; Wang, Qianqian; Xue, Yating; Wu, Junrui; Yue, Xiqing

    2015-10-15

    We measured changes in the main physical and chemical properties, flavour compounds and microbial diversity in suan-cai during natural fermentation. The results showed that the pH and concentration of soluble protein initially decreased but were then maintained at a stable level; the concentration of nitrite increased in the initial fermentation stage and after reaching a peak it decreased significantly to a low level by the end of fermentation. Suan-cai was rich in 17 free amino acids. All of the free amino acids increased in concentration to different degrees, except histidine. Total free amino acids reached their highest levels in the mid-fermentation stage. The 17 volatile flavour components identified at the start of fermentation increased to 57 by the mid-fermentation stage; esters and aldehydes were in the greatest diversity and abundance, contributing most to the aroma of suan-cai. Bacteria were more abundant and diverse than fungi in suan-cai; 14 bacterial species were identified from the genera Leuconostoc, Bacillus, Pseudomonas and Lactobacillus. The predominant fungal species identified were Debaryomyces hansenii, Candida tropicalis and Penicillium expansum. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Computer training, present and future.

    Science.gov (United States)

    Smith, G. A.

    1972-01-01

    The products of educational firms today lead toward a multimedia approach to the education and training of commercial programmers and systems analysts. Computer-assisted instruction or CAI is a relatively new medium to augment the other media. The government use of computers is discussed together with the importance of computer pretests. Pretests can aid in determining a person's ability to absorb a particular instructional level. The material presented in a number of computer courses is listed.

  20. The Impact of Computer Assisted Instruction As It Relates to Learning Disabled Adults in California Community Colleges.

    Science.gov (United States)

    Brower, Mary Jo

    A study was conducted to determine the advantages and disadvantages of using computer-assisted instruction (CAI) with learning disabled (LD) adults attending California community colleges. A questionnaire survey of the directors of the LD programs solicited information on the availability of CAI for LD adults, methods of course advertisement,…

  1. Calcium-aluminum-rich inclusions with fractionation and unknown nuclear effects (FUN CAIs)

    DEFF Research Database (Denmark)

    Krot, Alexander N.; Nagashima, Kazuhide; Wasserburg, Gerald J.

    2014-01-01

    We present a detailed characterization of the mineralogy, petrology, and oxygen isotopic compositions of twelve FUN CAIs, including C1 and EK1-4-1 from Allende (CV), that were previously shown to have large isotopic fractionation patterns for magnesium and oxygen, and large isotopic anomalies...

  2. The Range of Initial 10Be/9Be Ratios in the Early Solar System: A Re-Assessment Based on Analyses of New CAIs and Melilite Composition Glass Standards

    Science.gov (United States)

    Dunham, E.; Wadhwa, M.; Liu, M.-C.

    2017-07-01

    We report a more accurate range of initial 10Be/9Be in CAIs including FUN CAI CMS-1 from Allende (CV3) and a new CAI from NWA 5508 (CV3) using melilite composition glass standards; we suggest 10Be is largely produced by irradiation in the nebula.

  3. Numerical investigation of CAI Combustion in the Opposed- Piston Engine with Direct and Indirect Water Injection

    Science.gov (United States)

    Pyszczek, R.; Mazuro, P.; Teodorczyk, A.

    2016-09-01

    This paper is focused on CAI combustion control in a turbocharged 2-stroke Opposed-Piston (OP) engine. The barrel-type OP engine arrangement is of particular interest to the authors because of its robust design, high mechanical efficiency and relatively easy incorporation of a Variable Compression Ratio (VCR). Another advantage of such a design is that the combustion chamber is formed between two moving pistons - there is no additional cylinder head to be cooled, which directly results in increased thermal efficiency. Furthermore, engine operation in a Controlled Auto-Ignition (CAI) mode at high compression ratios (CR) raises the possibility of reaching even higher efficiencies and very low emissions. In order to control CAI combustion, such measures as VCR and water injection were considered for indirect ignition timing control. Numerical simulations of the scavenging and combustion processes were performed with the 3D CFD multipurpose AVL Fire solver. Numerous cases were calculated with different engine compression ratios and different amounts of directly and indirectly injected water. The influence of the VCR and water injection on ignition timing and engine performance was determined and their application in a real engine was discussed.

  4. Microstructures of Hibonite From an ALH A77307 (CO3.0) CAI: Evidence for Evaporative Loss of Calcium

    Science.gov (United States)

    Han, Jangmi; Brearley, Adrian J.; Keller, Lindsay P.

    2014-01-01

    Hibonite is a comparatively rare, primary phase found in some CAIs from different chondrite groups and is also common in Wark-Lovering rims [1]. Hibonite is predicted to be one of the earliest refractory phases to form by equilibrium condensation from a cooling gas of solar composition [2] and, therefore, can be a potential recorder of very early solar system processes. In this study, we describe the microstructures of hibonite from one CAI in ALH A77307 (CO3.0) using FIB/TEM techniques in order to reconstruct its formational history.

  5. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    Science.gov (United States)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research was to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on reducing benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve the results.
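
    As a toy illustration of the kind of correlation analysis described, one might correlate each candidate method's automated scores with expert LOS on the same lesions and select the best performer; the method names and data below are hypothetical:

    ```python
    # Toy version of selecting among candidate automated LOS scoring methods
    # by correlating each with expert LOS ratings. Entirely illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    expert = rng.integers(0, 6, size=40).astype(float)   # expert LOS, 40 masses
    methods = {
        "first_order_ranking": expert + rng.normal(scale=0.5, size=40),
        "weighted_sum":        expert + rng.normal(scale=1.2, size=40),
        "max_feature":         expert + rng.normal(scale=2.0, size=40),
    }
    scores = {name: np.corrcoef(expert, s)[0, 1] for name, s in methods.items()}
    best = max(scores, key=scores.get)
    print(scores, "-> select:", best)
    ```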

  6. Interactive Computer Lessons for Introductory Economics: Guided Inquiry-From Supply and Demand to Women in the Economy.

    Science.gov (United States)

    Miller, John; Weil, Gordon

    1986-01-01

    The interactive feature of computers is used to incorporate a guided inquiry method of learning introductory economics, extending the Computer Assisted Instruction (CAI) method beyond drills. (Author/JDH)

  7. CO-Bridged H-Cluster Intermediates in the Catalytic Mechanism of [FeFe]-Hydrogenase CaI.

    Science.gov (United States)

    Ratzloff, Michael W; Artz, Jacob H; Mulder, David W; Collins, Reuben T; Furtak, Thomas E; King, Paul W

    2018-06-20

    The [FeFe]-hydrogenases ([FeFe] H2ases) catalyze reversible H2 activation at the H-cluster, which is composed of a [4Fe-4S]H subsite linked by a cysteine thiolate to a bridged, organometallic [2Fe-2S] ([2Fe]H) subsite. Profoundly different geometric models of the H-cluster redox states that orchestrate the electron/proton transfer steps of H2 bond activation have been proposed. We have examined this question in the [FeFe] H2ase I from Clostridium acetobutylicum (CaI) by Fourier-transform infrared (FTIR) spectroscopy with temperature annealing and H/D isotope exchange to identify the relevant redox states and define catalytic transitions. One-electron reduction of Hox led to formation of HredH+ ([4Fe-4S]H2+-FeI-FeI) and Hred' ([4Fe-4S]H1+-FeII-FeI), with both states characterized by low-frequency μ-CO IR modes consistent with a fully bridged [2Fe]H. Similar μ-CO IR modes were also identified for HredH+ of the [FeFe] H2ase from Chlamydomonas reinhardtii (CrHydA1). The CaI proton-transfer variant C298S showed enrichment of an H/D isotope-sensitive μ-CO mode, a component of the hydride-bound H-cluster IR signal, Hhyd. Equilibrating CaI with increasing amounts of NaDT, probed at cryogenic temperatures, showed that HredH+ was converted to Hhyd. Over an increasing temperature range from 10 to 260 K, catalytic turnover led to loss of Hhyd and appearance of Hox, consistent with enzymatic turnover and H2 formation. The results show for CaI that the μ-CO of [2Fe]H remains bridging for all of the "Hred" states and that HredH+ is on pathway to Hhyd and H2 evolution in the catalytic mechanism. These results provide a blueprint for designing small-molecule catalytic analogs.

  8. Effect of Computer-Based Video Games on Children: An Experimental Study

    Science.gov (United States)

    Chuang, Tsung-Yen; Chen, Wei-Fan

    2009-01-01

    This experimental study investigated whether computer-based video games facilitate children's cognitive learning. In comparison to traditional computer-assisted instruction (CAI), this study explored the impact of the varied types of instructional delivery strategies on children's learning achievement. One major research null hypothesis was…

  9. Motivation in computer-assisted instruction.

    Science.gov (United States)

    Hu, Amanda; Shewokis, Patricia A; Ting, Kimberly; Fung, Kevin

    2016-08-01

    Computer-aided instruction (CAI) is defined as instruction in which computers play a central role as the means of information delivery and direct interaction with learners. Computer-aided instruction has become mainstream in medical school curricula. For example, a three-dimensional (3D) computer module of the larynx has been created to teach laryngeal anatomy. Although the novelty and educational potential of CAI has garnered much attention, these new technologies have been plagued with low utilization rates. Several experts attribute this problem to lack of motivation in students. Motivation is defined as the desire and action toward goal-oriented behavior. Psychologist Dr. John Keller developed the ARCS theory of motivational learning, which proposed four components: attention (A), relevance (R), confidence (C), and satisfaction (S). Keller believed that motivation is not only an innate characteristic of the pupil; it can also be influenced by external factors, such as the instructional design of the curriculum. Thus, understanding motivation is an important step toward designing CAI appropriately. Keller also developed a 36-item validated instrument called the Instructional Materials Motivation Survey (IMMS) to measure motivation. The objective of this study was to study motivation in CAI. Medical students learning anatomy with the 3D computer module will have higher laryngeal anatomy test scores and higher IMMS motivation scores. Higher anatomy test scores will be positively associated with higher IMMS scores. Prospective, randomized, controlled trial. After obtaining institutional review board approval, 100 medical students (mean age 25.5 ± 2.5, 49% male) were randomized to either the 3D computer module (n = 49) or written text (n = 51). Information content was identical in both arms. Students were given 30 minutes to study laryngeal anatomy and then completed the laryngeal anatomy test and IMMS. Students were categorized as either junior (year 1

  10. Conhecimento e uso de plantas em uma comunidade caiçara do litoral sul do Estado do Rio de Janeiro, Brasil Knowledge and use of plants in a Caiçara community located on the southern coast of Rio de Janeiro State, Brazil

    Directory of Open Access Journals (Sweden)

    Rodrigo Borges

    2009-09-01

    Full Text Available The Cairuçu Environmental Protection Area (APA), located in Paraty municipality, RJ, is a sustainable-use conservation unit intended to protect the natural environment and the caiçara communities of the region. The objective of this study was to carry out an ethnobotanical inventory of the plants known and used by the caiçara community living on Martim de Sá beach. Thirty people live in the locality, ten of whom were interviewed. Ethnobotanical information was obtained through participant observation and semi-structured interviews with the local residents. All botanical material collected was deposited in the herbarium of the Instituto de Pesquisas Jardim Botânico do Rio de Janeiro (RB). A total of 76 species belonging to 59 genera and 30 botanical families were considered useful by the caiçaras. The three most cited species were Sloanea obtusifolia (Sapopema), Scherolobium denudatum (Ingá-ferro) and Balizia pedicelaris (Timbuíba). The Shannon index (H' = 1.81, base 10) was used to analyze species diversity. The record of plant resource use in the studied community provides information that can be used in conservation programs based on local knowledge of the environment.
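
    For reference, the Shannon diversity index cited here (H' = 1.81, base-10 logarithms) is computed from species abundance counts as in the following sketch; the counts are invented for illustration:

    ```python
    # Shannon diversity with base-10 logs, as cited in the abstract;
    # the abundance counts below are hypothetical.
    import math

    def shannon_base10(counts):
        """H' = -sum(p_i * log10(p_i)) over species with nonzero counts."""
        n = sum(counts)
        return -sum((c / n) * math.log10(c / n) for c in counts if c > 0)

    counts = [23, 17, 11, 8, 8, 5, 4, 3, 2, 2, 1, 1]   # invented citations/species
    print(f"H' = {shannon_base10(counts):.2f}")
    ```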

  11. CaI and SrI molecules for iodine determination by high-resolution continuum source graphite furnace molecular absorption spectrometry: Greener molecules for practical application.

    Science.gov (United States)

    Zanatta, Melina Borges Teixeira; Nakadi, Flávio Venâncio; da Veiga, Márcia Andreia Mesquita Silva

    2018-03-01

    A new method to determine iodine in drug samples by high-resolution continuum source graphite furnace molecular absorption spectrometry (HR-CS GF MAS) has been developed. The method measures the molecular absorption of a diatomic molecule, CaI or SrI (less toxic molecule-forming reagents), at 638.904 or 677.692 nm, respectively, and uses a mixture containing 5 μg of Pd and 0.5 μg of Mg as chemical modifier. The method employs pyrolysis temperatures of 1000 and 800 °C and vaporization temperatures of 2300 and 2400 °C for CaI and SrI, respectively. The optimized amounts of Ca and Sr as molecule-forming reagents are 100 and 150 µg, respectively. On the basis of interference studies, even small chlorine concentrations reduce CaI and SrI absorbance significantly. The developed method was used to analyze different commercial drug samples, namely thyroid hormone pills with three different iodine amounts (15.88, 31.77, and 47.66 µg) and one liquid drug with 1% m v-1 active iodine in their compositions. The results agreed with the values informed by the manufacturers (95% confidence level) regardless of whether CaI or SrI was determined. Therefore, the developed method is useful for iodine determination on the basis of CaI or SrI molecular absorption. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Effects of Using Simultaneous Prompting and Computer-Assisted Instruction during Small Group Instruction

    Science.gov (United States)

    Ozen, Arzu; Ergenekon, Yasemin; Ulke-Kurkcuoglu, Burcu

    2017-01-01

    The current study investigated the relation between simultaneous prompting (SP), computer-assisted instruction (CAI), and the receptive identification of target pictures (presented on a laptop computer) for four preschool students with developmental disabilities. The students' acquisition of nontarget information through observational learning also…

  13. Effectiveness of a computer-based tutorial for teaching how to make a blood smear.

    Science.gov (United States)

    Preast, Vanessa; Danielson, Jared; Bender, Holly; Bousson, Maury

    2007-09-01

    Computer-aided instruction (CAI) was developed to teach veterinary students how to make blood smears. This instruction was intended to replace the traditional instructional method in order to promote efficient use of faculty resources while maintaining learning outcomes and student satisfaction. The purpose of this study was to evaluate the effect of a computer-aided blood smear tutorial on 1) the instructor's teaching time, 2) students' ability to make blood smears, and 3) students' ability to recognize smear quality. Three laboratory sessions for senior veterinary students were taught using traditional methods (control group) and 4 sessions were taught using the CAI tutorial (experimental group). Students in the control group received a short demonstration and lecture by the instructor at the beginning of the laboratory and then practiced making blood smears. Students in the experimental group received their instruction through the self-paced, multimedia tutorial on a laptop computer and then practiced making blood smears. Data were collected from observations, interviews, survey questionnaires, and smear evaluations by students and experts using a scoring rubric. Students using the CAI made better smears and were better able to recognize smear quality. The average time the instructor spent in the room was not significantly different between groups, but the quality of the instructor time was improved with the experimental instruction. The tutorial implementation effectively provided students and instructors with a teaching and learning experience superior to the traditional method of instruction. Using CAI is a viable method of teaching students to make blood smears.

  14. Calcium and Titanium Isotope Fractionation in CAIS: Tracers of Condensation and Inheritance in the Early Solar Protoplanetary Disk

    Science.gov (United States)

    Simon, J. I.; Jordan, M. K.; Tappa, M. J.; Kohl, I. E.; Young, E. D.

    2016-01-01

    The chemical and isotopic compositions of calcium-aluminum-rich inclusions (CAIs) can be used to understand the conditions present in the protoplanetary disk where they formed. The isotopic compositions of these early-formed nebular materials are largely controlled by chemical volatility. The isotopic effects of evaporation/sublimation, which are well explained by both theory and experimental work, lead to enrichments of the heavy isotopes that are often exhibited by the moderately refractory elements Mg and Si. Less well understood are the isotopic effects of condensation, which limits our ability to determine whether a CAI is a primary condensate and/or retains any evidence of its primordial formation history.

  15. Calcium-aluminum-rich inclusions with fractionation and unidentified nuclear effects (FUN CAIs): II. Heterogeneities of magnesium isotopes and 26Al in the early Solar System inferred from in situ high-precision magnesium-isotope measurements

    Science.gov (United States)

    Park, Changkun; Nagashima, Kazuhide; Krot, Alexander N.; Huss, Gary R.; Davis, Andrew M.; Bizzarro, Martin

    2017-03-01

    Calcium-aluminum-rich inclusions with isotopic mass fractionation effects and unidentified nuclear isotopic anomalies (FUN CAIs) have been studied for more than 40 years, but their origins remain enigmatic. Here we report in situ high precision measurements of aluminum-magnesium isotope systematics of FUN CAIs by secondary ion mass spectrometry (SIMS). Individual minerals were analyzed in six FUN CAIs from the oxidized CV3 carbonaceous chondrites Axtell (compact Type A CAI Axtell 2271) and Allende (Type B CAIs C1 and EK1-4-1, and forsterite-bearing Type B CAIs BG82DH8, CG-14, and TE). Most of these CAIs show evidence for excess 26Mg due to the decay of 26Al. The inferred initial 26Al/27Al ratios [(26Al/27Al)0] and the initial magnesium isotopic compositions (δ26Mg0) calculated using an exponential law with an exponent β of 0.5128 are (3.1 ± 1.6) × 10-6 and 0.60 ± 0.10‰ (Axtell 2271), (3.7 ± 1.5) × 10-6 and -0.20 ± 0.05‰ (BG82DH8), (2.2 ± 1.1) × 10-6 and -0.18 ± 0.05‰ (C1), (2.3 ± 2.4) × 10-5 and -2.23 ± 0.37‰ (EK1-4-1), (1.5 ± 1.1) × 10-5 and -0.42 ± 0.08‰ (CG-14), and (5.3 ± 0.9) × 10-5 and -0.05 ± 0.08‰ (TE) with 2σ uncertainties. We infer that FUN CAIs recorded heterogeneities of magnesium isotopes and 26Al in the CAI-forming region(s). Comparison of 26Al-26Mg systematics, stable isotope (oxygen, magnesium, calcium, and titanium) and trace element studies of FUN and non-FUN igneous CAIs indicates that there is a continuum among these CAI types. Based on these observations and evaporation experiments on CAI-like melts, we propose a generic scenario for the origin of igneous (FUN and non-FUN) CAIs: (i) condensation of isotopically normal solids in an 16O-rich gas of approximately solar composition; (ii) formation of CAI precursors by aggregation of these solids together with variable abundances of isotopically anomalous grains-possible carriers of unidentified nuclear (UN) effects; and (iii) melt evaporation of these precursors
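
    For readers unfamiliar with the exponential-law correction used above (β = 0.5128), the radiogenic excess δ26Mg* is the measured δ26Mg minus the mass-dependent component predicted from δ25Mg. A small Python sketch with illustrative, non-measured values:

    ```python
    # Separating radiogenic delta26Mg* from mass-dependent fractionation
    # with the exponential law and beta = 0.5128; numbers are illustrative.
    def delta26mg_star(d25, d26, beta=0.5128):
        """Radiogenic 26Mg excess (permil): measured delta26Mg minus the
        mass-dependent value predicted from delta25Mg by the exponential law."""
        mass_dependent = ((1.0 + d25 / 1000.0) ** (1.0 / beta) - 1.0) * 1000.0
        return d26 - mass_dependent

    # e.g. a strongly fractionated, FUN-like composition:
    print(f"{delta26mg_star(d25=15.0, d26=29.6):.2f} permil excess")
    ```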

  16. Attitudes of health care students about computer-aided neuroanatomy instruction.

    Science.gov (United States)

    McKeough, D Michael; Bagatell, Nancy

    2009-01-01

    This study examined students' attitudes toward computer-aided instruction (CAI), specifically neuroanatomy learning modules, to assess which components were primary in establishing these attitudes and to discuss the implications of these attitudes for successfully incorporating CAI in the preparation of health care providers. Seventy-seven master's degree, entry-level health care professional students enrolled in an introductory neuroanatomy course volunteered as subjects for this study. Students independently reviewed the modules as supplements to lecture and completed a survey to evaluate teaching effectiveness. Responses to survey statements were compared across the learning modules to determine if students viewed the modules differently. Responses to individual survey statements were averaged to measure the strength of agreement or disagreement with the statement. Responses to open-ended questions were theme coded, and frequencies and percentages were calculated for each. Students saw no differences between the learning modules. Students perceived the learning modules as valuable; they enjoyed using the modules but did not prefer CAI over the traditional lecture format. The modules were useful in learning or reinforcing neuroanatomical concepts and improving clinical problem-solving skills. Students reported that the visual representation of the neuroanatomical systems, computer animation, ability to control the use of the modules, and navigational fidelity were key factors in determining attitudes. The computer-based learning modules examined in this study were effective as adjuncts to lecture in helping entry-level health care students learn and make clinical applications of neuroanatomy information.

  17. If You Meet the Computer Guru on the Road, Kill Him (or Her).

    Science.gov (United States)

    Gore, Kay

    1989-01-01

    Discusses problems and misconceptions concerning the appropriate use of computers in K-12 classrooms. The use of software to support computer-assisted instruction (CAI) is described, teacher-written software is discussed, telecommunications issues are considered, and the role of administrators and teachers is examined. (two references) (LRW)

  18. Computer-Assisted Instruction and Continuing Motivation.

    Science.gov (United States)

    Mosley, Mary Lou; And Others

    Effects of two feedback conditions--comment and no comment--on the motivation of sixth grade students to continue with computer assisted instruction (CAI) were investigated, and results for boys and for girls were compared. Subjects were 62 students--29 boys and 33 girls--from a suburban elementary school who were randomly assigned to the comment…

  19. Effect of Tutorial Mode of Computer-Assisted Instruction on Students ...

    African Journals Online (AJOL)

    This study investigated the effect of the Tutorial Mode of Computer-Assisted Instruction (CAI) on students' academic performance in practical geography in Nigeria. The sample population consisted of eighty (80) Senior Secondary School Two geography students who were randomly selected from two privately owned secondary ...

  20. Changes of Benthic Macroinvertebrates in Thi Vai River and Cai Mep Estuaries Under Polluted Conditions with Industrial Wastewater

    Directory of Open Access Journals (Sweden)

    Huong Nguyen Thi Thanh

    2017-06-01

    The pollution of the Thi Vai River has been spreading rapidly over the last two decades, caused by wastewater from the industrial parks on the left bank of the Thi Vai River and Cai Mep Estuaries. Evaluation of changes in the benthic macroinvertebrates was necessary to identify the consequences of the industrial wastewater for the water quality and aquatic ecosystem of the Thi Vai River and Cai Mep Estuaries. In this study, benthic macroinvertebrates and water quality were investigated in the Thi Vai River and Cai Mep Estuaries, Southern Vietnam. The monitoring data for benthic macroinvertebrates and water quality parameters covered the period from 1989 to 2015 at 6 sampling sites in the Thi Vai River and Cai Mep Estuaries. The basic water quality parameters tested included pH, dissolved oxygen (DO), total nitrogen, and total phosphorus. Biodiversity indices of the benthic macroinvertebrates were applied for water quality assessment. The results showed that pH ranged from 6.4 - 7.6 during the monitoring. DO concentrations were between 0.20 - 6.70 mg/L. The concentrations of total nitrogen and total phosphorus ranged from 0.03 - 5.70 mg/L and 0.024 - 1.380 mg/L, respectively. The macroinvertebrate community in the study area consisted of 36 species of polychaeta, gastropoda, bivalvia, and crustacea, of which the polychaeta were dominant in species number. Benthic macroinvertebrate density ranged from 0 - 2,746 individuals/m2, with the main dominant species Neanthes caudata, Prionospio malmgreni, Paraprionospio pinnata, Trichochaeta carica, Maldane sarsi, Capitella capitata, Terebellides stroemi, Euditylia polymorpha, Grandidierella lignorum, and Apseudes vietnamensis. The biodiversity index values during the monitoring characterized aquatic environmental conditions from mesotrophic to polytrophic. Species richness correlated positively with DO, total nitrogen, and total phosphorus. The results
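
    The record applies biodiversity indices of the benthic macroinvertebrate community to assess water quality but does not spell out which index was used. As an illustration only, a minimal sketch of one widely used choice, the Shannon-Wiener index, with invented abundance data:

```python
import math

def shannon_index(counts):
    """Shannon-Wiener diversity H' = -sum(p_i * ln(p_i)) over taxa with counts > 0."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

# Hypothetical abundances (individuals per sample) for a few polychaete taxa
sample = [120, 45, 30, 5]
print(f"H' = {shannon_index(sample):.2f}")
```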

  1. On native Danish learners' challenges in distinguishing /tai/, /cai/ and /zai/

    DEFF Research Database (Denmark)

    Sloos, Marjoleine; Zhang, Chun

    2015-01-01

    University participated in an ABX experiment. They were auditorily presented pairs of the critical stimuli tai-cai-zai, te-ce-ze and tuo-cuo-zuo combined with all four tones and alternated with fillers. The subjects indicated for each pair which of the two words matched the pinyin description. The expected...... results show that beginner learners perform on chance level regarding the distinction between t and z and between c and z. The reason is that in Danish, which has an aspiration contrast between plosives (like Chinese) /th/ is variably pronounced as affricated /ts/ and many speakers are unaware...

  2. Optimizing Computer Assisted Instruction By Applying Principles of Learning Theory.

    Science.gov (United States)

    Edwards, Thomas O.

    The development of learning theory and its application to computer-assisted instruction (CAI) are described. Among the early theoretical constructs thought to be important are E. L. Thorndike's concept of connectionism, Neal Miller's theory of motivation, and B. F. Skinner's theory of operant conditioning. Early devices incorporating those…

  3. Bird community structure in riparian environments in Cai River, Rio Grande do Sul, Brazil

    OpenAIRE

    Jaqueline Brummelhaus; Marcia Suelí Bohn; Maria Virginia Petry

    2012-01-01

    Urbanization produces changes in riparian environments, causing effects in the structure of bird communities, which present different responses to the impacts. We compare species richness, abundance, and composition of birds in riparian environments with different characteristics in Cai River, Rio Grande do Sul, Brazil. We carried out observations in woodland, grassland, and urban environments, between September 2007 and August 2008. We listed 130 bird species, 29 species unique to woodland e...

  4. Computer-aided instruction system

    International Nuclear Information System (INIS)

    Teneze, Jean Claude

    1968-01-01

    This research thesis addresses the use of teleprocessing and time sharing by the RAX IBM system and the possibility to introduce a dialog with the machine to develop an application in which the computer plays the role of a teacher for different pupils at the same time. Two operating modes are thus exploited: a teacher-mode and a pupil-mode. The developed CAI (computer-aided instruction) system comprises a checker to check the course syntax in teacher-mode, a translator to trans-code the course written in teacher-mode into a form which can be processed by the execution programme, and the execution programme which presents the course in pupil-mode

  5. Dietetics students' ability to choose appropriate communication and counseling methods is improved by teaching behavior-change strategies in computer-assisted instruction.

    Science.gov (United States)

    Puri, Ruchi; Bell, Carol; Evers, William D

    2010-06-01

    Several models and theories have been proposed to help registered dietitians (RD) counsel and communicate nutrition information to patients. However, there is little time for students or interns to observe and/or participate in counseling sessions. Computer-assisted instruction (CAI) can be used to give students more opportunity to observe the various methods and theories of counseling. This study used CAI simulations of RD-client communications to examine whether students who worked through the CAI modules would choose more appropriate counseling methods. Modules were created based on information from experienced RD. They contained videos of RD-patient interactions and demonstrated helpful and less helpful methods of communication. Students in didactic programs in dietetics accessed the modules via the Internet. The intervention group of students received a pretest module, two tutorial modules, and a posttest module. The control group only received the pretest and posttest modules. Data were collected during three semesters in 2006 and 2007. Two-sample t tests were used to compare pretest and posttest scores. The influence of other factors was measured using factorial analysis of variance. Statistical significance was set at P<.05. The results indicated that the CAI modules improved dietetics students' choice of appropriate communication and counseling methods.

  6. Investigating the Effects of Computer-Assisted Instruction on Achievement and Attitudes towards Mathematics among Seventh-Grade Students in Kuwait

    Science.gov (United States)

    Soliman, Mamdouh M.; Hilal, Ahmed J.

    2016-01-01

    This study evaluates the effectiveness of Computer-Assisted Instruction (CAI) compared with traditional classroom instruction of mathematics of seventh graders in Kuwait's public schools. We aimed to compare students' learning outcomes between two groups: the control group, taught traditionally without the use of computers, and the experimental…

  7. Multiple Nebular Gas Reservoirs Recorded by Oxygen Isotope Variation in a Spinel-Rich CAI in CO3 MIL 090019

    Science.gov (United States)

    Simon, J. I.; Simon, S. B.; Nguyen, A. N.; Ross, D. K.; Messenger, S.

    2017-07-01

    We conducted NanoSIMS ion imaging studies of a primitive spinel-rich CAI from the MIL 090019 CO3 chondrite. It records radial O-isotopic heterogeneity among multiple occurrences of the same mineral, reflecting distinct nebular O-isotopic reservoirs.

  8. Complete genome sequence of Defluviimonas alba cai42T, a microbial exopolysaccharides producer.

    Science.gov (United States)

    Zhao, Jie-Yu; Geng, Shuang; Xu, Lian; Hu, Bing; Sun, Ji-Quan; Nie, Yong; Tang, Yue-Qin; Wu, Xiao-Lei

    2016-12-10

    Defluviimonas alba cai42T, isolated from oil-production water in the Xinjiang Oilfield in China, has a strong ability to produce exopolysaccharides (EPS). We hereby present its complete genome sequence, which consists of a circular chromosome and three plasmids. The strain characteristically contains various genes encoding enzymes involved in EPS biosynthesis, modification, and export. According to the genomic and physiochemical data, the strain has the potential to be utilized in industrial production of microbial EPS.

  9. Teaching French Transformational Grammar by Means of Computer-Generated Video-Tapes.

    Science.gov (United States)

    Adler, Alfred; Thomas, Jean Jacques

    This paper describes a pilot program in an integrated media presentation of foreign languages and the production and usage of seven computer-generated video tapes which demonstrate various aspects of French syntax. This instructional set could form the basis for CAI lessons in which the student is presented images identical to those on the video…

  10. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  11. A Randomized Trial of Two Promising Computer-Based Interventions for Students with Attention Difficulties

    Science.gov (United States)

    Rabiner, David L.; Murray, Desiree W.; Skinner, Ann T; Malone, Patrick S.

    2010-01-01

    Few studies have examined whether attention can be improved with training, even though attention difficulties adversely affect academic achievement. The present study was a randomized-controlled trial evaluating the impact of Computerized Attention Training (CAT) and Computer Assisted Instruction (CAI) on attention and academic performance in 77…

  12. New directions in e-learning research in health professions education: Report of two symposia.

    Science.gov (United States)

    Triola, Marc M; Huwendiek, Sören; Levinson, Anthony J; Cook, David A

    2012-01-01

    The use of Computer Assisted Instruction (CAI) is rising across health professions education. Research to date is of limited use in guiding the implementation and selection of CAI innovations. In the context of two symposia, systematic reviews were discussed that evaluate the literature on Internet-based learning, Virtual Patients, and animations. Each session included a debate with the goal of reaching consensus on best current practices and future research. Thematic analysis of the discussions was performed to arrange the questions by theme, eliminate redundancy, and craft them into a cohesive narrative. The question analysis revealed that there are clear advantages to the use of CAI, and that established educational theories should certainly inform the future development and selection of CAI tools. Schools adopting CAI need to carefully consider the benefits, cost, available resources, and capacity for teachers and learners to accept change in their practice of education. Potential areas for future research should focus on the effectiveness of CAI instructional features, integration of e-learning into existing curricula and with other modalities like simulation, and the use of CAI in assessment of higher-level outcomes. There are numerous opportunities for future research and it will be important to achieve consensus on important themes.

  13. From Corporate Social Responsibility, through Entrepreneurial Orientation, to Knowledge Sharing: A Study in Cai Luong (Renovated Theatre) Theatre Companies

    Science.gov (United States)

    Tuan, Luu Trong

    2015-01-01

    Purpose: This paper aims to examine the role of antecedents such as corporate social responsibility (CSR) and entrepreneurial orientation in the chain effect to knowledge sharing among members of Cai Luong theatre companies in the Vietnamese context. Knowledge sharing contributes to the depth of the knowledge pool of both the individuals and the…

  14. Crystal Growth and Scintillation Properties of Eu2+ doped Cs4CaI6 and Cs4SrI6

    Science.gov (United States)

    Stand, L.; Zhuravleva, M.; Chakoumakos, B.; Johnson, J.; Loyd, M.; Wu, Y.; Koschan, M.; Melcher, C. L.

    2018-03-01

    In this work we present the crystal growth and scintillation properties of two new ternary metal halide scintillators activated with divalent europium, Cs4CaI6 and Cs4SrI6. Single crystals of each compound were grown in evacuated quartz ampoules via the vertical Bridgman technique using a two-zone transparent furnace. Single-crystal X-ray diffraction experiments showed that both crystals have a trigonal (R-3c) structure, with densities of 3.99 g/cm3 and 4.03 g/cm3, respectively. The radioluminescence and photoluminescence measurements showed typical luminescence properties due to the 5d-4f radiative transitions in Eu2+. At this early stage of development Cs4SrI6:Eu and Cs4CaI6:Eu have shown very promising scintillation properties, with light yields and energy resolutions of 62,300 photons/MeV and 3.3%, and 51,800 photons/MeV and 3.6% at 662 keV, respectively.

  15. Learning Auditory Discrimination with Computer-Assisted Instruction: A Comparison of Two Different Performance Objectives.

    Science.gov (United States)

    Steinhaus, Kurt A.

    A 12-week study of two groups of 14 college freshmen music majors was conducted to determine which group demonstrated greater achievement in learning auditory discrimination using computer-assisted instruction (CAI). The method employed was a pre-/post-test experimental design using subjects randomly assigned to a control group or an experimental…

  16. Establishing Computer-Assisted Instruction to Teach Academics to Students with Autism as an Evidence-Based Practice

    Science.gov (United States)

    Root, Jenny R.; Stevenson, Bradley S.; Davis, Luann Ley; Geddes-Hall, Jennifer; Test, David W.

    2017-01-01

    Computer-assisted instruction (CAI) is growing in popularity and has demonstrated positive effects for students with disabilities, including those with autism spectrum disorder (ASD). In this review, criteria for group experimental and single case studies were used to determine quality (Horner et al., "Exceptional Children" 71:165-179,…

  17. Two years since SSAMS: Status of 14C AMS at CAIS

    Energy Technology Data Exchange (ETDEWEB)

    Ravi Prasad, G.V.; Cherkinsky, Alexander; Culp, Randy A.; Dvoracek, Doug K.

    2015-10-15

    The NEC 250 kV single stage AMS accelerator (SSAMS) was installed two years ago at the Center for Applied Isotope Studies (CAIS), University of Georgia. The accelerator is primarily used for radiocarbon measurements to test the authenticity of natural and bio-based samples, while all other samples, such as geological, atmospheric, marine and archaeological ones, are run on the 500 kV NEC 1.5SDH-1 model tandem accelerator, which has been operating since 2001. The data obtained over a six-month period for OXI, OXII, ANU sucrose and FIRI-D are discussed. The mean value of ANU sucrose was observed to be slightly lower than the consensus value. The processed blanks on SSAMS produce a lower apparent age compared to the tandem accelerator, as expected.
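
    The record compares the apparent ages of processed blanks between the two accelerators. The conventional conversion from a measured fraction modern to a radiocarbon age uses the Libby mean life of 8033 years; a minimal sketch with hypothetical blank values:

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; defines the conventional radiocarbon age

def radiocarbon_age(fraction_modern):
    """Conventional 14C age (years BP) from fraction modern carbon."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# Hypothetical blank measurements: a lower fraction modern means an older apparent age
for fm in (0.0020, 0.0035):
    print(f"Fm = {fm:.4f} -> apparent age = {radiocarbon_age(fm):,.0f} yr BP")
```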

  18. The Motivational Effects of Types of Computer Feedback on Children's Learning and Retention of Relational Concepts.

    Science.gov (United States)

    Armour-Thomas, Eleanor; And Others

    The effects of different types of feedback in computer assisted instruction (CAI) on relational concept learning by young children were compared in this study. Subjects were 89 kindergarten students whose primary language was English, and whose performance on the Boehm Test of Basic Concepts was within the average range chosen from classes in a…

  19. Effects of the Memorization of Rule Statements on Performance, Retention, and Transfer in a Computer-Based Learning Task.

    Science.gov (United States)

    Towle, Nelson J.

    Research sought to determine whether memorization of rule statements before, during or after instruction in rule application skills would facilitate the acquisition and/or retention of rule-governed behavior as compared to no-rule statement memorization. A computer-assisted instructional (CAI) program required high school students to learn to a…

  20. Uncovering driving forces on greenhouse gas emissions in China's aluminum industry from the perspective of life cycle analysis

    International Nuclear Information System (INIS)

    Liu, Zhe; Geng, Yong; Adams, Michelle; Dong, Liang; Sun, Lina; Zhao, Jingjing; Dong, Huijuan; Wu, Jiao; Tian, Xu

    2016-01-01

    Highlights: • Energy-related GHG emission trajectories, features and driving forces of CAI are analyzed from the perspective of LCA. • CAI experienced rapid growth of energy-related GHG emissions from 2004 to 2013. • The energy-scale effect is the main driving force of the increase in energy-related GHG emissions in CAI. • Construction and transportation-related activities account for more than 40% of the total embodied emissions. • Policy implications, such as developing the secondary aluminum industry and improving the energy mix, are raised. - Abstract: With the rapid growth of aluminum production, reducing greenhouse gas (GHG) emissions in China's aluminum industry (CAI) poses a significant challenge. In this study, the energy-related GHG emission trajectories, features and driving forces of CAI are analyzed from the perspective of life cycle analysis (LCA) for 2004 to 2013. Results indicate that CAI experienced rapid growth of energy-related GHG emissions, with an average annual growth of 28.5 million tons CO2e from 2004 to 2013. The energy-scale effect is the main driving force of the increase in energy-related GHG emissions in CAI, while the emission-factor effect of secondary aluminum production plays only a marginal role. Construction and transportation-related activities account for the bulk of the embodied emissions, more than 40% of the total from CAI. Policy implications for GHG mitigation within the CAI, such as developing the secondary aluminum industry, improving the energy mix and optimizing the resource efficiency of production, are raised.
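
    The abstract decomposes emission growth into driving forces such as the energy-scale and emission-factor effects. The record does not name its decomposition method; assuming an LMDI-style (logarithmic mean Divisia index) decomposition, which is standard for such driving-force analyses, a minimal two-factor sketch with invented numbers:

```python
import math

def log_mean(a, b):
    """Logarithmic mean, the weighting function used in LMDI decomposition."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

def lmdi_effects(c0, cT, factors0, factorsT):
    """Additive LMDI: contribution of each factor to the change cT - c0,
    where emissions are modeled as the product of the factors."""
    w = log_mean(cT, c0)
    return {name: w * math.log(factorsT[name] / factors0[name]) for name in factors0}

# Hypothetical two-factor example: emissions = energy use * emission factor
f2004 = {"energy": 100.0, "emission_factor": 0.80}
f2013 = {"energy": 320.0, "emission_factor": 0.75}
c2004 = f2004["energy"] * f2004["emission_factor"]
c2013 = f2013["energy"] * f2013["emission_factor"]

effects = lmdi_effects(c2004, c2013, f2004, f2013)
# The effects sum exactly to the total emission change
print(effects, "sum =", sum(effects.values()), "vs", c2013 - c2004)
```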

  1. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today...... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....

  2. Dietary Changes over Time in a Caiçara Community from the Brazilian Atlantic Forest

    Directory of Open Access Journals (Sweden)

    Priscila L. MacCord

    2006-12-01

    Because they are occurring at an accelerated pace, changes in the livelihoods of local coastal communities, including nutritional aspects, have been a subject of interest in human ecology. The aim of this study is to explore the dietary changes, particularly in the consumption of animal protein, that have taken place in Puruba Beach, a rural community of caiçaras on the São Paulo Coast, Brazil, over the 10-yr period from 1992-1993 to 2002-2003. Data were collected during six months in 1992-1993 and during the same months in 2002-2003 using the 24-hr recall method. We found an increasing dependence on external products in the most recent period, along with a reduction in fish consumption and in the number of fish species eaten. These changes, possibly associated with other nonmeasured factors such as overfishing and unplanned tourism, may cause food delocalization and a reduction in the use of natural resources. Although the consequences for conservation efforts in the Atlantic Forest and the survival of the caiçaras must still be evaluated, these local inhabitants may be finding a way to reconcile both the old and the new dietary patterns by keeping their houses in the community while looking for sources of income other than natural resources. The prospect shown here may reveal facets that can influence the maintenance of this and other communities undergoing similar processes by, for example, shedding some light on the ecological and economical processes that may occur within their environment and in turn affect the conservation of the resources upon which the local inhabitants depend.

  3. Computer-aided instruction system; Systeme d'enseignement programme par ordinateur

    Energy Technology Data Exchange (ETDEWEB)

    Teneze, Jean Claude

    1968-12-18

    This research thesis addresses the use of teleprocessing and time sharing by the RAX IBM system and the possibility to introduce a dialog with the machine to develop an application in which the computer plays the role of a teacher for different pupils at the same time. Two operating modes are thus exploited: a teacher-mode and a pupil-mode. The developed CAI (computer-aided instruction) system comprises a checker to check the course syntax in teacher-mode, a translator to trans-code the course written in teacher-mode into a form which can be processed by the execution programme, and the execution programme which presents the course in pupil-mode.

  4. Corporate Involvement in CAI

    Science.gov (United States)

    Baker, Justine C.

    1978-01-01

    Historic perspective of computer manufacturers and their contribution to CAI. Corporate CAI products and services are mentioned, as is a forecast for educational involvement by computer corporations. A chart of major computer corporations shows gross sales, net earnings, products and services offered, and other corporate information. (RAO)

  5. A New Type of Foreign Clast in A Polymict Ureilite: A CAI or AL-Rich Chondrule

    Science.gov (United States)

    Goodrich, C. A.; Ross, D. K.; Treiman, A. H.

    2017-01-01

    Introduction: Polymict ureilites are breccias interpreted to represent regolith formed on a ureilitic asteroid [1-3]. They consist of approximately 90-95% clasts of various ureilite types (olivine-pyroxene rocks with Fo 75-95), a few % indigenous feldspathic clasts, and a few % foreign clasts [4-20]. The foreign clasts are diverse, including fragments of H, L, LL and R chondrites, angrites, other achondrites, and dark clasts similar to CC [6,7,9-19]. We report a new type of foreign clast in polymict ureilite DaG 999. Methods: Clast 8 in Dar al Gani (DaG) 999/1 (Museum für Naturkunde) was discovered during a survey of feldspathic clasts in polymict ureilites [19,20]. It was studied by BEI, EMPA, and X-ray mapping on the JEOL 8530F electron microprobe at ARES, JSC. Petrography and Mineral Compositions: Clast 8 is sub-rounded to irregular in shape, approximately 85 micrometers in diameter, and consists of approximately 68% pyroxene and 32% mesostasis (by area). Part of the pyroxene (top half of clast in Fig. 1a and 2) shows a coarse dendritic morphology; the rest appears massive. Mesostasis may be glassy and contains fine needles/grains of pyroxene. The pyroxene has very high CaO (23.5 wt.%) and Al2O3 (19.7 wt.%), with the formula (Ca0.91Mg0.63Fe0.01Al[VI]0.38Cr0.01Ti0.05)Σ1.99Si2O6. The bulk mesostasis also has very high Al2O3 (approximately 26 wt.%). A bulk composition for the clast was obtained by combining modal abundances with phase compositions (Table 1, Fig. 3). Discussion: The pyroxene in clast 8 has a Ca-Al-(Ti)-rich (fassaitic) composition that is clearly distinct from the compositions of pyroxenes in main group ureilites [22] or indigenous feldspathic clasts in polymict ureilites [4-8]. It also has significantly higher Al than fassaite in angrites (up to approximately 12 wt.% [23]), which occur as xenoliths in polymict ureilites. Ca-Al-Ti-rich pyroxenes are most commonly found in CAIs, Al-rich chondrules and other types of refractory
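
    The bulk composition of clast 8 was obtained by combining modal abundances with phase compositions. A minimal sketch of that mode-weighted recombination, using the modes and the pyroxene CaO/Al2O3 and mesostasis Al2O3 values reported above with placeholder values for the remaining oxides (a rigorous reconstruction would also weight by phase density):

```python
# Phase compositions in wt.% oxides; pyroxene CaO/Al2O3 and mesostasis Al2O3
# come from the record, the remaining values are placeholders.
phases = {
    "pyroxene":   {"mode": 0.68, "SiO2": 44.0, "Al2O3": 19.7, "CaO": 23.5, "MgO": 9.3},
    "mesostasis": {"mode": 0.32, "SiO2": 48.0, "Al2O3": 26.0, "CaO": 18.0, "MgO": 5.0},
}

def bulk_composition(phases):
    """Mode-weighted average of phase compositions. Area modes are used here
    as a proxy for mass fractions; a real reconstruction would correct for density."""
    oxides = [k for k in next(iter(phases.values())) if k != "mode"]
    return {ox: sum(p["mode"] * p[ox] for p in phases.values()) for ox in oxides}

print(bulk_composition(phases))
```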

  6. Using dynamic software in mathematics: the case of reflection symmetry

    Science.gov (United States)

    Tatar, Enver; Akkaya, Adnan; Berrin Kağizmanli, Türkan

    2014-10-01

    This study was carried out to examine the effects of computer-assisted instruction (CAI) using dynamic software on students' achievement in mathematics in the topic of reflection symmetry. The study also aimed to ascertain the pre-service mathematics teachers' opinions on the use of CAI in mathematics lessons. A mixed research method was used. The study group consisted of 30 pre-service mathematics teachers. The data collection tools included a reflection knowledge test, a survey and observations. Based on the analysis of the data obtained, the use of CAI had a positive effect on the pre-service mathematics teachers' achievement in the topic of reflection symmetry. The pre-service mathematics teachers largely considered that mathematics education carried out with CAI would be more beneficial in terms of 'visualization', 'saving of time' and 'increasing interest/attention in the lesson'. In addition, the vast majority of them would consider using computers in their teaching, on the condition that the learning environment in which they operate has the appropriate technological equipment.

  7. Computer-Assisted Instruction in the N.W.T.

    Science.gov (United States)

    Garraway, Tom

    For the past seven years, the Division of Educational Research Services at the University of Alberta has been operating an IBM 1500 CAI system. This paper describes demonstration projects set up in anticipation of the establishment of remote CAI in the North West Territories. These include a moon landing simulation program; a diagnostic program in…

  8. The Effectiveness of a Computer-Assisted Instruction Package in Supplementing Teaching of Selected Concepts in High School Chemistry: Writing Formulas and Balancing Chemical Equations.

    Science.gov (United States)

    Wainwright, Camille L.

    Four classes of high school chemistry students (N=108) were randomly assigned to experimental and control groups to investigate the effectiveness of a computer assisted instruction (CAI) package during a unit on writing/naming of chemical formulas and balancing equations. Students in the experimental group received drill, review, and reinforcement…

  9. The Use of Computer-Assisted Instruction as an Instructional Tool to Teach Social Stories to Individuals Who Have Been Diagnosed on the Autism Spectrum

    Directory of Open Access Journals (Sweden)

    Nanette Edeiken-Cooperman

    2014-10-01

    This article discusses the use of computer-assisted instruction (CAI) to teach Social Stories as a method of positively affecting the social understanding and behaviors of learners who have been diagnosed on the autism spectrum. As the diagnosis rate for those with ASD continues to rise, along with the practice of including these learners in the general education environment, there is an increasing need to identify evidence-based practices that focus on the acquisition and remediation of social-communication skills, social skills, and social competence. A variety of strategies have been developed and implemented to remediate deficits in these areas. The use of CAI is an approach that has been proven to accommodate both the needs and the visual learning styles of these learners.

  10. Development of a learning-oriented computer assisted instruction designed to improve skills in the clinical assessment of the nutritional status: a pilot evaluation.

    Science.gov (United States)

    García de Diego, Laura; Cuervo, Marta; Martínez, J Alfredo

    2015-01-01

    Computer-assisted instruction (CAI) is an effective tool for evaluating and training students and professionals. In this article we present a learning-oriented CAI, developed for students and health professionals to acquire and retain new knowledge through practice. A two-phase pilot evaluation was conducted, involving 8 nutrition experts and 30 postgraduate students, respectively. In each training session, the software guides users in the integral evaluation of a patient's nutritional status and helps them to implement actions. The program includes clinical tools that can be used to recognize possible patient needs, improve clinical reasoning and develop professional skills. Among them are assessment questionnaires and evaluation criteria, cardiovascular risk charts, clinical guidelines and photographs of various diseases. This CAI is a complete, versatile and easy-to-use software package aimed at clinical specialists, medical staff, scientists, educators and clinical students, which can be used as a learning tool. The application constitutes an advanced method for students and health professionals to accomplish nutritional assessments, combining theoretical and empirical issues, and can be implemented in their academic curriculum.

  11. A study of the effects of gender and different instructional media (computer-assisted instruction tutorials vs. textbook) on student attitudes and achievement in a team-taught integrated science class

    Science.gov (United States)

    Eardley, Julie Anne

    The purpose of this study was to determine the effect of different instructional media (computer-assisted instruction (CAI) tutorial vs. traditional textbook) on student attitudes toward science and computers and on achievement scores in a team-taught integrated science course, ENS 1001, "The Whole Earth Course," which was offered at Florida Institute of Technology during the Fall 2000 term. The effect of gender on student attitudes toward science and computers and achievement scores was also investigated. This study employed a randomized pretest-posttest control group experimental research design with a sample of 30 students (12 males and 18 females). Students had registered for weekly lab sessions that accompanied the course and had been randomly assigned to the treatment or control group. The treatment group used a CAI tutorial for completing homework assignments and the control group used the required textbook for completing homework assignments. The Attitude toward Science and Computers Questionnaire and an Achievement Test were the two instruments administered during this study to measure students' attitude and achievement score changes. A multivariate analysis of covariance (MANCOVA), using hierarchical multiple regression/correlation (MRC), was employed to determine: (1) treatment versus control group attitude and achievement differences; and (2) male versus female attitude and achievement differences. The differences between the treatment group's and control group's homework averages were determined by t test analyses. The overall MANCOVA model was found to be significant at p < .05. Analyzing the factor-set independent variables separately resulted in gender being the only variable that significantly contributed to explaining the variability in a dependent variable, attitudes toward science and computers. T test analyses of the homework averages showed no significant differences. Contrary to the findings of this study, anecdotal information from personal communication, course

  12. Revisiting cognitive and learning styles in computer-assisted instruction: not so useful after all.

    Science.gov (United States)

    Cook, David A

    2012-06-01

    In a previous systematic review, the author proposed that adaptation to learners' cognitive and learning styles (CLSs) could improve the efficiency of computer-assisted instruction (CAI). In the present article, he questions that proposition, arguing that CLSs do not make a substantive difference in CAI. To support this argument, the author performed an updated systematic literature search, pooled new findings with those from the previous review, and reinterpreted this evidence with a focus on aptitude-treatment interactions. (An aptitude-treatment interaction occurs when a student with attribute 1 learns better with instructional approach A than with approach B, whereas a student with attribute 2 learns better with instructional approach B.) Of 65 analyses reported in 48 studies, only 9 analyses (14%) showed significant interactions between CLS and instructional approach. It seems that aptitude-treatment interactions with CLSs are at best infrequent and small in magnitude. There are several possible explanations for this lack of effect. First, the influence of strong instructional methods likely dominates the impact of CLSs. Second, current methods for assessing CLSs lack validity evidence and are inadequate to accurately characterize the individual learner. Third, theories are vague, and empiric evidence is virtually nonexistent to guide the planning of style-targeted instructional designs. Adaptation to learners' CLSs thus seems unlikely to enhance CAI. The author recommends that educators focus on employing strong instructional methods. Educators might also consider assessing and adapting to learners' prior knowledge or allowing learners to select among alternate instructional approaches.
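
    An aptitude-treatment interaction corresponds to the style-by-method interaction term in a crossed factorial model. A minimal sketch of how such an interaction is commonly tested, on simulated data (the variable names and effect sizes are invented, not taken from the review):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200

# Simulated study: two learner "styles" crossed with two instructional methods.
df = pd.DataFrame({
    "style": rng.choice(["visual", "verbal"], n),
    "method": rng.choice(["A", "B"], n),
})
# Outcome with main effects only, i.e. no aptitude-treatment interaction built in.
df["score"] = (70
               + 3.0 * (df["method"] == "A")
               + 1.5 * (df["style"] == "visual")
               + rng.normal(0, 5, n))

# The C(style):C(method) row of the ANOVA table tests the interaction.
model = smf.ols("score ~ C(style) * C(method)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```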

  13. Piping stress analysis with personal computers

    International Nuclear Information System (INIS)

    Revesz, Z.

    1987-01-01

    The growing market for personal computers is providing an increasing number of professionals with unprecedented and surprisingly inexpensive computing capacity which, when used with powerful software, can immensely enhance an engineer's capabilities. This paper focuses on the possibilities that personal computers have opened up in piping stress analysis, on the necessary changes in the software, and on the limitations of using personal computers for engineering design and analysis. Reliability and quality assurance aspects of using personal computers for nuclear applications are also mentioned. The paper concludes with personal views of the author and experiences gained during interactive graphic piping software development for personal computers. (orig./GL)

  14. The Effects of Computer-Assisted Instruction Designed According to 7E Model of Constructivist Learning on Physics Student Teachers' Achievement, Concept Learning, Self-Efficacy Perceptions and Attitudes

    Science.gov (United States)

    Kocakaya, Serhat; Gonen, Selahattin

    2010-01-01

    The purpose of this study was to investigate the effects of a Computer-Assisted Instruction package designed according to the 7E model of constructivist learning (CAI7E), related to the "electrostatic" topic, on physics student teachers' cognitive development, misconceptions, self-efficacy perceptions and attitudes. The study was conducted in 2006-2007…

  15. Batch Computed Tomography Analysis of Projectiles

    Science.gov (United States)

    2016-05-01

    ARL-TR-7681 (May 2016), US Army Research Laboratory: Batch Computed Tomography Analysis of Projectiles, by Michael C Golt, Chris M..., and Matthew S Bratcher, Weapons and Materials Research. ...values to account for projectile variability in the ballistic evaluation of armor. Subject terms: computed tomography, CT, BS41, projectiles

  16. Development of an ICAI System for LISP Education (LISP教育用ICAIシステムの開発)

    OpenAIRE

    伊藤, 寿勝; 島本, 肇; 黒島, 利一; 杉岡, 一郎

    1989-01-01

    Traditional computer-assisted instruction (CAI) has advantages but also some disadvantages that should be addressed. The study of intelligent CAI (ICAI) can be regarded as an attempt to improve CAI by using techniques from knowledge engineering. The system constructed in this research consists of three different modules (a special knowledge module, a student model module and a guide rule module) to solve some of the problems of conventional CAI. For future researchers, the opportunity for engaging th...

  17. Effects of Computer-Assisted Instruction with Conceptual Change Texts on Removing the Misconceptions of Radioactivity

    Directory of Open Access Journals (Sweden)

    Ahmet YUMUŞAK

    2016-12-01

    In training young scientists, enabling conceptual understanding in science education is quite important. Misconceptions are one of the important indications of whether concepts have been understood or not. The most important educational tools for removing misconceptions are conceptual change texts; another important method is computer-assisted instruction. The goal of this study is to research the effects of the use of computer-assisted instruction (CAI), conceptual change texts (CCT), computer-assisted instruction with conceptual change texts (CAI+CCT), and the traditional teaching method (TTM) on removing science teacher candidates' misconceptions on the subject of radioactivity. The research sample consisted of 92 senior students in four different groups at Celal Bayar University, Faculty of Education, Department of Science Education, in the 2011-2012 academic year. A different teaching method was used in each group. Experimental groups were randomly determined: in the first experimental group, computer-assisted instruction was used (23 students); in the second experimental group, conceptual change texts were used (23 students); in the third experimental group, computer-assisted instruction with conceptual change texts was used (23 students); and the fourth group, taught with the traditional method, served as the control group (23 students). A two-tier misconception diagnostic instrument developed by the researcher was used as the data collection tool. A "Nonequivalent Control Groups Experimental Design" was used in this research in order to determine the efficiency of the different teaching methods. The data obtained were analyzed using SPSS 21.0. As a result of the research, it was determined that the methods used in the experimental groups were more successful than the traditional teaching method practiced in the control group in terms of removing misconceptions on

  18. Comparison of computer-assisted instruction (CAI) versus traditional textbook methods for training in abdominal examination (Japanese experience).

    Science.gov (United States)

    Qayumi, A K; Kurihara, Y; Imai, M; Pachev, G; Seo, H; Hoshino, Y; Cheifetz, R; Matsuura, K; Momoi, M; Saleem, M; Lara-Guerra, H; Miki, Y; Kariya, Y

    2004-10-01

    This study aimed to compare the effects of computer-assisted, text-based and computer-and-text learning conditions on the performances of 3 groups of medical students in the pre-clinical years of their programme, taking into account their academic achievement to date. A fourth group of students served as a control (no-study) group. Participants were recruited from the pre-clinical years of the training programmes in 2 medical schools in Japan, Jichi Medical School near Tokyo and Kochi Medical School near Osaka. Participants were randomly assigned to 4 learning conditions and tested before and after the study on their knowledge of and skill in performing an abdominal examination, in a multiple-choice test and an objective structured clinical examination (OSCE), respectively. Information about performance in the programme was collected from school records and students were classified as average, good or excellent. Student and faculty evaluations of their experience in the study were explored by means of a short evaluation survey. Compared to the control group, all 3 study groups exhibited significant gains in performance on knowledge and performance measures. For the knowledge measure, the gains of the computer-assisted and computer-assisted plus text-based learning groups were significantly greater than the gains of the text-based learning group. The performances of the 3 groups did not differ on the OSCE measure. Analyses of gains by performance level revealed that high achieving students' learning was independent of study method. Lower achieving students performed better after using computer-based learning methods. The results suggest that computer-assisted learning methods will be of greater help to students who do not find the traditional methods effective. Explorations of the factors behind this are a matter for future research.

  19. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims at identifying methods of syntax analysis which can be used for computer programming languages, while putting aside the computer devices which influence the choice of the programming language and the methods of analysis and compilation. In the first part, the author proposes attempts at a formalization of Chomsky grammar languages. In the second part, he studies analytical grammars, and then studies a compiler or analytic grammar for the Fortran language

  20. Sequential nearest-neighbor effects on computed 13Cα chemical shifts

    Energy Technology Data Exchange (ETDEWEB)

    Vila, Jorge A. [Cornell University, Baker Laboratory of Chemistry and Chemical Biology (United States); Serrano, Pedro; Wuethrich, Kurt [The Scripps Research Institute, Department of Molecular Biology (United States); Scheraga, Harold A., E-mail: has5@cornell.edu [Cornell University, Baker Laboratory of Chemistry and Chemical Biology (United States)

    2010-09-15

    To evaluate sequential nearest-neighbor effects on quantum-chemical calculations of 13Cα chemical shifts, we selected the structure of the nucleic acid binding (NAB) protein from the SARS coronavirus determined by NMR in solution (PDB id 2K87). NAB is a 116-residue α/β protein, which contains 9 prolines and has 50% of its residues located in loops and turns. Overall, the results presented here show that sizeable nearest-neighbor effects are seen only for residues preceding proline, where Pro introduces an overestimation, on average, of 1.73 ppm in the computed 13Cα chemical shifts. A new ensemble of 20 conformers representing the NMR structure of the NAB, which was calculated with an input containing backbone torsion angle constraints derived from the theoretical 13Cα chemical shifts as supplementary data to the NOE distance constraints, exhibits very similar topology and comparable agreement with the NOE constraints as the published NMR structure. However, the two structures differ in the patterns of differences between observed and computed 13Cα chemical shifts, Δca,i, for the individual residues along the sequence. This indicates that the Δca,i values for the NAB protein are primarily a consequence of the limited sampling by the bundles of 20 conformers used, as in common practice, to represent the two NMR structures, rather than of local flaws in the structures.
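
    A minimal sketch of the residual calculation implied here: Δca,i = observed - computed 13Cα shift, with the average 1.73 ppm Xaa-Pro overestimation subtracted for residues that precede a proline. The sequence and shift values below are invented for illustration:

```python
OVERESTIMATE_BEFORE_PRO = 1.73  # ppm, average value reported in the record

def delta_ca(sequence, observed, computed):
    """Per-residue Delta_ca,i = observed - computed 13Ca shift, after
    correcting the computed shift of residues that precede a proline."""
    deltas = []
    for i, (obs, calc) in enumerate(zip(observed, computed)):
        if i + 1 < len(sequence) and sequence[i + 1] == "P":
            calc -= OVERESTIMATE_BEFORE_PRO  # undo the Xaa-Pro overestimation
        deltas.append(obs - calc)
    return deltas

# Hypothetical 5-residue stretch; residue 2 (A) precedes a proline.
seq = "GAPKL"
obs = [45.1, 52.4, 63.2, 56.3, 55.0]
calc = [45.6, 54.0, 63.0, 56.0, 55.5]
print([round(d, 2) for d in delta_ca(seq, obs, calc)])
```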

  1. Narrative Ethics of Cai Wei (《采薇》)

    Institute of Scientific and Technical Information of China (English)

    王海燕

    2014-01-01

    The controversial question of how to understand Boyi and Shuqi in Lu Xun's novel Cai Wei is not a realistic ethical problem; it should be answered on the basis of the novel's narrative ethics. Narrative ethics refers to the ethical dimension revealed by the arrangement of various formal elements. The novel manifests ethical sympathy for Boyi and Shuqi through its choice of narrative angle and its control of narrative distance. Likewise, through the counterpointing of characters and events in its structure and through ironic expression, the novel conveys an ethical rejection of all sorts of characters, from Emperor Zhouwu to A Jin. Compared with Lu Xun's other novels on the "dynastic change" theme, the complication of the narrative ethics of Cai Wei is not only a projection of reality but also reflects Lu Xun's deepening consciousness of the narrative art of fiction.

  2. Effective Methods for Teaching Information Literacy Skills to Undergraduate Students: A Systematic Review and Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Denise Koufogiannakis

    2006-09-01

    Full Text Available Objective The objective of this systematic review was to assess which library instruction methods are most effective for improving the information skills of students at an introductory, undergraduate level, using cognitive outcomes (measuring changes in knowledge. The study sought to address the following questions: 1 What is the overall state of research on this topic? 2 Which teaching methods are more effective? Methods This project utilised systematic review methodology. Researchers searched fifteen databases and retrieved 4,356 potentially relevant citations. They reviewed the titles and abstracts for relevance, and of those, 257 complete articles were considered in-depth using a predetermined inclusion/exclusion form. There were 122 unique studies that met the inclusion criteria and were subjected to an extensive data extraction and critical appraisal process. Of these studies, 55 met author‐defined quality criteria to provide information on the effectiveness of different teaching methods. From this review there was a final group of 16 studies with sufficient information to enable meta-analyses and calculations of standardized mean differences. Results The overwhelming majority of studies were conducted in the United States (88%. Experimental or quasi-experimental research methods were used in 79 studies (65%. Teaching methods used in the studies varied, with the majority focused on traditional methods of teaching, followed by computer assisted instruction (CAI, and self‐directed independent learning (SDIL. Studies measured outcomes that correlated with Bloom’s lower levels of learning (‘Remember’, ‘Understand’, ‘Apply’. Sixteen studies compared traditional instruction (TI with no instruction, and twelve of those found a positive outcome. Meta-analysis of the data from 4 of these studies agreed with the positive conclusions favouring TI. Fourteen studies compared CAI with traditional instruction (TI, and 9 of these showed

  3. Selected Topics on Advanced Information Systems Engineering: Editorial Introduction to the Issue 5 of CSIMQ

    Directory of Open Access Journals (Sweden)

    Janis Grabis

    2015-12-01

    The 5th issue of the journal on Complex Systems Informatics and Modeling (CSIMQ) presents extended versions of five papers selected from the CAiSE Forum 2015. The forum was part of the 27th edition of the International Conference on Advanced Information Systems Engineering (CAiSE 2015), which took place in June 2015 in Stockholm, Sweden. Information systems engineering draws its foundation from various interrelated disciplines including, e.g., conceptual modeling, database systems, business process management, requirements engineering, human-computer interaction, and enterprise computing, to address various practical challenges in the development and application of information systems. The guiding subjects of CAiSE 2015 were Creativity, Ability, and Integrity. The CAiSE Forum aimed at presenting and discussing new ideas and tools related to information systems engineering.

  4. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    Usage of the newly developed computer code MLCOSP (Multiple Correlation and Spectrum) is described for a hybrid computer installed in JAERI. Functions of the hybrid computer and its terminal devices are utilized ingeniously in the code to reduce the complexity of the data handling which occurs in the analysis of multivariable experimental data and to perform the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed in figures, and hardcopies are taken when necessary; series-messages from the code are shown on the terminal, so man-machine communication is possible; and data can also be entered through a keyboard, so case studies based on the results of the analysis are possible. (auth.)
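
    MLCOSP's core task, multiple-correlation and spectrum analysis of multivariable data, maps onto cross-spectral density and coherence estimation on a modern system. A minimal sketch with simulated two-channel data (the shared 5 Hz component is invented for illustration):

```python
import numpy as np
from scipy import signal

fs = 100.0                      # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)

# Two noisy channels sharing a common 5 Hz component, as in multivariable noise data.
common = np.sin(2 * np.pi * 5 * t)
x = common + rng.normal(0, 1, t.size)
y = 0.8 * common + rng.normal(0, 1, t.size)

f, pxy = signal.csd(x, y, fs=fs, nperseg=1024)        # cross-spectral density
f, cxy = signal.coherence(x, y, fs=fs, nperseg=1024)  # squared coherence

print(f"peak coherence {cxy.max():.2f} at {f[cxy.argmax()]:.1f} Hz")
```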

  5. ASTEC: Controls analysis for personal computers

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  6. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  7. Impact analysis on a massively parallel computer

    International Nuclear Information System (INIS)

    Zacharia, T.; Aramayo, G.A.

    1994-01-01

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  8. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from it are considered, and the computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through analysis of the model, the disease-free and endemic equilibrium points are calculated, and the stability conditions of the equilibria are derived. To illustrate the theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
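
    The paper's analysis centers on the disease-free and endemic equilibria of an epidemic-style ODE model. As an illustration only (not the paper's exact model), a minimal SIS-style sketch with node turnover, integrating the equations and checking the reproduction number:

```python
from scipy.integrate import solve_ivp

# Illustrative SIS-style model with node turnover (assumed, not from the paper):
#   S' = b - beta*S*I + gamma*I - mu*S
#   I' = beta*S*I - gamma*I - mu*I
# b: new computers joining, mu: computers retired, gamma: cure (antivirus) rate.
b, beta, gamma, mu = 1.0, 0.004, 0.05, 0.01

def rhs(t, y):
    S, I = y
    return [b - beta * S * I + gamma * I - mu * S,
            beta * S * I - (gamma + mu) * I]

# Reproduction number at the disease-free equilibrium (S* = b/mu, I* = 0):
R0 = beta * (b / mu) / (gamma + mu)

sol = solve_ivp(rhs, (0, 2000), [b / mu, 1.0])
S_end, I_end = sol.y[:, -1]
print(f"R0 = {R0:.2f}; long-run infected ~ {I_end:.1f} (endemic since R0 > 1)")
```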

  9. Crystal structures of coordination polymers from CaI2 and proline

    Directory of Open Access Journals (Sweden)

    Kevin Lamberts

    2015-06-01

    Completing our reports concerning the reaction products from calcium halides and the amino acid proline, two different solids were found for the reaction of l- and dl-proline with CaI2. The enantiopure amino acid yields the one-dimensional coordination polymer catena-poly[[aqua-μ3-l-proline-tetra-μ2-l-proline-dicalcium] tetraiodide 1.7-hydrate], {[Ca2(C5H9NO2)5(H2O)]I4·1.7H2O}n, (1), with two independent Ca2+ cations in characteristic seven- and eightfold coordination. Five symmetry-independent zwitterionic l-proline molecules bridge the metal sites into a cationic polymer. Racemic proline forms with Ca2+ cations heterochiral chains of the one-dimensional polymer catena-poly[[diaquadi-μ2-dl-proline-calcium] diiodide], {[Ca(C5H9NO2)2(H2O)2]I2}n, (2). The centrosymmetric structure is built from one Ca2+ cation that is bridged to its symmetry equivalents by two zwitterionic proline molecules. In both structures, the iodide ions remain non-coordinating, and hydrogen bonds are formed between these counter-anions, the amino groups, and the coordinating and co-crystallized water molecules. While the overall composition of (1) and (2) is in line with other structures from calcium halides and amino acids, the diversity of the carboxylate coordination geometry is quite surprising.

  10. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  11. The possibility of controlled auto-ignition (CAI) in gasoline engine and gas to liquid (GTL) as a fuel of diesel engine in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, D. [Korea Inst. of Machinery and Materials, Daejou (Korea)

    2005-07-01

    A significant challenge grows from the ever-increasing demands for the optimization of performance, emissions, fuel economy and drivability. The most powerful technologies for improving these factors in the near future are believed to be Controlled Auto-Ignition (CAI) in the gasoline engine and Gas to Liquid (GTL) as a fuel for the diesel engine. In recent years there has been an increasing trend towards more complex valvetrain designs, from traditional camshaft-driven mechanical systems to camless electromagnetic or electrohydraulic solutions. Compared to fixed valve actuation systems, variable valve actuation (VVA) is a powerful means of optimizing the engine cycle. The matching of valve events to engine performance and emission requirements at a given engine or vehicle operating condition can be further optimized for Controlled Auto-Ignition (CAI) in the gasoline engine, which has benefits in NOx emissions, fuel consumption, combustion stability and intake throttle load. In the case of the diesel engine, increasing demands for NOx and soot emission reduction have recently led to the introduction of aftertreatment technologies, but a more fundamental solution is needed for the future, such as a super-clean fuel like Gas to Liquid (GTL), which has benefits in compatibility with diesel fuel, independence from crude oil, and reduction of CO, THC and soot emissions. Korea looks to the future with these kinds of technologies and is exploring their potential for reaching future targets for the internal combustion engine. (orig.)

  12. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but demands the most computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing this margin makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
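
    As an illustration of the sampling-based approach described above, the sketch below propagates input uncertainties through a toy k_eff evaluator in Python/NumPy. The function keff_model and all distribution parameters are invented stand-ins; in a real analysis each sample would drive a full Monte Carlo criticality calculation.

      # Minimal sketch of sampling-based k_eff uncertainty propagation.
      import numpy as np

      rng = np.random.default_rng(42)

      def keff_model(density, enrichment):
          # Hypothetical surrogate for a full Monte Carlo transport run.
          return 0.95 * (density / 10.0) ** 0.3 * (enrichment / 4.0) ** 0.2

      n = 1000
      density = rng.normal(10.0, 0.1, n)       # g/cm^3, assumed uncertainty
      enrichment = rng.normal(4.0, 0.05, n)    # wt% U-235, assumed uncertainty

      keff = keff_model(density, enrichment)
      print(f"k_eff mean = {keff.mean():.5f}, std = {keff.std(ddof=1):.5f}")

    The sample standard deviation of the outputs is the propagated k_eff uncertainty; this is what makes the method indifferent to the size of the perturbation, at the price of many transport runs.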

  13. Systems analysis and the computer

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, A S

    1983-08-01

    The words 'systems analysis' are used in at least two senses. Whilst the general nature of the topic is well understood in the OR (operational research) community, the nature of the term as used by computer scientists is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst the application of techniques developed in the artificial intelligence field has considerable relevance to constructing automata able to adapt to change in the environment, study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.

  14. Turbo Pascal Computer Code for PIXE Analysis

    International Nuclear Information System (INIS)

    Darsono

    2002-01-01

    To make optimal use of the 150 kV ion accelerator facilities and to master the analysis techniques using the ion accelerator, research and development of low-energy PIXE technology has been carried out. R and D on the hardware of the low-energy PIXE installation at P3TM has been under way since the year 2000. To support the R and D of the PIXE accelerator facilities in harmony with the R and D of the PIXE hardware, development of PIXE analysis software is also needed. The development of the database portion of the PIXE analysis software, written as a Turbo Pascal computer code, is reported in this paper. The code computes the ionization cross-section, the fluorescence yield and the stopping power of elements, as well as the attenuation coefficient for X-ray energies. The code is named PIXEDASIS and is part of a larger computer code planned for PIXE analysis that will be constructed in the near future. PIXEDASIS is designed to be communicative with the user: input is taken from the keyboard, and output is shown on the PC monitor and can also be printed. Performance tests show that PIXEDASIS operates well and provides data in agreement with data from other literature. (author)

  15. A computational description of simple mediation analysis

    Directory of Open Access Journals (Sweden)

    Caron, Pier-Olivier

    2018-04-01

    Simple mediation analysis is an increasingly popular statistical analysis in psychology and in other social sciences. However, there are very few detailed accounts of the computations within the model. Articles more often focus on explaining mediation analysis conceptually rather than mathematically. Thus, the purpose of the current paper is to introduce the computational modelling within simple mediation analysis, accompanied by examples in R. Firstly, mediation analysis is described. Then, the method to simulate data in R (with standardized coefficients) is presented. Finally, the bootstrap method, the Sobel test and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), are developed. The R code to implement the computations is offered, as well as a script to carry out a power analysis and a complete example.
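
    The core computation the paper describes can be sketched in a few lines. The snippet below (in Python rather than the R used in the article) simulates standardized data for a simple mediation model X -> M -> Y and bootstraps the indirect effect a*b; the path coefficients are arbitrary illustrative values, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 500
      a, b, cp = 0.5, 0.4, 0.2   # assumed standardized path coefficients

      x = rng.normal(size=n)
      m = a * x + rng.normal(scale=np.sqrt(1 - a**2), size=n)
      y = b * m + cp * x + rng.normal(size=n)

      def paths(x, m, y):
          # a-path: regress M on X; b-path: regress Y on M controlling for X.
          a_hat = np.polyfit(x, m, 1)[0]
          design = np.column_stack([np.ones_like(x), m, x])
          beta = np.linalg.lstsq(design, y, rcond=None)[0]
          return a_hat, beta[1]

      a_hat, b_hat = paths(x, m, y)
      print("indirect effect a*b =", a_hat * b_hat)

      # Percentile bootstrap for the indirect effect.
      boot = [np.prod(paths(x[i], m[i], y[i]))
              for i in (rng.integers(0, n, n) for _ in range(2000))]
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")

    If the bootstrap confidence interval excludes zero, the indirect effect is taken as evidence of mediation.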

  16. Computer-Assisted Linguistic Analysis of the Peshitta

    NARCIS (Netherlands)

    Roorda, D.; Talstra, Eep; Dyk, Janet; van Keulen, Percy; Sikkel, Constantijn; Bosman, H.J.; Jenner, K.D.; Bakker, Dirk; Volkmer, J.A.; Gutman, Ariel; van Peursen, Wido Th.

    2014-01-01

    CALAP (Computer-Assisted Linguistic Analysis of the Peshitta) was a joint research project of the Peshitta Institute Leiden and the Werkgroep Informatica at the Vrije Universiteit Amsterdam (1999-2005). CALAP concerned the computer-assisted analysis of the Peshitta to Kings (Janet Dyk and Percy van Keulen).

  17. Computer aided safety analysis 1989

    International Nuclear Information System (INIS)

    1990-04-01

    The meeting was conducted in a workshop style, to encourage involvement of all participants during the discussions. Forty-five (45) experts from 19 countries, plus 22 experts from the GDR, participated in the meeting. A list of participants can be found at the end of this volume. Forty-two (42) papers were presented and discussed during the meeting. Additionally, an open discussion was held on the possible directions of the IAEA programme on Computer Aided Safety Analysis. A summary of the conclusions of these discussions is presented in the publication. The remainder of this proceedings volume comprises the transcripts of selected technical papers (22) presented at the meeting. It is the intention of the IAEA that the publication of these proceedings will extend the benefits of the discussions held during the meeting to a larger audience throughout the world. The Technical Committee/Workshop on Computer Aided Safety Analysis was organized by the IAEA in cooperation with the National Board for Safety and Radiological Protection (SAAS) of the German Democratic Republic in Berlin. The purpose of the meeting was to provide an opportunity for discussions on experiences in the use of computer codes for safety analysis of nuclear power plants. In particular, it was intended to provide a forum for the exchange of information among experts using computer codes for safety analysis under the Technical Cooperation Programme on Safety of WWER Type Reactors (RER/9/004) and other experts throughout the world. A separate abstract was prepared for each of the 22 selected papers. Refs, figs, tabs and pictures

  18. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, Ispra, on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the RALLY computer code package, and the BOUNDS codes. Two reference study cases were executed with each code. The results obtained from the logic/probabilistic analysis, as well as the computation times, are compared.

  19. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  20. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news.

  1. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  2. Safety analysis of control rod drive computers

    International Nuclear Information System (INIS)

    Ehrenberger, W.; Rauch, G.; Schmeil, U.; Maertz, J.; Mainka, E.U.; Nordland, O.; Gloee, G.

    1985-01-01

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety-relevant tasks. The experience gained with the control rod positioning processor confirms that computers are no less reliable than conventional instrumentation and control systems for comparable tasks. The examination and evaluation of computers for safety-relevant tasks can be done with programme analysis or statistical evaluation of the operating experience. Programme analysis is recommended for seldom-used and well-structured programmes. For programmes with a long cumulated operating time, a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process-controlling computers or microprocessors can be qualified for safety-relevant tasks without undue effort. (orig./HP) [de

  3. Computer aided plant engineering: An analysis and suggestions for computer use

    International Nuclear Information System (INIS)

    Leinemann, K.

    1979-09-01

    To obtain indications of, and boundary conditions for, computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for integration of CAD subsystems in plant engineering should be a central database, which is described by characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer-aided plant engineering are briefly illustrated by two examples. The analysis leads to the conclusion that an interactive graphics system for the manipulation of net-like structured data, usable for various subtasks, should be the basis for computer-aided plant engineering. (orig.) [de

  4. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    Brauer, F.P.; Fager, J.E.

    1976-01-01

    A mini-computer-based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system.

  5. Personal optical disk library (PODL) for knowledge engineering

    Science.gov (United States)

    Wang, Hong; Jia, Huibo; Xu, Duanyi

    2001-02-01

    This paper describes the structure of the Personal Optical Disk Library (PODL), a kind of large-capacity (40 GB) optical storage equipment for personal use. With knowledge engineering technology integrated into the PODL, it can be used for knowledge query, knowledge discovery, Computer-Aided Instruction (CAI) and Online Analytical Processing (OLAP).

  6. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  7. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually, and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software, which provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  8. A companion agent for automated training systems

    NARCIS (Netherlands)

    Buiël, E.F.T.; Lubbers, J.

    2007-01-01

    TNO Defence, Security & Safety has a long history of applied research in the area of automated simulator-based training by means of Computer-Assisted Instruction (CAI). Traditionally, a CAI system does not enable a true dialogue between the learner and the virtual instructor. Most frequently, the

  9. Accident sequence analysis of human-computer interface design

    International Nuclear Information System (INIS)

    Fan, C.-F.; Chen, W.-H.

    2000-01-01

    It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points

  10. Bird community structure in riparian environments in Cai River, Rio Grande do Sul, Brazil

    Directory of Open Access Journals (Sweden)

    Jaqueline Brummelhaus

    2012-06-01

    Urbanization produces changes in riparian environments, causing effects in the structure of bird communities, which present different responses to the impacts. We compare species richness, abundance, and composition of birds in riparian environments with different characteristics in Cai River, Rio Grande do Sul, Brazil. We carried out observations in woodland, grassland, and urban environments between September 2007 and August 2008. We listed 130 bird species, 29 of them unique to the woodland environment, and one endangered species: Triclaria malachitacea. Bird abundance differed from woodland (n = 426 individuals) to urban environments (n = 939 individuals) (F(2,6) = 7.315; P = 0.025). Species composition and feeding guilds differed significantly in the bird community structures among these three riparian environments. In the grassland and urban environments there were more generalist insectivorous species, while in the woodland environments we found more leaf and trunk insectivorous species and frugivorous species, which are sensitive to human impacts. Bird species can be biological quality indicators, and they contribute to ecosystems by performing relevant functions. With knowledge of bird community structure and its needs, it is possible to implement management practices for the restoration of degraded riparian environments.

  11. An approach to quantum-computational hydrologic inverse analysis.

    Science.gov (United States)

    O'Malley, Daniel

    2018-05-02

    Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
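
    A quantum annealer minimizes a quadratic unconstrained binary optimization (QUBO) objective, so an inverse problem must first be cast in that form. The toy below brute-forces a tiny QUBO to show what the annealer is asked to do; the matrix is arbitrary and this is not the authors' actual hydrologic formulation.

      import itertools
      import numpy as np

      # Toy QUBO: minimize x^T Q x over binary vectors x. A quantum annealer
      # searches for the minimizing bit string in hardware; here we simply
      # enumerate all 2^3 candidates. Q is an arbitrary illustrative matrix.
      Q = np.array([[-1.0, 0.5, 0.0],
                    [ 0.5, -1.2, 0.3],
                    [ 0.0, 0.3, -0.8]])

      best = min(itertools.product([0, 1], repeat=3),
                 key=lambda x: np.array(x) @ Q @ np.array(x))
      print("minimizing configuration:", best)

    In a hydrologic setting, each bit might encode a binary permeability indicator for one grid cell, with Q built so that the objective penalizes mismatch against observed hydraulic heads.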

  12. Interface between computational fluid dynamics (CFD) and plant analysis computer codes

    International Nuclear Information System (INIS)

    Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.

    1993-01-01

    Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will directly contribute to modeling and accuracy improvements throughout the plant system and should result in significant reduction of design conservatisms that have been applied to such analyses in the past

  13. Computer-Aided Qualitative Data Analysis with Word

    Directory of Open Access Journals (Sweden)

    Bruno Nideröst

    2002-05-01

    Despite some fragmentary references in the literature about qualitative methods, it is fairly unknown that Word can be successfully used for computer-aided Qualitative Data Analysis (QDA). Based on several Word standard operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to have first experiences with computer-aided analysis before investing time and money in a specialized QDA program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations about computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225

  14. Adaptive Machine Aids to Learning.

    Science.gov (United States)

    Starkweather, John A.

    With emphasis on man-machine relationships and on machine evolution, computer-assisted instruction (CAI) is examined in this paper. The discussion includes the background of machine assistance to learning, the current status of CAI, directions of development, the development of criteria for successful instruction, meeting the needs of users,…

  15. Computational analysis of a multistage axial compressor

    Science.gov (United States)

    Mamidoju, Chaithanya

    Turbomachines are used extensively in the Aerospace, Power Generation, and Oil & Gas industries. Efficiency of these machines is often an important factor and has led to a continuous effort to improve designs to achieve better efficiency. The axial flow compressor is a major component in a gas turbine, with the turbine's overall performance depending strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15-stage axial compressor is analyzed using a 3-D Navier-Stokes CFD solver in a parallel computing environment. A methodology is described for steady-state (frozen rotor-stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, and tip clearance, and numerical issues such as turbulence model choice, advection model choice, and parallel processing performance are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations of the flow features observed in the computational study are given. The total pressure rise versus mass flow rate was computed.

  16. Surface computing and collaborative analysis work

    CERN Document Server

    Brown, Judith; Gossage, Stevenson; Hack, Chris

    2013-01-01

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...

  17. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  18. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG&G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process.

  19. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady-state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in the computed outputs of the codes as a function of known uncertainties in the input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.

  20. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    Newman, M.J.

    1978-08-01

    The development of a general-purpose computer system, THESEUS, is described; its initial use has been magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and the problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  1. Keeping the Focus on Underserved Students, Privilege, and Power: A Reaction to Clements and Sarama

    Science.gov (United States)

    Kitchen, Richard; Berk, Sarabeth

    2017-01-01

    In our response to Clements and Sarama (2017), we address the 5 issues that they identify as criticisms of our Research Commentary (Kitchen & Berk, 2016). As in our original commentary, we highlight concerns we have regarding the delivery of [computer-assisted instruction] CAI programs and potential misuses of CAI, particularly at Title I…

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
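
    The flavor of such a model can be conveyed in a few lines of Python. The sketch below simulates exponentially distributed service requests contending for a fixed pool of servers; all rates and counts are made-up parameters, and the actual model distinguishes multiple request types and richer resource constraints.

      import heapq
      import random

      # Minimal discrete event simulation of service requests hitting a pool
      # of servers -- a toy version of the two-part framework (demand
      # distribution + resource constraints) described above.
      random.seed(0)
      N_SERVERS, ARRIVAL_RATE, SERVICE_RATE, N_REQUESTS = 4, 5.0, 1.0, 10000

      t, waits = 0.0, []
      free_at = [0.0] * N_SERVERS          # time each server next becomes free
      heapq.heapify(free_at)
      for _ in range(N_REQUESTS):
          t += random.expovariate(ARRIVAL_RATE)       # next arrival time
          earliest = heapq.heappop(free_at)           # soonest-free server
          start = max(t, earliest)
          waits.append(start - t)
          heapq.heappush(free_at, start + random.expovariate(SERVICE_RATE))

      print(f"mean wait: {sum(waits) / len(waits):.3f} time units")

    Sweeping N_SERVERS in such a model is the kind of quantitative provisioning analysis the record describes.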

  3. SALP-PC, a computer program for fault tree analysis on personal computers

    International Nuclear Information System (INIS)

    Contini, S.; Poucet, A.

    1987-01-01

    The paper presents the main characteristics of the SALP-PC computer code for fault tree analysis. The program has been developed in Fortran 77 on an Olivetti M24 personal computer (IBM compatible) in order to reach a high degree of portability. It is composed of six processors implementing the different phases of the analysis procedure. This particular structure presents some advantages like, for instance, the restart facility and the possibility to develop an event tree analysis code. The set of allowed logical operators, i.e. AND, OR, NOT, K/N, XOR, INH, together with the possibility to define boundary conditions, make the SALP-PC code a powerful tool for risk assessment. (orig.)
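
    For illustration, the gate set listed above can be evaluated by straightforward probability propagation when basic events are independent. The sketch below is a generic textbook treatment, not the SALP-PC algorithm itself, and the example tree at the end is invented.

      from math import comb, prod

      # Gate-level probability propagation for a fault tree, assuming
      # independent basic events.
      def AND(*p):      return prod(p)
      def OR(*p):       return 1 - prod(1 - q for q in p)
      def NOT(p):       return 1 - p
      def XOR(p, q):    return p * (1 - q) + q * (1 - p)
      def INH(p, cond): return p * cond   # event enabled only under a condition

      def K_of_N(k, n, p):
          # At least k of n identical independent components fail.
          return sum(comb(n, i) * p**i * (1 - p)**(n - i)
                     for i in range(k, n + 1))

      # Invented example: top event occurs if both pumps fail, or if at
      # least 2 of 3 sensors fail while maintenance is deferred (p = 0.5).
      top = OR(AND(0.01, 0.02), INH(K_of_N(2, 3, 0.1), 0.5))
      print(f"top event probability = {top:.5f}")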

  4. Computer aided safety analysis

    International Nuclear Information System (INIS)

    1988-05-01

    The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs

  5. A computer program for activation analysis

    International Nuclear Information System (INIS)

    Rantanen, J.; Rosenberg, R.J.

    1983-01-01

    A computer program for calculating the results of activation analysis is described. The program comprises two gamma spectrum analysis programs, STOAV and SAMPO and one program for calculating elemental concentrations, KVANT. STOAV is based on a simple summation of channels and SAMPO is based on fitting of mathematical functions. The programs are tested by analyzing the IAEA G-1 test spectra. In the determination of peak location SAMPO is somewhat better than STOAV and in the determination of peak area SAMPO is more than twice as accurate as STOAV. On the other hand, SAMPO is three times as expensive as STOAV with the use of a Cyber 170 computer. (author)
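
    The two approaches can be contrasted on a synthetic spectrum. In the sketch below, a peak area is estimated once by summing channels over a window (the STOAV idea) and once by least-squares fitting a Gaussian plus background (the SAMPO idea); all numbers are fabricated for illustration.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(7)
      ch = np.arange(200)
      true_area, centroid, sigma = 5000.0, 100.0, 3.0
      expected = (true_area / (sigma * np.sqrt(2 * np.pi))
                  * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2) + 20.0)
      spectrum = rng.poisson(expected).astype(float)   # counting statistics

      # 1) Summation: add counts in a window, subtract a flat background.
      win = (ch > 90) & (ch < 110)
      area_sum = spectrum[win].sum() - 20.0 * win.sum()

      # 2) Fit: Gaussian peak plus constant background.
      def model(x, area, mu, sig, b):
          return (area / (sig * np.sqrt(2 * np.pi))
                  * np.exp(-0.5 * ((x - mu) / sig) ** 2) + b)

      popt, _ = curve_fit(model, ch, spectrum, p0=[4000.0, 99.0, 2.5, 15.0])
      print(f"summation area = {area_sum:.0f}, "
            f"fitted area = {popt[0]:.0f} (true 5000)")

    The fit handles overlapping peaks and sloping backgrounds far better than summation, which matches the reported accuracy advantage of SAMPO at a higher computational cost.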

  6. Perceptions on hospitality when visiting secluded communities of guaranis, caiçaras e quilombolas in Paraty region

    Directory of Open Access Journals (Sweden)

    Luis Alberto Beares

    2008-10-01

    Tourism in secluded communities puts different cultures in contact with each other and must be handled carefully so as not to cause environmental damage or cultural loss that might jeopardize local development and create hostile relationships. The proposal of in situ tourism, considering local memory and patrimony as a hospitality potential, was observed during technical visits to three communities located in the Paraty region and surroundings: Guarani, Caiçara (fishermen) and Quilombola (descendants of African slaves). Through field work involving visits to the communities and interviews with locals, information regarding cultural differences and the importance of land occupation in the history of each community was assessed. The common link in the history of these peoples is the struggle for the right of land possession. During visits in which people shared their territory, various forms of hospitality were observed in each community, issuing from different cultures and cultural values.

  7. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, the bus reference frame, network fault and contingency calculations, power flow on transmission networks, and generator base power settings.

  8. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  9. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  10. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements in big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and applications to real-life problems, mostly drawn from real situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques, as well as some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the areas of big data analysis and cloud computing.

  11. The computer aided education and training system for accident management

    International Nuclear Information System (INIS)

    Yoneyama, Mitsuru; Masuda, Takahiro; Kubota, Ryuji; Fujiwara, Tadashi; Sakuma, Hitoshi

    2000-01-01

    Under severe accident conditions at a nuclear power plant, plant operators and technical support center (TSC) staff will be under a great amount of stress. Therefore, the individuals responsible for managing the plant should deepen their understanding of accident management and its operations. Moreover, it is also important to train during normal times, so that they can carry out accident management operations effectively in a severe accident. Therefore, an education and training system which works on personal computers was developed by the Japanese BWR group (Tokyo Electric Power Co., Inc., Tohoku Electric Power Co., Inc., Chubu Electric Power Co., Inc., Hokuriku Electric Power Co., Inc., Chugoku Electric Power Co., Inc., Japan Atomic Power Co., Inc.) and Hitachi, Ltd. The education and training system is composed of two systems. One is a computer aided instruction (CAI) education system and the other is an education and training system with a computer simulation. Both systems are designed to execute on the MS-Windows(R) platform of personal computers. These systems provide plant operators and technical support center staff with an effective education and training tool for accident management. TEPCO used the simulation system for an emergency exercise assuming the occurrence of a hypothetical severe accident, and performed an effective exercise in March 2000. (author)

  12. Computer use and carpal tunnel syndrome: A meta-analysis.

    Science.gov (United States)

    Shiri, Rahman; Falah-Hassani, Kobra

    2015-02-15

    Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, CTS was inversely associated with computer/typewriter use (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90), whether compared as ≥1 vs. <1 h/day or ≥4 vs. <4 h/day of use. In a meta-analysis of studies that compared computer workers with each other, CTS was associated with computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and with years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for either type of study. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse usage, might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study, are needed. Copyright © 2014 Elsevier B.V. All rights reserved.
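
    The pooled ORs quoted above come from a random-effects meta-analysis; a generic DerSimonian-Laird computation is sketched below with invented study values, purely to show the mechanics (it does not reproduce the paper's numbers).

      import numpy as np

      # Each row: [OR, CI lower, CI upper] -- made-up illustrative studies.
      or_ci = np.array([[1.5, 0.9, 2.5],
                        [2.1, 1.2, 3.7],
                        [1.2, 0.7, 2.1],
                        [1.8, 1.0, 3.2]])
      y = np.log(or_ci[:, 0])                                   # log ORs
      se = (np.log(or_ci[:, 2]) - np.log(or_ci[:, 1])) / (2 * 1.96)
      w = 1 / se**2                                             # FE weights

      # Between-study variance tau^2 by DerSimonian-Laird.
      q = np.sum(w * (y - np.sum(w * y) / w.sum())**2)
      tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

      w_re = 1 / (se**2 + tau2)                # random-effects weights
      pooled = np.sum(w_re * y) / w_re.sum()
      se_pooled = np.sqrt(1 / w_re.sum())
      print(f"pooled OR = {np.exp(pooled):.2f} "
            f"(95% CI {np.exp(pooled - 1.96 * se_pooled):.2f}-"
            f"{np.exp(pooled + 1.96 * se_pooled):.2f})")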

  13. Recovery From a First-Time Lateral Ankle Sprain and the Predictors of Chronic Ankle Instability: A Prospective Cohort Analysis.

    Science.gov (United States)

    Doherty, Cailbhe; Bleakley, Chris; Hertel, Jay; Caulfield, Brian; Ryan, John; Delahunt, Eamonn

    2016-04-01

    Impairments in motor control may predicate the paradigm of chronic ankle instability (CAI) that can develop in the year after an acute lateral ankle sprain (LAS) injury. No prospective analysis is currently available identifying the mechanisms by which these impairments develop and contribute to long-term outcome after LAS. To identify the motor control deficits predicating CAI outcome after a first-time LAS injury. Cohort study (diagnosis); Level of evidence, 2. Eighty-two individuals were recruited after sustaining a first-time LAS injury. Several biomechanical analyses were performed for these individuals, who completed 5 movement tasks at 3 time points: (1) 2 weeks, (2) 6 months, and (3) 12 months after LAS occurrence. A logistic regression analysis of several "salient" biomechanical parameters identified from the movement tasks, in addition to scores from the Cumberland Ankle Instability Tool and the Foot and Ankle Ability Measure (FAAM) recorded at the 2-week and 6-month time points, were used as predictors of 12-month outcome. At the 2-week time point, an inability to complete 2 of the movement tasks (a single-leg drop landing and a drop vertical jump) was predictive of CAI outcome and correctly classified 67.6% of cases (sensitivity, 83%; specificity, 55%; P = .004). At the 6-month time point, several deficits exhibited by the CAI group during 1 of the movement tasks (reach distances and sagittal plane joint positions at the hip, knee and ankle during the posterior reach directions of the Star Excursion Balance Test) and their scores on the activities of daily living subscale of the FAAM were predictive of outcome and correctly classified 84.8% of cases (sensitivity, 75%; specificity, 91%; P < .001). An inability to complete jumping and landing tasks within 2 weeks of a first-time LAS and poorer dynamic postural control and lower self-reported function 6 months after a first-time LAS were predictive of eventual CAI outcome. © 2016 The Author(s).

  14. Plasma geometric optics analysis and computation

    International Nuclear Information System (INIS)

    Smith, T.M.

    1983-01-01

    Important practical applications in the generation, manipulation, and diagnosis of laboratory thermonuclear plasmas have created a need for elaborate computational capabilities in the study of high frequency wave propagation in plasmas. A reduced description of such waves suitable for digital computation is provided by the theory of plasma geometric optics. The existing theory is beset by a variety of special cases in which the straightforward analytical approach fails, and has been formulated with little attention to problems of numerical implementation of that analysis. The standard field equations are derived for the first time from kinetic theory. A discussion of certain terms previously, and erroneously, omitted from the expansion of the plasma constitutive relation is given. A powerful but little known computational prescription for determining the geometric optics field in the neighborhood of caustic singularities is rigorously developed, and a boundary layer analysis for the asymptotic matching of the plasma geometric optics field across caustic singularities is performed for the first time with considerable generality. A proper treatment of birefringence is detailed, wherein a breakdown of the fundamental perturbation theory is identified and circumvented. A general ray tracing computer code suitable for applications to radiation heating and diagnostic problems is presented and described

  15. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
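
    The displacement-vector idea reduces to a simple update once the vectors are available. In the hedged sketch below, the nominal node coordinates and the displacement field are both made up; in the patented method the vectors come from mean-value coordinate calculations over the region of interest.

      import numpy as np

      # Sketch of perturbing nominal finite element node coordinates along
      # precomputed displacement vectors.
      rng = np.random.default_rng(3)
      nominal = rng.uniform(0.0, 1.0, size=(100, 3))   # nominal node coords
      direction = np.tile([0.0, 0.0, 1.0], (100, 1))   # unit displacement field

      z = rng.normal(0.0, 0.02)                # one sampled random perturbation
      perturbed = nominal + z * direction      # perturbed geometry realization
      print("max nodal shift:", np.abs(perturbed - nominal).max())

    Repeating this for many sampled z values yields the family of perturbed geometries over which the probabilistic finite element analysis is run.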

  16. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    Electromyographic (EMG) is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purpose. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  17. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  18. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies.
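
    GRESS worked by transforming FORTRAN source so that derivative code is generated alongside the original arithmetic. The snippet below shows the same "computer calculus" idea in miniature via forward-mode dual numbers in Python; it is an analogy, not the GRESS mechanism.

      from dataclasses import dataclass
      import math

      @dataclass
      class Dual:
          val: float   # function value
          der: float   # derivative w.r.t. the chosen input

          def __add__(self, o): return Dual(self.val + o.val, self.der + o.der)
          def __mul__(self, o): return Dual(self.val * o.val,
                                            self.der * o.val + self.val * o.der)

      def exp(d: Dual) -> Dual:
          # Chain rule applied alongside the value computation.
          return Dual(math.exp(d.val), math.exp(d.val) * d.der)

      # Sensitivity of f(x) = x * exp(x) at x = 1; seed der = 1 on the input.
      x = Dual(1.0, 1.0)
      f = x * exp(x)
      print(f.val, f.der)   # value e, derivative 2e

    Every arithmetic operation carries its derivative along, so the sensitivity equations are set up mechanically rather than by hand, which is precisely the tedium the record says GRESS removed.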

  19. Computer graphics in reactor safety analysis

    International Nuclear Information System (INIS)

    Fiala, C.; Kulak, R.F.

    1989-01-01

    This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the modelling of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. Graphics output from actual safety analyses is used to illustrate the capabilities of each code. 5 refs., 10 figs.

  20. PIXAN: the Lucas Heights PIXE analysis computer package

    International Nuclear Information System (INIS)

    Clayton, E.

    1986-11-01

    To fully utilise the multielement capability and short measurement time of PIXE it is desirable to have an automated computer evaluation of the measured spectra. Because of the complex nature of PIXE spectra, a critical step in the analysis is the data reduction, in which the areas of characteristic peaks in the spectrum are evaluated. In this package the computer program BATTY is presented for such an analysis. The second step is to determine element concentrations, knowing the characteristic peak areas in the spectrum. This requires a knowledge of the expected X-ray yield for that element in the sample. The computer program THICK provides that information for both thick and thin PIXE samples. Together, these programs form the package PIXAN used at Lucas Heights for PIXE analysis

  1. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Computer-based image analysis methods applied to chest computed tomography (CT) in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, the density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary function, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologists in correlating with pulmonary function and predicting mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
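
    Two of the simpler measures named above, mean lung attenuation and a density-mask percentage, take only a few lines once the lung voxels are segmented. The synthetic HU volume and the -700 HU threshold below are illustrative assumptions only.

      import numpy as np

      rng = np.random.default_rng(5)
      lung_hu = rng.normal(-800, 120, size=(64, 64, 64))  # fake lung voxels, HU

      mean_lung_attenuation = lung_hu.mean()
      pct_dense = 100.0 * (lung_hu > -700).mean()  # denser (fibrosis-leaning) voxels
      print(f"MLA = {mean_lung_attenuation:.1f} HU, "
            f"voxels above -700 HU = {pct_dense:.1f}%")

    In fibrosis, the density histogram shifts toward higher attenuation, so both the mean CT value and the density-mask fraction tend to rise with disease progression.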

  2. [The application of new technologies to solving maths problems for students with learning disabilities: the 'underwater school'].

    Science.gov (United States)

    Miranda-Casas, A; Marco-Taverner, R; Soriano-Ferrer, M; Melià de Alba, A; Simó-Casañ, P

    2008-01-01

    Different procedures have demonstrated efficacy in teaching cognitive and metacognitive strategies for problem solving in mathematics. Some studies have used computer-based problem-solving instructional programs. Aim: to analyze, in students with learning disabilities, the efficacy of cognitive strategies training for problem solving with three instructional delivery formats: a teacher-directed (T-D) program, a computer-assisted instruction (CAI) program, and a combined program (T-D + CAI). Forty-four children with mathematics learning disabilities, between 8 and 10 years old, participated in this study. The children were randomly assigned to one of the three instructional formats or to a control group without cognitive strategies training. In all three instructional conditions, the students learnt linguistic and visual cognitive problem-solving strategies through the self-instructional procedure. Several types of measurements were used for analysing the possible differential efficacy of the three instructional methods: problem-solving tests, marks in mathematics, an internal achievement responsibility scale, and teacher ratings of school behaviours. Our findings show that the T-D training group and the T-D + CAI group improved significantly on math word problem solving and on marks in Maths from pre- to post-testing. In addition, the results indicated that the students of the T-D + CAI group solved more real-life problems and developed more internal attributions compared to both the control and CAI groups. Finally, with regard to school behaviours, improvements in school adjustment and learning problems were observed in the students of the group with a combined instructional format (T-D + CAI).

  4. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  5. Improve Outcomes Study subjects Chemistry Teaching and Learning Strategies through independent study with the help of computer-based media

    Science.gov (United States)

    Sugiharti, Gulmah

    2018-03-01

    This study aims to examine the improvement of student learning outcomes through independent learning using computer-based learning media in the STBM (Teaching and Learning Strategy) Chemistry course. The population in this research comprised all students of the class of 2014 taking the STBM Chemistry course, in four classes. The sample was taken purposively: two classes of 32 students each, serving as the control class and the experiment class. The instrument used was a learning outcomes test in the form of 20 multiple-choice questions that had been established as valid and reliable. Data were analysed with a one-sided t-test, and improvement in learning outcomes was measured with a normalized gain test. Based on the learning outcome data, the average normalized gain was 0.530 for the experimental class and 0.224 for the control class, i.e., a 53% improvement for the experimental class versus 22.4% for the control class. Hypothesis testing gave t_count > t_table (9.02 > 1.6723) at significance level α = 0.05 with db = 58. This means Ha is accepted: the use of computer-based learning media (CAI) can improve student learning outcomes in the Teaching and Learning Strategy (STBM) Chemistry course in the 2017/2018 academic year.
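
    For reference, the normalized gain g = (post - pre) / (max - pre) and a one-sided t-test can be computed as below; the scores are simulated stand-ins, not the study's data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)
      pre_e, post_e = rng.uniform(30, 60, 32), rng.uniform(60, 90, 32)  # experiment
      pre_c, post_c = rng.uniform(30, 60, 32), rng.uniform(40, 70, 32)  # control

      def gain(pre, post, max_score=100.0):
          # Hake's normalized gain for each student.
          return (post - pre) / (max_score - pre)

      g_exp, g_ctl = gain(pre_e, post_e), gain(pre_c, post_c)

      t, p_two = stats.ttest_ind(g_exp, g_ctl)
      # One-sided p is half the two-sided p when t lies in the expected direction.
      print(f"mean gain: exp {g_exp.mean():.3f}, ctl {g_ctl.mean():.3f}")
      print(f"t = {t:.2f}, one-sided p = {p_two / 2:.4f}")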

  6. Computer analysis and comparison of chess players' game-playing styles

    OpenAIRE

    Krevs, Urša

    2015-01-01

    Today's computer chess programs are very good at evaluating chess positions. Research has shown that we can rank chess players by the quality of their game play, using a computer chess program. In the master's thesis Computer analysis and comparison of chess players' game-playing styles, we focus on the content analysis of chess games using a computer chess program's evaluation and attributes we determined for each individual position. We defined meaningful attributes that can be used for com...

  7. Computer assisted functional analysis. Computer gestuetzte funktionelle Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, H A.E.; Roesler, H

    1982-01-01

    The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.

  8. The First Expert CAI System

    Science.gov (United States)

    Feurzeig, Wallace

    1984-01-01

    The first expert instructional system, the Socratic System, was developed in 1964. One of the earliest applications of this system was in the area of differential diagnosis in clinical medicine. The power of the underlying instructional paradigm was demonstrated and the potential of the approach for valuably supplementing medical instruction was recognized. Twenty years later, despite further educationally significant advances in expert systems technology and enormous reductions in the cost of computers, expert instructional methods have found very little application in medical schools.

  9. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two personal computers were successfully networked together to form a small-scale cluster. Each processor involved is a multicore processor with four cores, giving the cluster eight processing cores in total. The cluster runs an Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test verified that the computers could pass the required information without any problem and was done using a simple MPI Hello program written in the C language. Additionally, a performance test was done to show that the cluster's computational performance is much better than that of a single-CPU computer. In this performance test, four tests were done by running the same code using 1, 2, 4, and 8 processors. The results show that with additional processors, the time required to solve the problem decreases; the calculation time is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small-scale cluster computer using common hardware that is capable of higher computing power compared to a single-CPU processor, which can benefit research requiring high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics analysis.
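
    The communication test in this record used a C MPI Hello program under MPICH2. Purely as an illustration, a minimal Python analogue using mpi4py (an assumption; the study itself did not use Python) could look like the sketch below, launched with e.g. mpiexec -n 8 python hello_mpi.py.

      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()               # this process's id
      size = comm.Get_size()               # total number of MPI processes
      node = MPI.Get_processor_name()

      print(f"Hello from rank {rank} of {size} on node {node}")

      # A tiny collective operation doubles as a communication check:
      # every rank contributes its id and rank 0 receives the sum.
      total = comm.reduce(rank, op=MPI.SUM, root=0)
      if rank == 0:
          print(f"sum of ranks = {total} (expected {size * (size - 1) // 2})")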

  10. Just enough, but not too much interactivity leads to better clinical skills performance after a computer assisted learning module.

    Science.gov (United States)

    Kalet, A L; Song, H S; Sarpel, U; Schwartz, R; Brenner, J; Ark, T K; Plass, J

    2012-01-01

    Well-designed computer-assisted instruction (CAI) can potentially transform medical education. Yet little is known about whether specific design features such as direct manipulation of the content yield meaningful gains in clinical learning. We designed three versions of a multimedia module on the abdominal exam incorporating different types of interactivity. As part of their physical diagnosis course, 162 second-year medical students were randomly assigned (1:1:1) to Watch, Click or Drag versions of the abdominal exam module. First, students' prior knowledge, spatial ability, and prior experience with abdominal exams were assessed. After using the module, students took a posttest; demonstrated the abdominal exam on a standardized patient; and wrote structured notes of their findings. Data from 143 students were analyzed. Baseline measures showed no differences among groups regarding prior knowledge, experience, or spatial ability. Overall there was no difference in knowledge across groups. However, physical exam scores were significantly higher for students in the Click group. A mid-range level of behavioral interactivity was associated with small to moderate improvements in performance of clinical skills. These improvements were likely mediated by enhanced engagement with the material, within the bounds of learners' cognitive capacity. These findings have implications for the design of CAI materials to teach procedural skills.

  11. Run 2 analysis computing for CDF and D0

    International Nuclear Information System (INIS)

    Fuess, S.

    1995-11-01

    Two large experiments at the Fermilab Tevatron collider will use upgraded detectors for the coming period of running. The associated analysis software is also expected to change, both to account for higher data rates and to embrace new computing paradigms. A discussion is given of the problems facing current and future High Energy Physics (HEP) analysis computing, and several issues are explored in detail

  12. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  13. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  14. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure to numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
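
    GRESS works by instrumenting FORTRAN source so that derivatives are propagated alongside values. As an analogy only (not the GRESS algorithm itself), the sketch below shows the same "computer calculus" idea via forward-mode automatic differentiation with dual numbers in Python; the function f and input k are hypothetical.

      import math

      class Dual:
          # Value plus derivative, propagated through arithmetic.
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.der + o.der)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
          __rmul__ = __mul__

      def exp(x):
          return Dual(math.exp(x.val), math.exp(x.val) * x.der)

      # Sensitivity of f(k) = k*exp(k) + 3k to its input parameter k:
      k = Dual(2.0, 1.0)            # seed derivative dk/dk = 1
      f = k * exp(k) + 3 * k
      print(f.val, f.der)           # f(2) and df/dk at k = 2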

  15. Temporal fringe pattern analysis with parallel computing

    International Nuclear Information System (INIS)

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-01-01

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution time was reduced by a factor of 1.6 when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis

  16. Genome-Wide Analysis of the Synonymous Codon Usage Patterns in Riemerella anatipestifer

    Directory of Open Access Journals (Sweden)

    Jibin Liu

    2016-08-01

    Full Text Available Riemerella anatipestifer (RA) belongs to the Flavobacteriaceae family and can cause a septicemic disease in poultry. The synonymous codon usage patterns of bacteria reflect a series of evolutionary changes that enable bacteria to improve their tolerance of various environments. We detailed the codon usage patterns of RA isolates from the 12 available sequenced genomes by multiple codon and statistical analyses. Nucleotide compositions and relative synonymous codon usage (RSCU) analysis revealed that A- or U-ending codons are predominant in RA. Neutrality analysis found no significant correlation between GC12 and GC3 (p > 0.05). Correspondence analysis and ENc-plot results showed that natural selection dominated over mutation in the codon usage bias. The tree of cluster analysis based on RSCU was concordant with the dendrogram based on genomic BLAST obtained by the neighbor-joining method. By comparative analysis, about 50 highly expressed genes that were orthologs across all 12 strains were found in the top 5% of high CAI values. Based on these CAI values, we infer that RA contains a number of predicted highly expressed coding sequences, involved in transcriptional regulation and metabolism, reflecting their requirement for dealing with diverse environmental conditions. These results provide some useful information on the mechanisms that contribute to codon usage bias and the evolution of RA.
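
    For readers unfamiliar with the RSCU statistic used in this record, a minimal sketch follows: RSCU = (observed codon count) x (synonymous family size) / (total counts in the family), so 1.0 means uniform usage. Only two amino-acid families are shown and the test sequence is invented; a real analysis would use the complete codon table and genome sequences.

      from collections import Counter

      SYNONYMOUS = {
          "Lys": ["AAA", "AAG"],
          "Gly": ["GGU", "GGC", "GGA", "GGG"],
      }

      def rscu(sequence):
          codons = [sequence[i:i + 3] for i in range(0, len(sequence) - 2, 3)]
          counts = Counter(codons)
          out = {}
          for family in SYNONYMOUS.values():
              total = sum(counts[c] for c in family)
              for c in family:
                  # 1.0 = codon used exactly as expected under uniform
                  # usage within its synonymous family.
                  out[c] = len(family) * counts[c] / total if total else 0.0
          return out

      print(rscu("AAAAAGAAAGGUGGUGGC"))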

  17. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  18. Can cloud computing benefit health services? - a SWOT analysis.

    Science.gov (United States)

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare.

  19. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  20. An Accurate liver segmentation method using parallel computing algorithm

    International Nuclear Information System (INIS)

    Elbasher, Eiman Mohammed Khalied

    2014-12-01

    Computed tomography (CT or CAT scan) is a noninvasive diagnostic imaging procedure that uses a combination of X-rays and computer technology to produce horizontal, or axial, images (often called slices) of the body. A CT scan shows detailed images of any part of the body, including the bones, muscles, fat and organs, and is more detailed than standard X-rays. CT scans may be done with or without contrast; contrast refers to a substance taken by mouth and/or injected into an intravenous (IV) line that causes the particular organ or tissue under study to be seen more clearly. CT scans of the liver and biliary tract are used in the diagnosis of many diseases of the abdominal structures, particularly when another type of examination, such as X-rays, physical examination, or ultrasound, is not conclusive. Unfortunately, the presence of noise and artifacts in the edges and fine details of CT images limits the contrast resolution and makes the diagnostic procedure more difficult. This experimental study was conducted at the College of Medical Radiological Science, Sudan University of Science and Technology, and Fidel Specialist Hospital, with a sample of 50 patients. The main objective of this research was to study an accurate liver segmentation method using a parallel computing algorithm, and to segment the liver and adjacent organs using image processing techniques. The main segmentation technique used in this study was the watershed transform. The scope of image processing and analysis applied to medical applications is to improve the quality of the acquired image and extract quantitative information from medical image data in an efficient and accurate way. The results of this technique agreed with the results of Jarritt et al. (2010), Kratchwil et al. (2010), Jover et al. (2011), Yomamoto et al. (1996), Cai et al. (1999), and Saudha and Jayashree (2010), who used different segmentation filtering methods for enhancing computed tomography images.
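
    To make the watershed step concrete, here is a minimal marker-based watershed sketch with scikit-image on a synthetic binary image (two overlapping disks standing in for organ regions). It is an illustration of the transform only, not the authors' parallel pipeline or their CT data.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed

      # Synthetic "image": two overlapping disks.
      yy, xx = np.mgrid[0:80, 0:80]
      binary = (((xx - 28) ** 2 + (yy - 40) ** 2) < 20 ** 2) | \
               (((xx - 52) ** 2 + (yy - 40) ** 2) < 20 ** 2)

      # Markers come from the peaks of the distance transform.
      distance = ndi.distance_transform_edt(binary)
      coords = peak_local_max(distance, labels=binary.astype(int), min_distance=10)
      mask = np.zeros(distance.shape, dtype=bool)
      mask[tuple(coords.T)] = True
      markers, _ = ndi.label(mask)

      # Flood the negated distance map from the markers.
      labels = watershed(-distance, markers, mask=binary)
      print("regions found:", labels.max())   # expect 2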

  1. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample application given, followed by a discussion of the present status and future development plans

  2. Computer code for qualitative analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Yule, H.P.

    1979-01-01

    Computer code QLN1 provides complete analysis of gamma-ray spectra observed with Ge(Li) detectors and is used at both the National Bureau of Standards and the Environmental Protection Agency. It locates peaks, resolves multiplets, identifies component radioisotopes, and computes quantitative results. The qualitative-analysis (or component identification) algorithms feature thorough, self-correcting steps which provide accurate isotope identification in spite of errors in peak centroids, energy calibration, and other typical problems. The qualitative-analysis algorithm is described in this paper
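
    As a rough illustration of the peak-location step such a code performs, the sketch below finds photopeaks in a synthetic Ge(Li)-like spectrum (two Gaussian peaks on a decaying background) with scipy. QLN1's actual algorithms are not reproduced here; the spectrum and thresholds are invented.

      import numpy as np
      from scipy.signal import find_peaks

      channels = np.arange(2048)
      spectrum = (
          2000 * np.exp(-channels / 600)                        # background
          + 900 * np.exp(-0.5 * ((channels - 661) / 3.0) ** 2)  # photopeak 1
          + 400 * np.exp(-0.5 * ((channels - 1173) / 3.5) ** 2) # photopeak 2
      )
      spectrum = np.random.default_rng(1).poisson(spectrum)     # counting noise

      # Prominence separates true photopeaks from statistical fluctuations.
      peaks, props = find_peaks(spectrum, prominence=200)
      print("peak channels:", peaks)   # expect values near 661 and 1173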

  3. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I. Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II. Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read

  4. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.

  5. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the period 2004-2013. We ranked authors in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
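
    A minimal co-authorship network sketch with networkx follows: papers are author lists, edges link co-authors with accumulated weights, and degree centrality gives a simple activity measure. The paper list is hypothetical, not the Scopus data used in the study.

      from itertools import combinations
      import networkx as nx

      papers = [
          ["Ahn", "Jung", "Kim"],
          ["Ahn", "Lee"],
          ["Kim", "Lee", "Park"],
      ]

      G = nx.Graph()
      for authors in papers:
          for a, b in combinations(authors, 2):
              # Accumulate a weight so repeated collaborations count more.
              w = G.get_edge_data(a, b, {"weight": 0})["weight"]
              G.add_edge(a, b, weight=w + 1)

      centrality = nx.degree_centrality(G)
      for author, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
          print(f"{author}: {c:.2f}")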

  6. A computational methodology for formulating gasoline surrogate fuels with accurate physical and chemical kinetic properties

    KAUST Repository

    Ahmed, Ahfaz

    2015-03-01

    Gasoline is the most widely used fuel for light-duty automobile transportation, but its molecular complexity makes its fundamental combustion properties intractable to study experimentally and computationally. Therefore, surrogate fuels with a simpler molecular composition that represent real fuel behavior in one or more aspects are needed to enable repeatable experimental and computational combustion investigations. This study presents a novel computational methodology for formulating surrogates for FACE (fuels for advanced combustion engines) gasolines A and C by combining regression modeling with physical and chemical kinetics simulations. The computational methodology integrates simulation tools executed across different software platforms. Initially, the palette of surrogate species and carbon types for the target fuels was determined from a detailed hydrocarbon analysis (DHA). A regression algorithm implemented in MATLAB was linked to REFPROP for simulation of distillation curves and calculation of physical properties of surrogate compositions. The MATLAB code generates a surrogate composition at each iteration, which is then used to automatically generate CHEMKIN input files that are submitted to homogeneous batch reactor simulations for prediction of research octane number (RON). The regression algorithm determines the optimal surrogate composition to match the fuel properties of FACE A and C gasoline, specifically hydrogen/carbon (H/C) ratio, density, distillation characteristics, carbon types, and RON. The optimal surrogate fuel compositions obtained using the present computational approach were compared to the real fuel properties, as well as to surrogate compositions available in the literature. Experiments were conducted within a Cooperative Fuels Research (CFR) engine operating under controlled autoignition (CAI) mode to compare the formulated surrogates against the real fuels. Carbon monoxide measurements indicated that the proposed surrogates
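
    The property-matching idea in this record can be sketched as a small constrained optimization: choose surrogate mole fractions to minimize a weighted error against target fuel properties, with fractions summing to one. The component property values, targets, and linear blending rule below are hypothetical placeholders, not FACE gasoline data or the authors' MATLAB/REFPROP/CHEMKIN workflow.

      import numpy as np
      from scipy.optimize import minimize

      # Columns: H/C ratio, density (kg/m^3), RON -- per pure component.
      components = np.array([
          [2.25, 660.0, 25.0],   # an n-alkane-like component
          [2.13, 690.0, 100.0],  # an iso-alkane-like component
          [1.00, 870.0, 110.0],  # an aromatic-like component
      ])
      target = np.array([1.95, 720.0, 84.0])
      weights = np.array([1.0, 0.01, 0.1])   # balance differing magnitudes

      def objective(x):
          blended = x @ components           # linear mixing assumption
          return np.sum(weights * (blended - target) ** 2)

      x0 = np.full(3, 1 / 3)
      res = minimize(
          objective, x0, method="SLSQP",
          bounds=[(0.0, 1.0)] * 3,
          constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}],
      )
      print("fractions:", np.round(res.x, 3), "properties:", res.x @ components)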

  7. Natural gas diffusion model and diffusion computation in well Cai25 Bashan Group oil and gas reservoir

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Natural gas diffusion through the cap rock occurs mainly by dissolving in water, so its concentration can be replaced by its solubility, which varies with temperature, pressure and salinity in the strata. Under certain geological conditions the maximal solubility is definite, so the diffusion computation can be handled approximately by the steady-state equation. Furthermore, on the basis of the restoration of the paleo-burial history, the diffusion is calculated with a dynamic method, and the result is very close to the real diffusion value over the geological history.
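
    A minimal steady-state estimate in the spirit of this record follows: Fick's first law across a cap rock of constant thickness, J = D (C_base - C_top) / L. All numerical values are hypothetical placeholders, not data from well Cai25.

      SECONDS_PER_MA = 3.15e13      # seconds in one million years

      D = 5.0e-10        # effective diffusion coefficient, m^2/s
      c_base = 2.5       # gas solubility at the base of the cap rock, kg/m^3
      c_top = 0.1        # concentration at the top of the cap rock, kg/m^3
      L = 300.0          # cap rock thickness, m

      flux = D * (c_base - c_top) / L           # kg per m^2 per second
      loss_per_ma = flux * SECONDS_PER_MA       # kg per m^2 per million years
      print(f"diffusive flux = {flux:.3e} kg/m^2/s "
            f"({loss_per_ma:.2f} kg/m^2 per million years)")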

  8. From computer-assisted intervention research to clinical impact: The need for a holistic approach.

    Science.gov (United States)

    Ourselin, Sébastien; Emberton, Mark; Vercauteren, Tom

    2016-10-01

    The early days of the field of medical image computing (MIC) and computer-assisted intervention (CAI), when publishing a strong self-contained methodological algorithm was enough to produce impact, are over. As a community, we now have substantial responsibility to translate our scientific progress into improved patient care. In the field of computer-assisted interventions, the emphasis is also shifting from the mere use of well-known established imaging modalities and position trackers to the design and combination of innovative sensing, elaborate computational models and fine-grained clinical workflow analysis to create devices with unprecedented capabilities. The barriers to translating such devices into the complex and understandably heavily regulated surgical and interventional environment can seem daunting. Whether we leave the translation task mostly to our industrial partners or welcome, as researchers, an important share of it is up to us. We argue that embracing the complexity of surgical and interventional sciences is mandatory to the evolution of the field. Being able to do so requires large-scale infrastructure and a critical mass of expertise that very few research centres have. In this paper, we emphasise the need for a holistic approach to computer-assisted interventions where clinical, scientific, engineering and regulatory expertise are combined as a means of moving towards clinical impact. To ensure that the breadth of infrastructure and expertise required for translational computer-assisted intervention research does not lead to a situation where the field advances only thanks to a handful of exceptionally large research centres, we also advocate that solutions need to be designed to lower the barriers to entry. Inspired by fields such as particle physics and astronomy, we claim that centralised very large innovation centres with state of the art technology and health technology assessment capabilities backed by core support staff and open

  9. A new approach of the Star Excursion Balance Test to assess dynamic postural control in people complaining from chronic ankle instability.

    Science.gov (United States)

    Pionnier, Raphaël; Découfour, Nicolas; Barbier, Franck; Popineau, Christophe; Simoneau-Buessinger, Emilie

    2016-03-01

    The purpose of this study was to quantitatively and qualitatively assess dynamic balance with accuracy in individuals with chronic ankle instability (CAI). To this aim, a motion capture system was used while participants performed the Star Excursion Balance Test (SEBT). Reached distances for the 8 points of the star were automatically computed, thereby excluding any dependence on the experimenter. In addition, new relevant variables were also computed, such as the absolute time needed to reach each distance, lower-limb ranges of motion during unipodal stance, and the absolute error of pointing. The velocity of the center of pressure and the range of variation of ground reaction forces were also assessed during the unipodal phase of the SEBT using force plates. The CAI group exhibited smaller reached distances and greater absolute error of pointing than the control group (p<0.05). Moreover, the ranges of motion of the lower-limb joints, the velocity of the center of pressure and the range of variation of the ground reaction forces were all significantly smaller in the CAI group (p<0.05). These reduced quantitative and qualitative performances highlighted a lower dynamic postural control. The limited body movements and accelerations during the unipodal stance in the CAI group could indicate a protective strategy. The present findings could help clinicians to better understand the motor strategies used by CAI patients during dynamic balance and may guide the rehabilitation process. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Computational Chemical Synthesis Analysis and Pathway Design

    Directory of Open Access Journals (Sweden)

    Fan Feng

    2018-06-01

    Full Text Available With the idea of retrosynthetic analysis, which was raised in the 1960s, chemical synthesis analysis and pathway design have been transformed from a complex problem into a regular process of structural simplification. This review aims to summarize the developments of computer-assisted synthetic analysis and design in recent years, and how machine-learning algorithms have contributed to them. The LHASA system started the pioneering work of designing semi-empirical reaction modes in computers, with the rule-based and network-searching work that followed not only expanding the databases, but also building new approaches to indicating reaction rules. Programs like ARChem Route Designer replaced hand-coded reaction modes with automatically extracted rules, and programs like Chematica changed traditional design into network searching. Afterward, with the help of machine learning, two-step models that combine reaction rules and statistical methods became the mainstream. Recently, fully data-driven learning methods using deep neural networks, which do not even require any prior knowledge, were applied to this field. Up to now, however, these methods still cannot replace experienced human organic chemists due to their relatively low accuracies. Future new algorithms, with the aid of powerful computational hardware, will make this topic promising and with good prospects.

  11. Identification of causes of human errors in support of the development of intelligent computer-assisted instruction systems for plant operator training

    International Nuclear Information System (INIS)

    Furuhama, Yutaka; Furuta, Kazuo; Kondo, Shunsuke

    1995-01-01

    This paper proposes a methodology to identify causes of human error in the operation of plant systems, in support of the development of CAI systems for operator training. The target task of this methodology is goal-driven and knowledge-based planning behaviour, the cognitive process of which is assumed to be modeled as means-end analysis. The methodology uses four criteria to classify errors in an operation into eight groups, and then asks the trainee several questions to prune the possible causes. To confirm the usefulness of this methodology, a prototype CAI system was developed for the operation of filling the primary coolant system of a liquid-metal-cooled fast reactor with sodium. The experimental results indicate that the system is capable of identifying the causes of a trainee's errors, and consequently of characterizing his or her deficiencies. As a result of this study, several issues are identified for future research

  12. Summary of research in applied mathematics, numerical analysis, and computer sciences

    Science.gov (United States)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  13. A single-chip computer analysis system for liquid fluorescence

    International Nuclear Information System (INIS)

    Zhang Yongming; Wu Ruisheng; Li Bin

    1998-01-01

    The single-chip computer analysis system for liquid fluorescence is an intelligent analytical instrument, based on the principle that liquids containing hydrocarbons give out several characteristic fluorescences when irradiated by strong light. Besides a single-chip computer, the system makes use of the keyboard and the calculation and printing functions of a CASIO printing calculator. It combines optics, mechanics and electronics into one unit, and is small, light and practical, so it can be used for surface water sample analysis in oil fields and impurity analysis of other materials

  14. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab

  15. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic and creep analysis of nuclear reactor components. In the elastic-plastic regime, which is principally concerned with time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. With respect to problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that works with the mechanical model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes that often occur during the operation of structural components at increasingly high temperatures over long periods of time. Special considerations are crucial if the analysis is to be extended to the large-strain regime, where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  16. New Petrology, Mineral Chemistry and Stable MG Isotope Compositions of an Allende CAI: EK-459-7-2

    Science.gov (United States)

    Jeffcoat, C. R.; Kerekgyarto, A. G.; Lapen, T. J.; Righter, M.; Simon, J. I.; Ross, D. K.

    2016-01-01

    Calcium-aluminum-rich inclusions (CAIs) are the key to understanding physical and chemical conditions in the nascent solar nebula. These inclusions have the oldest radiometric ages of solar system materials and are composed of phases that are predicted to condense early from a gas of solar composition. Thus, their chemistry and textures record conditions and processes in the earliest stages of development of the solar nebula. Type B inclusions are typically larger and more coarse grained than other types with substantial evidence that many of them were at least partially molten. Type B inclusions are further subdivided into Type B1 (possess thick melilite mantle) and Type B2 (lack melilite mantle). Despite being extensively studied, the origin of the melilite mantles of Type B1 inclusions remains uncertain. We present petrologic and chemical data for a Type B inclusion, EK-459-7-2, that bears features found in both Type B1 and B2 inclusions and likely represents an intermediate between the two types. Detailed studies of more of these intermediate objects may help to constrain models for Type B1 rim formation.

  17. The role of the computer in automated spectral analysis

    International Nuclear Information System (INIS)

    Rasmussen, S.E.

    This report describes how a computer can be an extremely valuable tool for the routine analysis of spectra, which is a time-consuming process. A number of general-purpose algorithms that are available for the various phases of the analysis can be implemented, if these algorithms are designed to cope with all the variations that may occur. Since this is basically impossible, one must find a compromise between obscure errors and program complexity. This is usually possible with human interaction at critical points. In spectral analysis this is possible if the user scans the data on an interactive graphics terminal, makes the necessary changes and then returns control to the computer for completion of the analysis

  18. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to bring convenience to their own lives, but at the same time many network information problems demand attention. This paper analyzes the information security of computer networks based on "big data" and puts forward some solutions.

  19. Self-regulation by industry of food marketing is having little impact during children's preferred television.

    Science.gov (United States)

    Potvin Kent, Monique; Dubois, Lise; Wanless, Alissa

    2011-10-01

    To examine the efficacy of self-regulation of food marketing to children by comparing, during children's preferred viewing on television, the differences in food/beverage marketing between two groups of corporations: 17 corporations participating in the Canadian Children's Food and Beverage Advertising Initiative (CAI) and 35 corporations not participating (non-CAI) in this initiative. The food/beverage marketing activities of CAI and non-CAI corporations during 99.5 hours of children's preferred viewing on television were compared. First, the preferred television viewing of 272 children aged 10-12 years from Ontario and Quebec who completed TV viewing journals for a seven-day period was determined. A total of 32 television stations were simultaneously recorded, and a content analysis of children's preferred viewing was conducted and included coding all food/beverage promotions and their nutritional content. Each food/beverage promotion was classified by corporation type (i.e., CAI or non-CAI). The CAI was responsible for significantly more food/beverage promotions, and used media characters and repetition more frequently in their food/beverage promotions than the non-CAI group. Nutritionally, the CAI food/beverage promotions were higher in fats, sugar, sodium and energy per 100 grams. A significantly greater proportion of the CAI food/beverage promotions were considered 'less healthy' compared to the non-CAI promotions. With the exception of the four corporations that did not market to children at all, the commitments that have been made in the CAI are not having a significant impact on the food and beverage marketing environment on television which is viewed by 10-12-year-olds.

  20. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, the development of radiation transport modeling codes, and the building of accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling the more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous-energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  1. Practical computer analysis of switch mode power supplies

    CERN Document Server

    Bennett, Johnny C

    2006-01-01

    When designing switch-mode power supplies (SMPSs), engineers need much more than simple "recipes" for analysis. Such plug-and-go instructions are not at all helpful for simulating larger and more complex circuits and systems. Offering more than merely a "cookbook," Practical Computer Analysis of Switch Mode Power Supplies provides a thorough understanding of the essential requirements for analyzing SMPS performance characteristics. It demonstrates the power of the circuit averaging technique when used with powerful computer circuit simulation programs. The book begins with SMPS fundamentals and the basics of circuit averaging models, reviewing most basic topologies and explaining all of their various modes of operation and control. The author then discusses the general analysis requirements of power supplies and how to develop the general types of SMPS models, demonstrating the use of SPICE for analysis. He examines the basic first-order analyses generally associated with SMPS performance along with more pra...

  2. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, averages, cross- and directional semivariograms, simple kriging estimates, and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process, avoiding the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, functionality rarely available in similar programs. Given its characteristics of quick prototyping and simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
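
    The core calculation behind such a program, the empirical omnidirectional semivariogram gamma(h) = (1 / 2|N(h)|) * sum over pairs at lag h of (z_i - z_j)^2, can be sketched in a few lines of Python. The sample coordinates and values below are synthetic, not the soil carbon/nitrogen data cited in the record.

      import numpy as np

      rng = np.random.default_rng(42)
      coords = rng.uniform(0, 100, size=(200, 2))   # sample locations (m)
      values = np.sin(coords[:, 0] / 15) + 0.2 * rng.standard_normal(200)

      # Pairwise distances and squared value differences.
      d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
      sq = (values[:, None] - values[None, :]) ** 2
      iu = np.triu_indices(len(values), k=1)        # count each pair once

      lags = np.arange(0, 60, 10)
      for lo, hi in zip(lags[:-1], lags[1:]):
          sel = (d[iu] >= lo) & (d[iu] < hi)
          gamma = sq[iu][sel].mean() / 2.0
          print(f"lag {lo:2d}-{hi:2d} m: gamma = {gamma:.3f} (n={sel.sum()})")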

  3. Conceptual design of pipe whip restraints using interactive computer analysis

    International Nuclear Information System (INIS)

    Rigamonti, G.; Dainora, J.

    1975-01-01

    Protection against pipe break effects necessitates a complex interaction between failure mode analysis, piping layout, and structural design. Many iterations are required to finalize structural designs and equipment arrangements. The magnitude of the pipe break loads transmitted by the pipe whip restraints to structural embedments precludes the application of conservative design margins. A simplified analytical formulation of the nonlinear dynamic problems associated with pipe whip has been developed and applied using interactive computer analysis techniques. In the dynamic analysis, the restraint and the associated portion of the piping system are modeled using the finite element lumped mass approach to properly reflect the dynamic characteristics of the piping/restraint system. The analysis is performed as a series of piecewise linear increments. Each of these linear increments is terminated by either the formation of plastic conditions or the closing/opening of gaps. The stiffness matrix is then modified to reflect the changed stiffness characteristics of the system, and the analysis is restarted using the previous boundary conditions. The formation of yield hinges is related to the plastic moment of the section, and unloading paths are automatically considered. The conceptual design of the piping/restraint system is performed using interactive computer analysis. The application of the simplified analytical approach with interactive computer analysis results in an order of magnitude reduction in engineering time and computer cost. (Auth.)

  4. Lessons Learned from the ECML/PKDD Discovery Challenge on the Atherosclerosis Risk Factors Data

    Czech Academy of Sciences Publication Activity Database

    Berka, Petr; Rauch, Jan; Tomečková, Marie

    2007-01-01

    Roč. 26, č. 3 (2007), s. 329-344 ISSN 1335-9150 R&D Projects: GA MŠk(CZ) 1M06014; GA ČR GA201/05/0325 Grant - others:GA VŠE(CZ) 25/05 Institutional research plan: CEZ:AV0Z10300504 Keywords : atherosclerosis risk * data mining * discovery challenge Subject RIV: IN - Informatics, Computer Science Impact factor: 0.349, year: 2007 http://www.cai.sk/ojs/index.php/cai/article/view/313

  5. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  6. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  7. Computer-Aided Communication Satellite System Analysis and Optimization.

    Science.gov (United States)

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  8. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
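
    The embarrassingly parallel pattern the tutorial discusses can be sketched in Python (the article itself shows MATLAB and R examples): independent simulation replications are mapped across CPU cores with nothing shared between them. The toy Monte Carlo below is hypothetical, not one of the article's case studies.

      import multiprocessing as mp
      import random

      def one_replication(seed):
          # One independent simulation run (here: a toy Monte Carlo
          # estimating P(sum of 10 uniforms > 6) from a single draw).
          rng = random.Random(seed)
          return sum(rng.random() for _ in range(10)) > 6

      if __name__ == "__main__":
          n = 100_000
          # Replications share nothing, so they map cleanly onto a pool.
          with mp.Pool() as pool:
              hits = pool.map(one_replication, range(n), chunksize=1_000)
          print(f"estimated probability = {sum(hits) / n:.4f}")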

  9. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    A third-generation system for integrating computer programs for engineering analysis and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used as a repository of design data that are communicated between analysis programs, as a dictionary that describes these design data, as a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  10. Computer aided analysis of disturbances

    International Nuclear Information System (INIS)

    Baldeweg, F.; Lindner, A.

    1986-01-01

    Computer-aided analysis of disturbances and the prevention of failures (diagnosis and therapy control) in technological plants are among the most important tasks of process control. Research in this field is very intensive, due to increasing requirements for the security and economy of process control and a remarkable increase in the efficiency of digital electronics. This publication concerns the analysis of disturbances in complex technological plants, especially in so-called high-risk processes. The presentation emphasizes the theoretical concept of diagnosis and therapy control, modelling of the disturbance behaviour of the technological process, and man-machine communication integrating artificial intelligence methods, e.g., the expert system approach. An application is given for nuclear power plants. (author)

  11. Computational advances in transition phase analysis

    International Nuclear Information System (INIS)

    Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.

    1994-01-01

    In this paper, a historical perspective and recent advances are reviewed for computational technologies used to evaluate the transition phase of core disruptive accidents in liquid-metal fast reactors. An analysis of the transition phase requires treatment of multi-phase, multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort began with the SIMMER-series computer code development program, initiated in the late 1970s in the USA. Successful applications of the latest SIMMER-II in the USA, western Europe and Japan have proved its effectiveness, but, at the same time, several areas that require further research have been identified. Based on the experience and lessons learned during the SIMMER-II applications through the 1980s, a new project, the development of SIMMER-III, is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described, with emphasis on recent advances in multi-phase, multi-component fluid dynamics technologies and their expected implications for a future reliable transition phase analysis. (author)

  12. Computer-assisted education system for arrhythmia (CAESAR).

    Science.gov (United States)

    Fukushima, M; Inoue, M; Fukunami, M; Ishikawa, K; Inada, H; Abe, H

    1984-08-01

    A computer-assisted education system for arrhythmia (CAESAR) was developed to help students acquire the ability to logically diagnose complicated arrhythmias. The system has a logical simulator of cardiac rhythm using a mathematical model of the impulse formation and conduction system of the heart. A simulated arrhythmia (ECG pattern) is presented on a graphic display unit together with a simulated series of action potentials from five pacemaker centers and the "ladder diagram" of impulse formation and conduction, which show the mechanism of that arrhythmia. To evaluate the system, 13 medical students were given two types of tests concerning arrhythmias before and after 2 hours of learning with the system. The scores they obtained after learning increased significantly, from 73.3 +/- 11.9 to 93.2 +/- 3.0 (P less than 0.001) in one test and from 47.2 +/- 17.9 to 64.9 +/- 19.6 (P less than 0.001) in the other. These results showed that this CAI system is useful and effective for training in the ECG interpretation of arrhythmias.

  13. NGScloud: RNA-seq analysis of non-model species using cloud computing.

    Science.gov (United States)

    Mora-Márquez, Fernando; Vázquez-Poletti, José Luis; López de Heredia, Unai

    2018-05-03

    RNA-seq analysis usually requires large computing infrastructures. NGScloud is a bioinformatic system developed to analyze RNA-seq data using Amazon's cloud computing services, which permit access to ad hoc computing infrastructure scaled to the complexity of the experiment, so that its costs and run times can be optimized. The application provides a user-friendly front-end to operate Amazon's hardware resources and to control a workflow of RNA-seq analysis oriented to non-model species, incorporating the cluster concept, which allows parallel runs of common RNA-seq analysis programs in several virtual machines for faster analysis. NGScloud is freely available at https://github.com/GGFHF/NGScloud/. A manual detailing installation and how-to-use instructions is available with the distribution. unai.lopezdeheredia@upm.es.

  14. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes with scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol' indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times, which require a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows estimation of the sensitivity indices of each scalar model input, while the 'dispersion model' allows derivation of the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates the nuclear fuel irradiation.
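
    For orientation, a minimal Monte Carlo sketch of first-order Sobol' indices follows, using the pick-freeze (Saltelli-style) estimator on the Ishigami function, a standard analytical benchmark. This is not the joint GLM/GAM metamodeling approach of the paper, nor its industrial fuel-irradiation code.

      import numpy as np

      def ishigami(x, a=7.0, b=0.1):
          return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
                  + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

      rng = np.random.default_rng(0)
      n, d = 100_000, 3
      A = rng.uniform(-np.pi, np.pi, (n, d))
      B = rng.uniform(-np.pi, np.pi, (n, d))
      fA, fB = ishigami(A), ishigami(B)
      var = np.var(np.concatenate([fA, fB]))

      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]                  # resample only input i
          # Saltelli-style first-order estimator.
          Si = np.mean(fB * (ishigami(ABi) - fA)) / var
          print(f"S_{i + 1} ~= {Si:.3f}")      # analytic: 0.31, 0.44, 0.00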

  15. Computer Programme for the Dynamic Analysis of Tall Regular ...

    African Journals Online (AJOL)

    The traditional method of dynamic analysis of tall rigid frames assumes the shear frame model. Models that allow joint rotations with/without the inclusion of the column axial loads give improved results but pose much more computational difficulty. In this work a computer program Natfrequency that determines the dynamic ...

  16. COALA--A Computational System for Interlanguage Analysis.

    Science.gov (United States)

    Pienemann, Manfred

    1992-01-01

    Describes a linguistic analysis computational system that responds to highly complex queries about morphosyntactic and semantic structures contained in large sets of language acquisition data by identifying, displaying, and analyzing sentences that meet the defined linguistic criteria. (30 references) (Author/CB)

  17. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments with grid computing were presented, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. At the end, the expected evolution and future plans are outlined.

  18. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on "Computational Aspects of Analysis and Design of Dams." In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of the numerical aspects of dam risk analysis, showing that risk analysis methods are a very useful tool for analyzing the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.

  19. Computational analysis of cerebral cortex

    Energy Technology Data Exchange (ETDEWEB)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)

    2010-08-15

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  20. Computational analysis of cerebral cortex

    International Nuclear Information System (INIS)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni

    2010-01-01

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  1. Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences

    Directory of Open Access Journals (Sweden)

    Gregor Wiedemann

    2013-05-01

    Full Text Available Two developments in computational text analysis may change the way qualitative data analysis in social sciences is performed: 1. the availability of digital text worth investigating is growing rapidly, and 2. the improvement of algorithmic information extraction approaches, also called text mining, allows for further bridging the gap between qualitative and quantitative text analysis. The key factor hereby is the inclusion of context into computational linguistic models, which extends conventional computational content analysis towards the extraction of meaning. To clarify the methodological differences between various computer-assisted text analysis approaches, the article suggests a typology from the perspective of a qualitative researcher. This typology shows compatibilities between manual qualitative data analysis methods and computational, rather quantitative approaches for large scale mixed method text analysis designs. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1302231

  2. STAF: A Powerful and Sophisticated CAI System.

    Science.gov (United States)

    Loach, Ken

    1982-01-01

    Describes the STAF (Science Teacher's Authoring Facility) computer-assisted instruction system developed at Leeds University (England), focusing on STAF language and major program features. Although programs for the system emphasize physical chemistry and organic spectroscopy, the system and language are general purpose and can be used in any…

  3. Building a Prototype of LHC Analysis Oriented Computing Centers

    Science.gov (United States)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well established concept, with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  4. Building a Prototype of LHC Analysis Oriented Computing Centers

    International Nuclear Information System (INIS)

    Bagliesi, G; Boccali, T; Della Ricca, G; Donvito, G; Paganoni, M

    2012-01-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well established concept, with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  5. Report--COMOLA: A Computer System for the Analysis of Interlanguage Data.

    Science.gov (United States)

    Jagtman, Margriet; Bongaerts, Theo

    1994-01-01

    Discusses the design and use of the Computer Model for Language Acquisition (COMOLA), a computer program designed to analyze syntactic development in second-language learners by examining their oral utterances. Also compares COMOLA to the recently developed Computer-Aided Linguistic Analysis (COALA) program. (MDM)

  6. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or computer implementation--of numerical algorithms, depending on the background and interests of students. Designed for upper-division undergraduates in mathematics or computer science classes, the textbook assumes that students have prior knowledge of linear algebra and calculus, although these topics are reviewed in the text. Short discussions of the history of numerical methods are interspersed throughout the chapters. The book a...

  7. Propagation properties of the chirped Airy beams through the gradient-index medium

    Science.gov (United States)

    Feng, Liyan; Zhang, Jianbin; Pang, Zihao; Wang, Linyi; Zhong, Tianfen; Yang, Xiangbo; Deng, Dongmei

    2017-11-01

    Through analytical derivation and numerical analysis, the propagation properties of chirped Airy (CAi) beams in a gradient-index medium are investigated. The intensity and phase distributions, the propagation trajectory and the Poynting vector of the CAi beams are demonstrated to investigate the propagation properties. Owing to the special and symmetrical refractive index profile of the gradient-index medium, the CAi beams propagate periodically. The effects of the distribution factor and the chirped parameter on the propagation of the CAi beams are analyzed. As the distribution factor increases, the intensity distribution of the CAi beams becomes more scattered. As the chirped parameter increases, however, the focusing of the CAi beams strengthens. The variation of the chirped parameter can change the position of the peak intensity maximum, but it cannot alter the period of the peak intensity. The variations of the initial phase and of the beam energy in the transverse plane speed up accordingly.

  8. Computer System Analysis for Decommissioning Management of Nuclear Reactor

    International Nuclear Information System (INIS)

    Nurokhim; Sumarbagiono

    2008-01-01

    Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully. A computer-based system needs to be developed to support nuclear reactor decommissioning. Some computer systems for the management of nuclear power reactors have been studied. The software systems COSMARD and DEXUS, developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, calculation modules for the stages of the decommissioning phase, and spatial data system development for virtual reality. (author)

  9. Computer-Assisted Instruction Case Study: The Introductory Marketing Course.

    Science.gov (United States)

    Skinner, Steven J.; Grimm, Jim L.

    1979-01-01

    Briefly reviews research on the effectiveness of CAI in instruction, and describes a study comparing the performance of students using one program for basic marketing, TRMP (Tutorial Review of Marketing Principles), with or without a study guide, the study guide alone, and a traditional class. (BBM)

  10. Isotopic analysis of plutonium by computer controlled mass spectrometry

    International Nuclear Information System (INIS)

    1974-01-01

    Isotopic analysis of plutonium chemically purified by ion exchange is achieved using a thermal ionization mass spectrometer. Data acquisition from and control of the instrument is done automatically with a dedicated system computer in real time with subsequent automatic data reduction and reporting. Separation of isotopes is achieved by varying the ion accelerating high voltage with accurate computer control

  11. Investigating the computer analysis of eddy current NDT data

    International Nuclear Information System (INIS)

    Brown, R.L.

    1979-01-01

    The objective of this activity was to investigate and develop techniques for computer analysis of eddy current nondestructive testing (NDT) data. A single frequency commercial eddy current tester and a precision mechanical scanner were interfaced with a PDP-11/34 computer to obtain and analyze eddy current data from samples of 316 stainless steel tubing containing known discontinuities. Among the data analysis techniques investigated were: correlation, Fast Fourier Transforms (FFT), clustering, and Adaptive Learning Networks (ALN). The results were considered encouraging. ALN, for example, correctly identified 88% of the defects and non-defects from a group of 153 signal indications
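
    By way of illustration of two of the techniques listed above (FFT-based features and correlation), the sketch below extracts simple features from a synthetic eddy current trace and applies hypothetical decision thresholds; it is not the PDP-11/34 analysis chain of the report, and the signal, band and thresholds are assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic eddy current scan: background noise plus a flaw-like pulse
        t = np.linspace(0.0, 1.0, 1024)
        trace = 0.02 * rng.standard_normal(t.size)
        trace += 0.5 * np.exp(-((t - 0.6) / 0.01) ** 2)

        # FFT feature: spectral energy in a mid-frequency band
        band_energy = (np.abs(np.fft.rfft(trace))[20:120] ** 2).sum()

        # Correlation feature: peak match against a reference flaw template
        template = np.exp(-((t - 0.5) / 0.01) ** 2)
        peak_corr = np.correlate(trace, template, mode="full").max()

        # Hypothetical decision thresholds for the two features
        is_defect = band_energy > 1.0 and peak_corr > 1.0
        print(f"band energy {band_energy:.2f}, peak corr {peak_corr:.2f} ->",
              "defect" if is_defect else "no defect")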

  12. Dietary exposure to aflatoxin B-1, ochratoxin A and fumonisins of adults in Lao Cai province, Viet Nam: A total dietary study approach

    DEFF Research Database (Denmark)

    Bui, Huong Mai; Le Danh Tuyen; Do Huu Tuan

    2016-01-01

    Aflatoxins, fumonisins and ochratoxin A that contaminate various agricultural commodities are considered of significant toxicity and potent human carcinogens. This study took a total dietary study approach and estimated the dietary exposure of these mycotoxins for adults living in Lao Cai province...... higher than recommended provisional tolerable daily intake (PTDI) values mainly due to contaminated cereals and meat. The exposure to total fumonisins (1400 ng/kg bw/day) was typically lower than the PTDI value (2000 ng/kg bw/day). The estimated risk of liver cancer associated with exposure to aflatoxin...... B1 was 2.7 cases/100,000 person/year. Margin of exposure (MOE) of renal cancer linked to ochratoxin A and liver cancer associated with fumonisins were 1124 and 1954, respectively indicating risk levels of public health concern. Further studies are needed to evaluate the efficiency of technical...

  13. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice, e.g., to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body.

  14. Computed image analysis of neutron radiographs

    International Nuclear Information System (INIS)

    Dinca, M.; Anghel, E.; Preda, M.; Pavelescu, M.

    2008-01-01

    Similar to X-radiography, but using neutrons as the penetrating particles, there is in practice a nondestructive technique named neutron radiology. When the information is registered on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β,γ) creating a latent image, the technique is named neutron radiography. A radiographic industrial film that contains the image of the internal structure of an object, obtained by neutron radiography, must subsequently be analyzed to obtain qualitative and quantitative information about the structural integrity of that object. It is possible to perform a computed analysis of a film using a facility with the following main components: an illuminator for the film, a CCD video camera and a computer (PC) with suitable software. The qualitative analysis aims to reveal possible anomalies of the structure due to manufacturing processes or induced by working processes (for example, the irradiation activity in the case of nuclear fuel). The quantitative determination is based on measurements of some image parameters: dimensions and optical densities. The illuminator was built specially for this application but can also be used for simple visual observation. The illuminated area is 9x40 cm. The frame of the system is a comparer of the Abbe Carl Zeiss Jena type, which was adapted for this application. The video camera captures the image, which is stored and processed by the computer. A special program, SIMAG-NG, was developed at INR Pitesti, which, together with the program SMTV II of the special acquisition module SM 5010, can analyze the images on a film. The major application of the system was the quantitative analysis of a film containing the images of nuclear fuel pins beside a dimensional standard. The system was used to measure the length of the pellets of the TRIGA nuclear fuel. (authors)

  15. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.

  16. Freeform surface measurement and characterisation using a toolmakers microscope

    International Nuclear Information System (INIS)

    Wong, Francis Seung-yin; Chauh, Kong-Bieng; Venuvinod, Patri K

    2014-01-01

    Current freeform surface (FFS) characterization systems mainly cover aspects related to computer-aided design/manufacture (CAD/CAM). This paper describes a new approach that extends into computer-aided inspection (CAI). The following novel features are addressed: feature recognition and extraction from surface data; characterisation of properties of the surface's M and N vectors at individual vertices; development of a measuring plan using a toolmakers microscope for the inspection of the FFS; inspection of the actual FFS produced by CNC milling; and verification of the measurement results and comparison with the CAD design data. Tests have shown that the deviations between the CAI and CAD data were within the estimated uncertainty limits.

  17. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  18. Adapting computational text analysis to social science (and vice versa

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Full Text Available Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models trained on data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  19. Development of computational methods of design by analysis for pressure vessel components

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin

    2005-01-01

    Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty which puzzles engineers and designers at all times. At present, for calculating and categorizing the stress field of pressure vessel components, several computational methods of design by analysis have been developed and applied, such as Stress Equivalent Linearization, the Two-Step Approach, the Primary Structure method, the Elastic Compensation method, the GLOSS R-Node method and so on. Moreover, the ASME code also gives an inelastic method of design by analysis for limiting gross plastic deformation only. When pressure vessel components are designed by analysis, there are sometimes huge differences between the results calculated using the different analysis methods mentioned above. In consequence, this is the main obstacle to the wide application of the design-by-analysis approach. Recently, a new approach, presented in the proposal of a European Standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by analyzing the various failure mechanisms of the pressure vessel structure directly, based on elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly, and the computational methods cited in the European pressure vessel standard, such as the Deviatoric Map and nonlinear analysis methods (plastic analysis and limit analysis), are depicted compendiously. Furthermore, the characteristics of the computational methods of design by analysis are summarized to aid in selecting the proper computational method when designing a pressure vessel component by analysis. (authors)

  20. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  1. Aerodynamic analysis of Pegasus - Computations vs reality

    Science.gov (United States)

    Mendenhall, Michael R.; Lesieutre, Daniel J.; Whittaker, C. H.; Curry, Robert E.; Moulton, Bryan

    1993-01-01

    Pegasus, a three-stage, air-launched, winged space booster was developed to provide fast and efficient commercial launch services for small satellites. The aerodynamic design and analysis of Pegasus was conducted without benefit of wind tunnel tests using only computational aerodynamic and fluid dynamic methods. Flight test data from the first two operational flights of Pegasus are now available, and they provide an opportunity to validate the accuracy of the predicted pre-flight aerodynamic characteristics. Comparisons of measured and predicted flight characteristics are presented and discussed. Results show that the computational methods provide reasonable aerodynamic design information with acceptable margins. Post-flight analyses illustrate certain areas in which improvements are desired.

  2. Conversation Analysis in Computer-Assisted Language Learning

    Science.gov (United States)

    González-Lloret, Marta

    2015-01-01

    The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…

  3. A Theory-Based Exploration of Condomless Anal Intercourse Intention Among Young Men Who Have Sex with Men of Different Sexual Roles in Taiwan.

    Science.gov (United States)

    Chu, Jen-Hao; Huang, Jiun-Hau

    2017-11-28

    In recent years, men who have sex with men (MSM) have accounted for over 80% of all new HIV cases in Taiwan. More than 70% of new cases have occurred in those aged 15-34 years. Condomless anal intercourse (CAI) has been identified as the main route of HIV transmission among MSM. To systematically examine CAI intention and associated factors among young MSM in Taiwan, an anonymous online survey based on the Theory of Planned Behavior (TPB) was conducted. Data from 694 MSM aged 15-39 years were included in the analysis. This study found that, overall, all five TPB factors (i.e., attitudes toward positive and negative outcomes regarding CAI, perceived support for CAI from important others, and perceived behavioral control of CAI under facilitating and constraining conditions) were significantly associated with CAI intention. When data were stratified by sexual role (i.e., receptive, versatile, and insertive), the associations between TPB factors and CAI intention varied. Of the five TPB factors, positive attitudes toward positive outcomes regarding CAI were most strongly associated with high CAI intention (AOR 5.68 for all young MSM; AOR 3.80-15.93, depending on sexual role). Findings from this study could inform the development of theory-driven HIV prevention programs as well as future research and practice. These results also highlight the importance of tailoring HIV prevention initiatives for young MSM of different sexual roles to optimize the program effectiveness.

  4. Introducing remarks upon the analysis of computer systems performance

    International Nuclear Information System (INIS)

    Baum, D.

    1980-05-01

    Some of the basic ideas of analytical techniques to study the behaviour of computer systems are presented. Single systems as well as networks of computers are viewed as stochastic dynamical systems which may be modelled by queueing networks. Therefore this report primarily serves as an introduction to probabilistic methods for qualitative analysis of systems. It is supplemented by an application example of Chandy's collapsing method. (orig.) [de

  5. Computer-aided Fault Tree Analysis

    International Nuclear Information System (INIS)

    Willie, R.R.

    1978-08-01

    A computer-oriented methodology for deriving minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense
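
    The core of the methodology, deriving a minimal cut set family from the gate logic, can be sketched in a few lines; the toy tree and the top-down expansion with subset-based minimization below are illustrative assumptions, not FTAP's FORTRAN implementation.

        from itertools import product

        # Fault tree: gate name -> (type, children); names not in the dict
        # are basic events (a small hypothetical tree for illustration)
        tree = {
            "TOP": ("OR",  ["G1", "E3"]),
            "G1":  ("AND", ["E1", "G2"]),
            "G2":  ("OR",  ["E2", "E3"]),
        }

        def cut_sets(node):
            if node not in tree:              # basic event
                return [frozenset([node])]
            kind, children = tree[node]
            sets = [cut_sets(c) for c in children]
            if kind == "OR":                  # union of the children's families
                return [s for family in sets for s in family]
            # AND gate: every combination of one cut set per child
            return [frozenset().union(*combo) for combo in product(*sets)]

        def minimal(family):
            out = []
            for s in sorted(set(family), key=len):
                if not any(t <= s for t in out):   # drop non-minimal supersets
                    out.append(s)
            return out

        print([sorted(s) for s in minimal(cut_sets("TOP"))])
        # -> [['E3'], ['E1', 'E2']]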

  6. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that add assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new design of computer mouse using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were compared to determine the environmental impact categories. Sustainability analysis was conducted using the software SolidWorks. As a result, high-density PE gives the lowest amounts in the environmental impact categories while offering a high maximum stress value.

  7. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  8. Analysis of Biosignals During Immersion in Computer Games.

    Science.gov (United States)

    Yeo, Mina; Lim, Seokbeen; Yoon, Gilwon

    2017-11-17

    The number of computer game users is increasing as computers and various Internet-connected IT devices become commonplace among all age groups. In this research, in order to examine the relationship between behavioral activity and its associated biosignals, biosignal changes before, during, and after computer games were measured and analyzed for 31 subjects. For this purpose, a device measuring electrocardiogram, photoplethysmogram and skin temperature was developed such that the effect of motion artifacts could be minimized. The device was made wearable for convenient measurement. The game selected for the experiments was League of Legends™. Analysis of the pulse transit time, heart rate variability and skin temperature showed increased sympathetic nerve activity during computer games, while the parasympathetic nerves became less active. Interestingly, the sympathetic predominance group showed less change in heart rate variability than the normal group. The results can be valuable for studying internet gaming disorder.
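
    As an illustration of the heart rate variability measures referred to above, here is a minimal sketch with made-up RR intervals; it is not the study's analysis pipeline.

        import numpy as np

        # RR intervals in milliseconds (hypothetical excerpt from one subject)
        rr = np.array([812, 798, 805, 775, 760, 742, 730, 745, 738, 751], float)

        sdnn = rr.std(ddof=1)                       # overall variability
        rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # short-term, vagally mediated
        mean_hr = 60000.0 / rr.mean()               # beats per minute

        # A drop in RMSSD during play would be consistent with the reduced
        # parasympathetic activity reported in the study.
        print(f"HR {mean_hr:.1f} bpm, SDNN {sdnn:.1f} ms, RMSSD {rmssd:.1f} ms")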

  9. Computer methods for transient fluid-structure analysis of nuclear reactors

    International Nuclear Information System (INIS)

    Belytschko, T.; Liu, W.K.

    1985-01-01

    Fluid-structure interaction problems in nuclear engineering are categorized according to the dominant physical phenomena and the appropriate computational methods. Linear fluid models that are considered include acoustic fluids, incompressible fluids undergoing small disturbances, and small amplitude sloshing. Methods available in general-purpose codes for these linear fluid problems are described. For nonlinear fluid problems, the major features of alternative computational treatments are reviewed; some special-purpose and multipurpose computer codes applicable to these problems are then described. For illustration, some examples of nuclear reactor problems that entail coupled fluid-structure analysis are described along with computational results

  10. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  11. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

    In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role as theories of human cognition. Many computational models have successfully simulated the results of controlled psychological experiments. However, there have been only a few attempts to apply the models to complex realistic phenomena. We call such a situation an "open-ended situation". In this study, MAC/FAC ("many are called, but few are chosen"), proposed by [Forbus 95], which models the two stages of analogical reasoning, was applied to our open-ended psychological experiment. In our experiment, subjects were presented a cue story and retrieved cases that had been learned in their everyday life. Following this, they rated the inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support MAC/FAC's theoretical assumption that different similarities are involved at the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognitions.
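
    The first (MAC) stage is computationally cheap and can be sketched directly: content vectors are compared with a dot product, and every case scoring within roughly 10% of the best is passed on to the expensive structural (FAC) stage. The vocabulary and vectors below are hypothetical.

        import numpy as np

        # Content vectors: counts of predicates appearing in each stored case
        cue = np.array([2.0, 1.0, 1.0, 1.0, 0.0])   # hypothetical cue story
        cases = {
            "water-flow":   np.array([2.0, 1.0, 1.0, 1.0, 0.0]),
            "heat-flow":    np.array([2.0, 1.0, 1.0, 0.0, 1.0]),
            "solar-system": np.array([1.0, 0.0, 1.0, 0.0, 0.0]),
        }

        scores = {name: float(cue @ vec) for name, vec in cases.items()}
        best = max(scores.values())
        # "Many are called": keep everything within 10% of the best match,
        # then hand the survivors to the structural (FAC) matcher
        called = [name for name, s in scores.items() if s >= 0.9 * best]
        print(scores, "->", called)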

  12. Computational methods for fracture mechanics analysis of pressurized-thermal-shock experiments

    International Nuclear Information System (INIS)

    Bass, B.R.; Bryan, R.H.; Bryson, J.W.; Merkle, J.G.

    1984-01-01

    Extensive computational analyses are required to determine material parameters and optimum pressure-temperature transients compatible with proposed pressurized-thermal-shock (PTS) test scenarios and with the capabilities of the PTS test facility at the Oak Ridge National Laboratory (ORNL). Computational economy has led to the application of techniques suitable for parametric studies involving the analysis of a large number of transients. These techniques, which include analysis capability for two- and three-dimensional (2-D and 3-D) superposition, inelastic ligament stability, and upper-shelf arrest, have been incorporated into the OCA/USA computer program. Features of the OCA/USA program are discussed, including applications to the PTS test configuration

  13. Computational methods for fracture mechanics analysis of pressurized-thermal-shock experiments

    International Nuclear Information System (INIS)

    Bass, B.R.; Bryan, R.H.; Bryson, J.W.; Merkle, J.G.

    1984-01-01

    Extensive computational analyses are required to determine material parameters and optimum pressure-temperature transients compatible with proposed pressurized-thermal-shock (PTS) test scenarios and with the capabilities of the PTS test facility at the Oak Ridge National Laboratory (ORNL). Computational economy has led to the application of techniques suitable for parametric studies involving the analysis of a large number of transients. These techniques, which include analysis capability for two- and three-dimensional (2-D and 3-D) superposition, inelastic ligament stability, and upper-shelf arrest, have been incorporated into the OCA/USA computer program. Features of the OCA/USA program are discussed, including applications to the PTS test configuration. (author)

  14. Computer-aided visualization and analysis system for sequence evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.

    2004-05-11

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.
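
    A minimal sketch of the intensity-based base calling the abstract describes, with a hypothetical chip readout (the actual system also pools evidence across experiments):

        import numpy as np

        # Hybridisation intensities of the four interrogation probes (A, C, G, T)
        # at each base position; rows are positions (hypothetical readout)
        intensities = np.array([
            [950.0, 120.0, 100.0,  80.0],
            [110.0, 870.0,  95.0, 130.0],
            [105.0, 140.0, 115.0, 910.0],
        ])
        bases = np.array(list("ACGT"))

        calls = bases[np.argmax(intensities, axis=1)]   # best probe per position
        top2 = np.sort(intensities, axis=1)[:, -2:]
        confidence = top2[:, 1] / top2[:, 0]            # best vs runner-up probe
        print("".join(calls), np.round(confidence, 1))  # -> ACT [7.9 6.7 6.5]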

  15. Overview of adaptive finite element analysis in computational geodynamics

    Science.gov (United States)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade-off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite element analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion of the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  16. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  17. Computation system for nuclear reactor core analysis

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code, treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. A continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals

  18. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operating procedures on the man-machine interfaces of a control room, provides quantified assessment, and at the same time analyzes operator error rates by means of techniques for human error rate prediction. Problems in the placement of man-machine interfaces in a control room and in the arrangement of instruments can be detected from the simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant

  19. Computer-Assisted Digital Image Analysis of Plus Disease in Retinopathy of Prematurity.

    Science.gov (United States)

    Kemp, Pavlina S; VanderVeen, Deborah K

    2016-01-01

    The objective of this study is to review the current state and role of computer-assisted analysis in diagnosis of plus disease in retinopathy of prematurity. Diagnosis and documentation of retinopathy of prematurity are increasingly being supplemented by digital imaging. The incorporation of computer-aided techniques has the potential to add valuable information and standardization regarding the presence of plus disease, an important criterion in deciding the necessity of treatment of vision-threatening retinopathy of prematurity. A review of literature found that several techniques have been published examining the process and role of computer aided analysis of plus disease in retinopathy of prematurity. These techniques use semiautomated image analysis techniques to evaluate retinal vascular dilation and tortuosity, using calculated parameters to evaluate presence or absence of plus disease. These values are then compared with expert consensus. The study concludes that computer-aided image analysis has the potential to use quantitative and objective criteria to act as a supplemental tool in evaluating for plus disease in the setting of retinopathy of prematurity.
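
    Of the two vascular parameters mentioned, tortuosity has the simplest common definition: the arc length of the vessel centreline divided by the chord between its endpoints. A minimal sketch with hypothetical centreline points (not one of the published semiautomated systems):

        import numpy as np

        # Sampled centreline points of a retinal vessel segment (hypothetical, px)
        pts = np.array([[0, 0], [5, 2], [10, -1], [15, 3], [20, 0]], dtype=float)

        arc = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))  # path length
        chord = np.linalg.norm(pts[-1] - pts[0])                    # end-to-end
        tortuosity = arc / chord

        print(f"tortuosity index {tortuosity:.3f}")  # 1.0 = straight vessel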

  20. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District are computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV at a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation
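
    The ratio-based overhead apportionment described above amounts to a one-line calculation per project; a minimal sketch with hypothetical hours (the program itself is FORTRAN IV, but the arithmetic is shown here in Python):

        # Hours charged directly to each project this month (hypothetical)
        direct = {"proj_A": 320.0, "proj_B": 160.0, "proj_C": 120.0}
        overhead_hours = 150.0

        total_direct = sum(direct.values())
        for proj, hours in direct.items():
            # Overhead share proportional to the project's share of direct hours
            share = overhead_hours * hours / total_direct
            print(f"{proj}: {hours:.0f} direct + {share:.1f} overhead"
                  f" = {hours + share:.1f} total")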

  1. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability in existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions and obtain result probability distributions. The methods are applicable to low-level radioactive waste disposal system performance assessment
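
    The derivative-based propagation step of a DUA-style analysis can be illustrated with first-order variance propagation; the model, nominal values and standard deviations below are hypothetical, and the gradients are written by hand where GRESS/ADGEN would generate them from the code itself.

        import numpy as np

        def y(x):
            # Hypothetical model response
            return x[0] ** 2 * np.exp(0.5 * x[1]) / (1.0 + x[2])

        def grad_y(x):
            base = y(x)
            return np.array([2.0 * base / x[0],       # dy/dx0
                             0.5 * base,              # dy/dx1
                             -base / (1.0 + x[2])])   # dy/dx2

        x0 = np.array([3.0, 1.0, 0.2])       # nominal input values (hypothetical)
        sigma = np.array([0.1, 0.05, 0.02])  # input standard deviations

        g = grad_y(x0)
        # First-order propagation, assuming independent inputs:
        # Var(y) ~= sum_i (dy/dx_i)^2 Var(x_i)
        std_y = np.sqrt(np.sum((g * sigma) ** 2))
        print(f"y = {y(x0):.4f} +/- {std_y:.4f} (1 sigma)")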

  2. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  3. Status of computer codes available in AEOI for reactor physics analysis

    International Nuclear Information System (INIS)

    Karbassiafshar, M.

    1986-01-01

    Many of the nuclear computer codes available at the Atomic Energy Organization of Iran (AEOI) can be used for the physics analysis of an operating reactor or for design purposes. A grasp of the various methods involved and practical experience with these codes would be the starting point for interesting design studies or analysis of the operating conditions of presently existing and future reactors. A review of the objectives and flowchart of commonly practiced procedures in reactor physics analysis of LWRs and the related computer codes was made, extrapolating to the nationally and internationally available resources. Finally, effective utilization of the existing facilities is discussed and called for

  4. Componential analysis of kinship terminology a computational perspective

    CERN Document Server

    Pericliev, V

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  5. Analysis of sponge zones for computational fluid mechanics

    International Nuclear Information System (INIS)

    Bodony, Daniel J.

    2006-01-01

    The use of sponge regions, or sponge zones, which add the forcing term -σ(q - q_ref) to the right-hand side of the governing equations in computational fluid mechanics as an ad hoc boundary treatment is widespread. They are used to absorb and minimize reflections from computational boundaries and as forcing sponges to introduce prescribed disturbances into a calculation. A less common usage is as a means of extending a calculation from a smaller domain into a larger one, such as in computing the far-field sound generated in a localized region. By analogy to the penalty method of finite elements, the method is placed on a solid foundation, complete with estimates of convergence. The analysis generalizes the work of Israeli and Orszag [M. Israeli, S.A. Orszag, Approximation of radiation boundary conditions, J. Comp. Phys. 41 (1981) 115-135] and confirms their findings when applied as a special case to one-dimensional wave propagation in an absorbing sponge. It is found that the rate of convergence of the actual solution to the target solution, with an appropriate norm, is inversely proportional to the sponge strength. A detailed analysis for acoustic wave propagation in one-dimension verifies the convergence rate given by the general theory. The exponential point-wise convergence derived by Israeli and Orszag in the high-frequency limit is recovered and found to hold over all frequencies. A weakly nonlinear analysis of the method when applied to Burgers' equation shows similar convergence properties. Three numerical examples are given to confirm the analysis: the acoustic extension of a two-dimensional time-harmonic point source, the acoustic extension of a three-dimensional initial-value problem of a sound pulse, and the introduction of unstable eigenmodes from linear stability theory into a two-dimensional shear layer
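
    A minimal numerical illustration of the forcing term: a 1-D advected pulse is absorbed by a sponge layer whose strength σ(x) ramps up near the outflow boundary. The grid, ramp shape and strength below are assumptions for the sketch, not the paper's test cases.

        import numpy as np

        # 1-D linear advection u_t + c u_x = -sigma(x) (u - u_ref)
        nx, c = 400, 1.0
        x = np.linspace(0.0, 1.0, nx)
        dx = x[1] - x[0]
        dt, nsteps = 0.5 * dx / c, 800

        u = np.exp(-((x - 0.3) / 0.05) ** 2)  # initial Gaussian pulse
        u_ref = np.zeros(nx)                  # target state inside the sponge

        # Sponge strength: zero in the interior, quadratic ramp for x > 0.8
        sigma = np.where(x > 0.8, 500.0 * ((x - 0.8) / 0.2) ** 2, 0.0)

        for _ in range(nsteps):
            dudx = (u - np.roll(u, 1)) / dx   # first-order upwind (c > 0)
            u = u - dt * (c * dudx + sigma * (u - u_ref))

        # The pulse has crossed the sponge; little should re-enter at the left
        print(f"max |u| remaining: {np.abs(u).max():.2e}")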

  6. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. For this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50-million-event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative calibration and alignment executions, data distribution to regional sites, and end-user analysis. Grid tools provided by the LCG project are also being exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  7. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. For this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50-million-event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative calibration and alignment executions, data distribution to regional sites, and end-user analysis. Grid tools provided by the LCG project are also being exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  8. Fast Virtual Fractional Flow Reserve Based Upon Steady-State Computational Fluid Dynamics Analysis

    Directory of Open Access Journals (Sweden)

    Paul D. Morris, PhD

    2017-08-01

    Full Text Available Fractional flow reserve (FFR)-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel “pseudotransient” analysis protocol for computing virtual fractional flow reserve (vFFR) based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf. >24 h for transient analysis) using a desktop PC, with <1% error relative to that of full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33%) and more by microvascular physiology (59%). If coronary microvascular resistance can be estimated, vFFR can be accurately computed in less time than it takes to make invasive measurements.
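
    For context, FFR itself is just the ratio of distal to proximal pressure at hyperaemia. A reduced-order sketch using a quadratic stenosis pressure-loss model follows; the coefficients and flow value are illustrative assumptions, not the authors' CFD protocol.

        # FFR = Pd / Pa with a simple stenosis loss model dP = f*Q + s*Q^2
        Pa = 93.0        # mean aortic pressure, mmHg (hypothetical)
        Q = 3.0          # hyperaemic flow, mL/s (hypothetical)
        f, s = 2.5, 1.4  # viscous and separation loss coefficients (hypothetical)

        dP = f * Q + s * Q ** 2        # pressure drop across the lesion
        ffr = (Pa - dP) / Pa
        print(f"FFR ~= {ffr:.2f}")     # <= 0.80 is the usual treatment cut-off
        # Note how the result hinges on Q, i.e. on microvascular resistance,
        # echoing the sensitivity findings quoted above.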

  9. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  10. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high-performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods that support data analysis, and 3) progress in addressing the harmonisation of the underlying data collections for future transdisciplinary research enabling accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected, large-scale, high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated, reusable software and workflows from earth system and ecosystem modelling, weather research, and satellite and other observational data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of available data and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  11. Informational-computer system for the neutron spectra analysis

    International Nuclear Information System (INIS)

    Berzonis, M.A.; Bondars, H.Ya.; Lapenas, A.A.

    1979-01-01

    This article presents the basic principles behind the construction of an informational-computer system for neutron spectrum analysis on the basis of measured reaction rates. The basic data files of the system, as well as the software and hardware needed for its operation, are described.

  12. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  13. Damage tolerance of candidate thermoset composites for use on single stage to orbit vehicles

    Science.gov (United States)

    Nettles, A. T.; Lance, D.; Hodge, A.

    1994-01-01

    Four fiber/resin systems were compared for resistance to damage and damage tolerance. One toughened epoxy and three toughened bismaleimide (BMI) resins were used, all with IM7 carbon fiber reinforcement. A statistical design of experiments technique was used to evaluate the effects of impact energy, specimen thickness, and impactor diameter on the damage area, as computed by C-scans, and residual compression-after-impact (CAI) strength. Results showed that two of the BMI systems sustained relatively large damage zones yet had an excellent retention of CAI strength.

  14. Computational Fatigue Life Analysis of Carbon Fiber Laminate

    Science.gov (United States)

    Shastry, Shrimukhi G.; Chandrashekara, C. V., Dr.

    2018-02-01

    In the present scenario, many traditional materials are being replaced by composite materials for their light weight and high strength. Industries such as the automotive and aerospace industries use composite materials for most of their components. Replacing components subjected to static or impact loads is less challenging than replacing components subjected to dynamic loading, which demands many stages of parametric study. One such parametric study is the fatigue analysis of the composite material. This paper focuses on the fatigue life analysis of a composite material by using computational techniques. A composite plate with a hole at its center is considered for the study. The analysis is carried out on a (0°/90°/90°/90°/90°)s laminate sequence and a (45°/-45°)2s laminate sequence by using a computer script. The fatigue lives for the two lay-up sequences are compared with each other. It is observed that, for the same material and geometry of the component, cross-ply laminates show better fatigue life than angle-ply laminates.

  15. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed, assuming laminar blood flow with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software package, coupled with SolidWorks, a modeling software package, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branches and angled geometries, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.

  16. Growth modeling of Cryptomeria japonica by partial trunk analysis

    Directory of Open Access Journals (Sweden)

    Vinícius Morais Coutinho

    2017-06-01

    Full Text Available This study aimed to evaluate the increment growth pattern of Cryptomeria japonica (L. f.) D. Don and to describe its probability distribution in stands established in the municipality of Rio Negro, Paraná State. Twenty trees were sampled in a 34-year-old stand with 3 m x 2 m spacing. Wood disks were taken from each tree at 1.3 m above the ground (DBH) to perform partial stem analysis. Diameter growth series without bark were used to generate the average cumulative growth curves for DBH (cm), mean annual increment (MAI) and current annual increment (CAI). From the increment data, the frequency distribution was evaluated by means of probability density functions (pdfs). The mean annual increment for DBH was 0.78 cm year-1, and the age at which the CAI and MAI curves intersect was between the 7th and 8th years. It was found that nearly 43% of the species' increments are concentrated below 0.5 cm. The results are useful for defining appropriate management strategies for the species on sites similar to the study region, for example the ages of silvicultural interventions such as thinning.
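
    The two increment measures above have simple definitions worth making explicit: MAI at age t is cumulative growth divided by t, CAI is the growth added in year t, and the two curves cross at the age that maximizes MAI. A minimal sketch with made-up diameter data (illustrative values, not the study's measurements):

        import numpy as np

        # Hypothetical cumulative DBH (cm) at ages 1..10; illustrative only.
        age = np.arange(1, 11)
        dbh = np.array([0.5, 1.4, 2.6, 4.0, 5.3, 6.4, 7.3, 8.0, 8.6, 9.1])

        mai = dbh / age                          # mean annual increment
        cai = np.diff(dbh, prepend=0.0)          # current annual increment

        # CAI = MAI where MAI peaks; that age is a classic rotation criterion.
        print("CAI and MAI curves cross near age", age[np.argmax(mai)])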

  17. Non-LTE profiles of strong solar lines

    Science.gov (United States)

    Schneeberger, T. J.; Beebe, H. A.

    1976-01-01

    The complete linearization method is applied to the formation of strong lines in the solar atmosphere. Transitions in Na(I), Mg(I), Ca(I), Mg(II), and Ca(II) are computed with a standard atmosphere and microturbulent velocity model. The computed profiles are compared to observations at disk center.

  18. Ubiquitous computing in sports: A review and analysis.

    Science.gov (United States)

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for the synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade, and this development has propagated into applied sport science and everyday life. This work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as sensors for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - the development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context-awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rule compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in the future will shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and the standardisation of measurement and transmission protocols.

  19. Computer image analysis of etched tracks from ionizing radiation

    Science.gov (United States)

    Blanford, George E.

    1994-01-01

    I proposed to continue a cooperative research project with Dr. David S. McKay concerning image analysis of tracks. Last summer we showed that we could measure track densities using the Oxford Instruments eXL computer and software that is attached to an ISI scanning electron microscope (SEM) located in building 31 at JSC. To reduce the dependence on JSC equipment, we proposed to transfer the SEM images to UHCL for analysis. Last summer we developed techniques to use digitized scanning electron micrographs and computer image analysis programs to measure track densities in lunar soil grains. Tracks were formed by highly ionizing solar energetic particles and cosmic rays during near surface exposure on the Moon. The track densities are related to the exposure conditions (depth and time). Distributions of the number of grains as a function of their track densities can reveal the modality of soil maturation. As part of a consortium effort to better understand the maturation of lunar soil and its relation to its infrared reflectance properties, we worked on lunar samples 67701,205 and 61221,134. These samples were etched for a shorter time (6 hours) than last summer's sample and this difference has presented problems for establishing the correct analysis conditions. We used computer counting and measurement of area to obtain preliminary track densities and a track density distribution that we could interpret for sample 67701,205. This sample is a submature soil consisting of approximately 85 percent mature soil mixed with approximately 15 percent immature, but not pristine, soil.
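
    A minimal sketch of the measurement loop described here: threshold a digitized micrograph, label the connected track pits, and divide the count by the imaged area. The threshold and the pixel scale are placeholder values, not the settings used on the JSC system:

        import numpy as np
        from scipy import ndimage

        def track_density(image, threshold, um_per_px):
            """Count etched track pits in a grayscale image; return tracks/cm^2."""
            pits = image < threshold                 # tracks etch darker than the grain
            _, n_tracks = ndimage.label(pits)        # connected-component labeling
            area_cm2 = image.size * (um_per_px * 1e-4) ** 2
            return n_tracks / area_cm2

        # Synthetic speckle image standing in for a digitized SEM micrograph.
        img = np.random.default_rng(0).uniform(size=(512, 512))
        print(f"{track_density(img, threshold=0.01, um_per_px=0.1):.3g} tracks/cm^2")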

  20. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area of the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile, which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal-hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
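
    For background, the conventional wall-drag calculation that an equivalent hydraulic diameter feeds looks like the sketch below (standard Darcy-Weisbach and Blasius relations; this is generic textbook material, not the COBRA-TF implementation, and the profile-derived diameter of the paper would simply replace the geometric 4A/P):

        import math

        def wall_drag_per_volume(rho, mu, u, area, wetted_perimeter):
            """Wall drag per unit volume (N/m^3) opposing the flow in one cell."""
            d_h = 4.0 * area / wetted_perimeter      # geometric hydraulic diameter
            re = rho * abs(u) * d_h / mu
            # Laminar friction factor below Re ~ 2300, Blasius correlation above.
            f = 64.0 / re if re < 2300.0 else 0.316 * re ** -0.25
            return f * rho * u * abs(u) / (2.0 * d_h)

        # Water at 2 m/s in a cell with a 1 cm hydraulic diameter.
        print(f"{wall_drag_per_volume(1000.0, 1e-3, 2.0, 1e-4, 0.04):.0f} N/m^3")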

  1. Analysis of electronic circuits using digital computers

    International Nuclear Information System (INIS)

    Tapu, C.

    1968-01-01

    Various programmes have been proposed for studying electronic circuits with the help of computers. It is shown here how it is possible to use the programme ECAP, developed by I.B.M., for studying the behaviour of an operational amplifier from different points of view: direct-current, alternating-current and transient-state analysis, optimisation of the open-loop gain, and study of the reliability. (author) [fr

  2. Computational content analysis of European Central Bank statements

    NARCIS (Netherlands)

    Milea, D.V.; Almeida, R.J.; Sharef, N.M.; Kaymak, U.; Frasincar, F.

    2012-01-01

    In this paper we present a framework for the computational content analysis of European Central Bank (ECB) statements. Based on this framework, we provide two approaches that can be used in a practical context. Both approaches use the content of ECB statements to predict upward and downward movement

  3. The Y2K program for scientific-analysis computer programs at AECL

    International Nuclear Information System (INIS)

    Popovic, J.; Gaver, C.; Chapman, D.

    1999-01-01

    The evaluation of scientific-analysis computer programs for year-2000 compliance is part of AECL's year-2000 (Y2K) initiative, which addresses both the infrastructure systems at AECL and AECL's products and services. This paper describes the Y2K-compliance program for scientific-analysis computer codes. This program involves the integrated evaluation of the computer hardware, middleware, and third-party software in addition to the scientific codes developed in-house. The project involves several steps: the assessment of the scientific computer programs for Y2K compliance, performing any required corrective actions, porting the programs to Y2K-compliant platforms, and verification of the programs after porting. Some programs or program versions, deemed no longer required in the year 2000 and beyond, will be retired and archived. (author)

  4. The Y2K program for scientific-analysis computer programs at AECL

    International Nuclear Information System (INIS)

    Popovic, J.; Gaver, C.; Chapman, D.

    1999-01-01

    The evaluation of scientific analysis computer programs for year-2000 compliance is part of AECL's year-2000 (Y2K) initiative, which addresses both the infrastructure systems at AECL and AECL's products and services. This paper describes the Y2K-compliance program for scientific-analysis computer codes. This program involves the integrated evaluation of the computer hardware, middleware, and third-party software in addition to the scientific codes developed in-house. The project involves several steps: the assessment of the scientific computer programs for Y2K compliance, performing any required corrective actions, porting the programs to Y2K-compliant platforms, and verification of the programs after porting. Some programs or program versions, deemed no longer required in the year 2000 and beyond, will be retired and archived. (author)

  5. High fidelity thermal-hydraulic analysis using CFD and massively parallel computers

    International Nuclear Information System (INIS)

    Weber, D.P.; Wei, T.Y.C.; Brewster, R.A.; Rock, Daniel T.; Rizwan-uddin

    2000-01-01

    Thermal-hydraulic analyses play an important role in design and reload analysis of nuclear power plants. These analyses have historically relied on early generation computational fluid dynamics capabilities, originally developed in the 1960s and 1970s. Over the last twenty years, however, dramatic improvements in both computational fluid dynamics codes in the commercial sector and in computing power have taken place. These developments offer the possibility of performing large scale, high fidelity, core thermal hydraulics analysis. Such analyses will allow a determination of the conservatism employed in traditional design approaches and possibly justify the operation of nuclear power systems at higher powers without compromising safety margins. The objective of this work is to demonstrate such a large scale analysis approach using a state of the art CFD code, STAR-CD, and the computing power of massively parallel computers, provided by IBM. A high fidelity representation of a current generation PWR was analyzed with the STAR-CD CFD code and the results were compared to traditional analyses based on the VIPRE code. Current design methodology typically involves a simplified representation of the assemblies, where a single average pin is used in each assembly to determine the hot assembly from a whole core analysis. After determining this assembly, increased refinement is used in the hot assembly, and possibly some of its neighbors, to refine the analysis for purposes of calculating DNBR. This latter calculation is performed with sub-channel codes such as VIPRE. The modeling simplifications that are used involve the approximate treatment of surrounding assemblies and coarse representation of the hot assembly, where the subchannel is the lowest level of discretization. In the high fidelity analysis performed in this study, both restrictions have been removed. Within the hot assembly, several hundred thousand to several million computational zones have been used, to

  6. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works and how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author's website and SpringerLink.

  7. Analysis of the Health Information and Communication System and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2015-05-01

    Full Text Available This paper describes a SWOT analysis and shows its use in analysing the strengths, weaknesses, opportunities and threats (risks) within the health care system. The aim is furthermore to show the strengths, weaknesses, opportunities and threats of using cloud computing in the health care system. Cloud computing in medicine is an integral part of telemedicine. Based on the information presented in this paper, employees may identify the advantages and disadvantages of using cloud computing. When introducing new information technologies into the health care business, the implementers will encounter numerous problems, such as the complexity of the existing and the new information systems, the costs of maintaining and updating the software, the cost of implementing new modules, ways of protecting the existing data in the database, and the data that will be collected for diagnosis. Using the SWOT analysis, this paper evaluates the feasibility and possibility of adopting cloud computing in the health sector to improve health services, based on examples from abroad. The intent of cloud computing in medicine is for the patient's data to be sent to the doctor rather than the patient delivering it himself/herself.

  8. Investigation on structural analysis computer program of spent nuclear fuel shipping cask

    International Nuclear Information System (INIS)

    Yagawa, Ganki; Ikushima, Takeshi.

    1987-10-01

    This report describes the work done by the Sub-Committee of the Research Cooperation Committee (RC-62) of the Japan Society of Mechanical Engineers, entrusted by the Japan Atomic Energy Research Institute. The principal accomplishments are summarized as follows: (1) In a survey of structural analysis methods for spent fuel shipping casks, several documents explaining the features and applications of dedicated impact-analysis computer programs based on two- or three-dimensional finite element or finite difference methods, such as HONDO, STEALTH and DYNA-3D, were reviewed. (2) For comparative evaluation of the existing computer programs, common benchmark test problems - the 9 m vertical drop impact of an axisymmetric lead cylinder with and without stainless steel cladding - were adopted, and calculational evaluations taking the strain-rate effect into account were carried out. (3) The impact-analysis algorithms of the computer programs were evaluated, and the requirements for computer programs to be developed in the future, together with an index for further studies, were clarified. (author)

  9. Wheeze sound analysis using computer-based techniques: a systematic review.

    Science.gov (United States)

    Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian

    2017-10-31

    Wheezes are high-pitched, continuous respiratory sounds produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been used extensively for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction, and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that 1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, 2) further research is required to achieve acceptable identification rates for the degree of airway obstruction during normal breathing, and 3) analysis using combinations of features and subgroups of the respiratory cycle has provided a pathway to classify the various diseases or pathologies that stem from airway obstruction.
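
    As a simplified illustration of the spectral techniques surveyed: wheezes appear as sustained narrowband ridges in a spectrogram, so a crude screen can flag frames whose band spectrum is dominated by a single peak. The band limits and peak-to-mean ratio below are rough illustrative choices, not thresholds from the reviewed studies, and a real detector would also require the peak to persist across frames:

        import numpy as np
        from scipy.signal import spectrogram

        def wheeze_frames(audio, fs, band=(100.0, 1000.0), peak_ratio=5.0):
            """Flag time frames whose band spectrum is dominated by one peak."""
            f, t, sxx = spectrogram(audio, fs=fs, nperseg=1024, noverlap=512)
            sel = (f >= band[0]) & (f <= band[1])
            power = sxx[sel]
            return t, power.max(axis=0) > peak_ratio * power.mean(axis=0)

        # Demo: a 400 Hz tone buried in noise should trigger the screen.
        fs = 8000
        tt = np.arange(2 * fs) / fs
        sig = 0.5 * np.sin(2 * np.pi * 400 * tt) + 0.1 * np.random.randn(tt.size)
        _, flags = wheeze_frames(sig, fs)
        print(f"{flags.mean():.0%} of frames flagged as wheeze-like")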

  10. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  11. Computational issues in the analysis of nonlinear two-phase flow dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Rosa, Mauricio A. Pinheiro [Centro Tecnico Aeroespacial (CTA-IEAv), Sao Jose dos Campos, SP (Brazil). Inst. de Estudos Avancados. Div. de Energia Nuclear], e-mail: pinheiro@ieav.cta.br; Podowski, Michael Z. [Rensselaer Polytechnic Institute, New York, NY (United States)

    2001-07-01

    This paper is concerned with the issue of computer simulations of flow-induced instabilities in boiling channels and systems. A computational model is presented for the time-domain analysis of nonlinear oscillations in interconnected parallel boiling channels. The results of model testing and validation are shown. One of the main concerns here has been to show the importance of numerical testing in the selection of a proper numerical integration method and the associated nodalization and time step, and to demonstrate the convergence of the numerical solution prior to any analysis. (author)

  12. Quality Assurance of Computer Programs for Photopeak Integration in Activation Analysis

    DEFF Research Database (Denmark)

    Heydorn, Kaj

    1978-01-01

    The purpose of a computer program for quantitative activation analysis is basically to produce information on the ratio of radioactive decays of a specific radionuclide observed by a detector from two alternative sources. It is assumed that at least one of the sources is known to contain the radionuclide in question, and qualitative analysis is therefore needed only to the extent that the decay characteristics of this radionuclide could be confused with those of other possible radionuclides, thus interfering with its determination. The quality of these computer programs can only be assured...

  13. Computational image analysis of Suspension Plasma Sprayed YSZ coatings

    Directory of Open Access Journals (Sweden)

    Michalak Monika

    2017-01-01

    Full Text Available The paper presents computational studies of microstructure- and topography-related features of suspension plasma sprayed (SPS) coatings of yttria-stabilized zirconia (YSZ). The study mainly covers porosity assessment, provided by ImageJ software analysis. The influence of boundary conditions, defined by (i) circularity and (ii) size limits, on the computed values of porosity is also investigated. Additionally, a digital topography evaluation is performed: a confocal laser scanning microscope (CLSM) and a scanning electron microscope (SEM) operating in Shape-from-Shading (SFS) mode measure the surface roughness of the deposited coatings. The computed values of porosity and roughness are related to the variables of the spraying process, which influence the morphology of the coatings and determine the possible fields of their applications.
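
    A sketch of the porosity measurement with the two boundary conditions named above, written with scikit-image instead of ImageJ; the threshold, size limit and circularity limit are placeholders to be tuned per image set:

        import numpy as np
        from skimage import measure

        def porosity_percent(image, threshold, min_area=10, min_circularity=0.3):
            """Percent porosity of a cross-section; pores darker than the matrix."""
            labels = measure.label(image < threshold)
            pore_area = 0
            for region in measure.regionprops(labels):
                if region.perimeter == 0:            # skip single-pixel artifacts
                    continue
                # Circularity 4*pi*A/P^2 is 1 for a disk; reject elongated cracks.
                circ = 4.0 * np.pi * region.area / region.perimeter ** 2
                if region.area >= min_area and circ >= min_circularity:
                    pore_area += region.area
            return 100.0 * pore_area / image.size

        demo = np.random.default_rng(1).uniform(size=(256, 256))
        print(f"porosity: {porosity_percent(demo, threshold=0.05):.2f} %")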

  14. Analysis of the computed tomography in the acute abdomen

    International Nuclear Information System (INIS)

    Hochhegger, Bruno; Moraes, Everton; Haygert, Carlos Jesus Pereira; Antunes, Paulo Sergio Pase; Gazzoni, Fernando; Lopes, Luis Felipe Dias

    2007-01-01

    Introduction: This study aims to assess the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal, prospective study in which patients with a diagnosis of acute abdomen were analyzed. There were 105 cases of acute abdomen, and after application of the exclusion criteria, 28 patients were included in the study. Results: Computed tomography changed the physicians' diagnostic hypothesis in 50% of the cases (p < 0.05); 78.57% of the patients had surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). An accurate diagnosis by computed tomography, when compared to the anatomopathological examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was done dividing the patients into surgical and nonsurgical groups, an accuracy of 89.28% was obtained (p < 0.0001). A difference of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen managed without computed tomography. Conclusion: Computed tomography correlates with the anatomopathological findings and has great accuracy in the surgical indication; it is associated with increased physician confidence, reduces hospitalization time, reduces the number of surgeries and is cost-effective. (author)

  15. The application of computer image analysis in life sciences and environmental engineering

    Science.gov (United States)

    Mazur, R.; Lewicki, A.; Przybył, K.; Zaborowicz, M.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.

    2014-04-01

    The main aim of the article is to present research on the application of computer image analysis in life sciences and environmental engineering. The authors used different methods of computer image analysis in the development of an innovative biotest for modern biomonitoring of water quality. The tools created were based on living organisms, namely the bioindicators Lemna minor L. and Hydra vulgaris Pallas, together with computer image analysis methods for assessing negative reactions during exposure of the organisms to selected water toxicants. All of these methods belong to acute toxicity tests and are particularly essential in the ecotoxicological assessment of water pollutants. The developed bioassays can be used not only in scientific research but are also applicable in environmental engineering and agriculture in the study of the adverse effects on water quality of various compounds used in agriculture and industry.

  16. Visual Cluster Analysis for Computing Tasks at Workflow Management System of the ATLAS Experiment

    CERN Document Server

    Grigoryeva, Maria; The ATLAS collaboration

    2018-01-01

    Hundreds of petabytes of experimental data in high energy and nuclear physics (HENP) have already been obtained by unique scientific facilities such as LHC, RHIC and KEK. As the accelerators are modernized (energy and luminosity increased), data volumes grow rapidly and have reached the exabyte scale; this also increases the number of analysis and data processing tasks, which compete continuously for computational resources. The growth in processing tasks has been met by raising the capacity of the computing environment through the involvement of high-performance computing resources, forming a heterogeneous distributed computing environment (hundreds of distributed computing centers). In addition, errors occur while executing tasks for data analysis and processing, caused by software and hardware failures. With a distributed model of data processing and analysis, the optimization of data management and workload systems becomes a fundamental task, and the ...

  17. Development of Computer Program for Analysis of Irregular Non Homogenous Radiation Shielding

    International Nuclear Information System (INIS)

    Bang Rozali; Nina Kusumah; Hendro Tjahjono; Darlis

    2003-01-01

    A computer program for radiation shielding analysis has been developed to perform radiation attenuation calculations for non-homogeneous radiation shielding and irregular geometry. By specifying the radiation source strength, the geometrical shape of the radiation source, and the location, dimensions and geometrical shape of the radiation shielding, the radiation level at a point at a given position from the radiation source can be calculated. Using the computer program, radiation distribution analysis results can be obtained for several analysis points simultaneously. (author)
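
    The core of such an attenuation calculation, for a point source viewed through one or more shield layers along the line of sight, is the point-kernel formula: uncollided flux = S * exp(-sum(mu_i * t_i)) / (4 * pi * r^2). A minimal sketch (buildup factors neglected; the attenuation coefficients below are placeholders standing in for tabulated values):

        import math

        def point_source_flux(strength, distance_cm, layers):
            """Uncollided flux (1/cm^2/s) from an isotropic point source.

            layers: (mu, thickness) pairs in 1/cm and cm along the line of sight.
            Scattered (buildup) radiation is neglected in this sketch.
            """
            attenuation = math.exp(-sum(mu * t for mu, t in layers))
            return strength * attenuation / (4.0 * math.pi * distance_cm ** 2)

        # 1e9 gamma/s source at 200 cm behind 5 cm of lead and 20 cm of concrete
        # (mu values below are illustrative, not tabulated data).
        print(f"{point_source_flux(1e9, 200.0, [(0.5, 5.0), (0.15, 20.0)]):.3e}")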

  18. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  19. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  20. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined, respectively, to indicate the variety and the optimality of the solution sets generated in the search process of the model. Moreover, four types of probabilistic convergence for the solution set updating sequences are given, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
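
    The abstract does not spell out the formal definitions of the two indices, but one plausible reading can be sketched: the variation rate as the fraction of population slots that changed in an iteration, and the progress rate as the improvement of the best objective value. The toy mutation loop below stands in for the GA/ACO/PSO update rules and is illustrative only:

        import random

        def variation_rate(prev_pop, pop):
            """Fraction of population slots whose solution changed this step."""
            return sum(a != b for a, b in zip(prev_pop, pop)) / len(pop)

        def progress_rate(prev_best, best):
            """Decrease of the best objective value this step (minimization)."""
            return prev_best - best

        f = lambda x: (x - 3.0) ** 2                     # toy objective
        pop = [random.uniform(-10.0, 10.0) for _ in range(20)]
        best = min(map(f, pop))
        for it in range(1, 51):
            prev_pop, prev_best = pop[:], best
            pop = [x + random.gauss(0.0, 0.5) if random.random() < 0.5 else x
                   for x in pop]
            best = min(best, min(map(f, pop)))
            if it % 10 == 0:
                print(it, variation_rate(prev_pop, pop),
                      progress_rate(prev_best, best))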

  1. Thermohydraulic analysis of nuclear power plant accidents by computer codes

    International Nuclear Information System (INIS)

    Petelin, S.; Stritar, A.; Istenic, R.; Gregoric, M.; Jerele, A.; Mavko, B.

    1982-01-01

    The RELAP4/MOD6, BRUCH-D-06, CONTEMPT-LT-28, RELAP5/MOD1 and COBRA-4-1 codes were successfully implemented on the CYBER 172 computer in Ljubljana. Input models of NPP Krsko were prepared for the first three codes. Because of the high computing cost, only one analysis of a double-ended guillotine break of the cold leg of NPP Krsko has been done with the RELAP4 code. The BRUCH code is easier and cheaper to use, and several analyses have been done with it. A sensitivity study was performed with CONTEMPT-LT-28 for a double-ended pump suction break. These codes are intended to be used as a basis for independent safety analyses. (author)

  2. Application of computer aided tolerance analysis in product design

    International Nuclear Information System (INIS)

    Du Hua

    2009-01-01

    This paper introduces the shortcomings of the traditional tolerance design method and the strengths of the computer-aided tolerancing (CAT) method, compares the strengths and shortcomings of the three tolerance analysis methods - Worst Case Analysis, Statistical Analysis and Monte Carlo Simulation Analysis - and presents the basic procedure and relevant details for CAT. As the objects of study, the reactor pressure vessel, the core barrel, the hold-down barrel and the support plate are used to build the tolerance simulation model, based on their 3D design models. The tolerance simulation analysis is then conducted, and the scheme of the tolerance distribution is optimized based on the analysis results. (authors)
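
    The three analysis methods named above can be contrasted on a simple linear stack of toleranced dimensions; the dimensions, tolerances and the tolerance-equals-3-sigma assumption below are illustrative, not taken from the reactor components studied:

        import math
        import random

        # Gap governed by a linear stack: gap = d1 + d2 - d3, each +/- tol.
        dims = [(10.00, 0.05), (25.00, 0.10), (34.90, 0.08)]
        signs = [+1, +1, -1]

        nominal = sum(s * d for s, (d, _) in zip(signs, dims))
        worst = sum(t for _, t in dims)                # Worst Case: tolerances add
        rss = math.sqrt(sum(t * t for _, t in dims))   # Statistical: root-sum-square

        # Monte Carlo: sample each dimension as normal with tol treated as 3 sigma.
        gaps = [sum(s * random.gauss(d, t / 3.0) for s, (d, t) in zip(signs, dims))
                for _ in range(100_000)]
        mean = sum(gaps) / len(gaps)
        sd = math.sqrt(sum((g - mean) ** 2 for g in gaps) / (len(gaps) - 1))

        print(f"nominal {nominal:.3f}, worst +/-{worst:.3f}, "
              f"RSS +/-{rss:.3f}, Monte Carlo +/-{3.0 * sd:.3f}")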

  3. Efficacy of computer technology-based HIV prevention interventions: a meta-analysis.

    Science.gov (United States)

    Noar, Seth M; Black, Hulda G; Pierce, Larson B

    2009-01-02

    To conduct a meta-analysis of computer technology-based HIV prevention behavioral interventions aimed at increasing condom use among a variety of at-risk populations. Systematic review and meta-analysis of existing published and unpublished studies testing computer-based interventions. Meta-analytic techniques were used to compute and aggregate effect sizes for 12 randomized controlled trials that met inclusion criteria. Variables that had the potential to moderate intervention efficacy were also tested. The overall mean weighted effect size for condom use was d = 0.259 (95% confidence interval = 0.201, 0.317; Z = 8.74, P < 0.001); significant effects were also observed for outcomes including numbers of sexual partners and incident sexually transmitted diseases. In addition, interventions were significantly more efficacious when they were directed at men or women (versus mixed-sex groups), utilized individualized tailoring, used a Stages of Change model, and had more intervention sessions. Computer technology-based HIV prevention interventions have efficacy similar to more traditional human-delivered interventions. Given their low delivery cost, ability to customize intervention content, and flexible dissemination channels, they hold much promise for the future of HIV prevention.
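
    The aggregation step behind numbers like d = 0.259 with its confidence interval and Z value is the standard inverse-variance (fixed-effect) scheme; the per-study effect sizes and variances below are hypothetical, not the 12 trials analyzed:

        import math

        # (effect size d, variance of d) per study; illustrative values only.
        studies = [(0.31, 0.012), (0.18, 0.020), (0.27, 0.008), (0.22, 0.015)]

        weights = [1.0 / v for _, v in studies]        # inverse-variance weights
        d_bar = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))             # standard error of the mean
        z = d_bar / se
        lo, hi = d_bar - 1.96 * se, d_bar + 1.96 * se
        print(f"weighted d = {d_bar:.3f}, 95% CI ({lo:.3f}, {hi:.3f}), Z = {z:.2f}")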

  4. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.

  5. The President's Report, 1983-84.

    Science.gov (United States)

    Bok, Derek

    The 1983-84 annual report of the President of Harvard University to members of the Board of Overseers addresses the advantages and disadvantages of the utilization of new technologies by a university, comments on the instructional uses of computers (including computer assisted instruction (CAI)) and video technology, and cites specific examples in…

  6. Computer-controlled system for rapid soil analysis of 226Ra

    International Nuclear Information System (INIS)

    Doane, R.W.; Berven, B.A.; Blair, M.S.

    1984-01-01

    A computer-controlled multichannel analysis system has been developed by the Radiological Survey Activities Group at Oak Ridge National Laboratory (ORNL) for the Department of Energy (DOE) in support of the DOE's remedial action programs. The purpose of this system is to provide a rapid estimate of the 226Ra concentration in soil samples using a 6 x 9-in. NaI(Tl) crystal containing a 3.25-in. deep by 3.5-in. diameter well. This gamma detection system is controlled by a minicomputer with dual floppy disk storage. A two-chip interface, also designed at ORNL, handles all control signals generated from the computer keyboard. These computer-generated control signals are processed in machine language for rapid data transfer, and BASIC is used for data processing.
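
    In outline, the rapid estimate reduces to net counts in a region of interest around a radium-daughter photopeak, scaled by counting time, detection efficiency, gamma yield and sample mass. The sketch below uses placeholder calibration numbers, not the ORNL system's values:

        def ra226_bq_per_kg(spectrum, roi, live_time_s, efficiency,
                            gamma_yield, mass_kg, background_cps=0.0):
            """Estimate 226Ra activity concentration from an ROI in a spectrum."""
            gross = sum(spectrum[roi[0]:roi[1]])        # counts in the ROI channels
            net = gross - background_cps * live_time_s  # subtract ambient background
            return net / (live_time_s * efficiency * gamma_yield * mass_kg)

        # 1024-channel spectrum with a synthetic photopeak; values illustrative.
        spectrum = [10] * 1024
        for ch in range(480, 520):
            spectrum[ch] += 200
        result = ra226_bq_per_kg(spectrum, (480, 520), 600.0, 0.25, 0.46, 0.5,
                                 background_cps=1.0)
        print(f"{result:.0f} Bq/kg")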

  7. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    International Nuclear Information System (INIS)

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project

  8. Computation system for nuclear reactor core analysis. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system containing computer codes, developed as modules, to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code, treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  9. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  10. Fluid-Induced Vibration Analysis for Reactor Internals Using Computational FSI Method

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Jong Sung; Yi, Kun Woo; Sung, Ki Kwang; Im, In Young; Choi, Taek Sang [KEPCO E and C, Daejeon (Korea, Republic of)

    2013-10-15

    This paper introduces a fluid-induced vibration (FIV) analysis method which calculates the response of the reactor vessel internals (RVI) to both deterministic and random loads at once and utilizes a more realistic pressure distribution obtained with a computational fluid-structure interaction (FSI) method. Reactor coolant flow causes the RVI to vibrate and may affect their structural integrity; U.S. NRC Regulatory Guide 1.20 requires the Comprehensive Vibration Assessment Program (CVAP) to verify the structural integrity of the RVI against FIV. The hydraulic forces on the RVI of OPR1000 and APR1400 were computed from hydraulic formulas and the CVAP measurements in Palo Verde Unit 1 and Yonggwang Unit 4 for the structural vibration analyses. In this method, the hydraulic forces were divided into deterministic and random turbulence loads and were used as the excitation forces of separate structural analyses; these forces were applied to the finite element model, and the responses to them were combined into the resultant stresses. The FIV analysis for the RVI was carried out using the computational FSI method, which calculates the response to deterministic and random turbulence loads at once and is a simple, integrative way to obtain the structural dynamic responses of reactor internals to various flow-induced loads. Because the analysis in this paper omitted the bypass flow region and the Inner Barrel Assembly (IBA) due to the limitation of computer resources, it is necessary to find an effective way to consider all regions in the reactor vessel for FIV analysis in the future.

  11. Computer Aided Design and Analysis of Separation Processes with Electrolyte Systems

    DEFF Research Database (Denmark)

    A methodology for computer aided design and analysis of separation processes involving electrolyte systems is presented. The methodology consists of three main parts. The thermodynamic part "creates" the problem specific property model package, which is a collection of pure component and mixture property models. The design and analysis part generates process (flowsheet) alternatives, evaluates/analyses feasibility of separation and provides a visual operation path for the desired separation. The simulation part consists of a simulation/calculation engine that allows the screening and validation of process alternatives. For the simulation part, a general multi-purpose, multi-phase separation model has been developed and integrated to an existing computer aided system. Application of the design and analysis methodology is highlighted through two illustrative case studies.

  12. Computer Aided Design and Analysis of Separation Processes with Electrolyte Systems

    DEFF Research Database (Denmark)

    Takano, Kiyoteru; Gani, Rafiqul; Kolar, P.

    2000-01-01

    A methodology for computer aided design and analysis of separation processes involving electrolyte systems is presented. The methodology consists of three main parts. The thermodynamic part 'creates' the problem specific property model package, which is a collection of pure component and mixture property models. The design and analysis part generates process (flowsheet) alternatives, evaluates/analyses feasibility of separation and provides a visual operation path for the desired separation. The simulation part consists of a simulation/calculation engine that allows the screening and validation of process alternatives. For the simulation part, a general multi-purpose, multi-phase separation model has been developed and integrated to an existing computer aided system. Application of the design and analysis methodology is highlighted through two illustrative case studies.

  13. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    Full Text Available A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among the clusters, connected together in a multi-level hierarchy, and then coordinated over the Internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to realize the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing simulations in multi-scale structural analysis.

  14. CASKETSS: a computer code system for thermal and structural analysis of nuclear fuel shipping casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1989-02-01

    A computer program, CASKETSS, has been developed for the thermal and structural analysis of nuclear fuel shipping casks. CASKETSS stands for a modular code system for CASK Evaluation (Thermal and Structural Safety). The main features of CASKETSS are as follows: (1) Thermal and structural analysis computer programs for one-, two- and three-dimensional geometries are contained in the code system. (2) Some of the computer programs in the code system have been programmed to provide near-optimal speed on vector processing computers. (3) Data libraries for thermal and structural analysis are provided in the code system. (4) An input data generator is provided in the code system. (5) A graphics computer program is provided in the code system. In the paper, a brief illustration of the calculation methods, input data and sample calculations is presented. (author)

  15. Data processing of X-ray fluorescence analysis using an electronic computer

    International Nuclear Information System (INIS)

    Yakubovich, A.L.; Przhiyalovskij, S.M.; Tsameryan, G.N.; Golubnichij, G.V.; Nikitin, S.A.

    1979-01-01

    Problems in the data processing of multi-element (17-element) X-ray fluorescence analysis of tungsten and molybdenum ores are considered. The analysis was carried out using a silicon-lithium spectrometer with an energy resolution of about 300 eV and a 1024-channel analyzer. The characteristic radiation of the elements was excited with two 109Cd radioisotope sources with a total activity of 10 mCi. The measurement period was 400 s. The data obtained were processed with a computer using the ''Proba-1'' and ''Proba-2'' programs. Data processing algorithms and computer calculation results are presented.

  16. STARS: An Integrated, Multidisciplinary, Finite-Element, Structural, Fluids, Aeroelastic, and Aeroservoelastic Analysis Computer Program

    Science.gov (United States)

    Gupta, K. K.

    1997-01-01

    A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.

  17. Gas analysis by computer-controlled microwave rotational spectrometry

    International Nuclear Information System (INIS)

    Hrubesh, L.W.

    1978-01-01

    Microwave rotational spectrometry has inherently high resolution and is thus nearly ideal for qualitative gas mixture analysis. Quantitative gas analysis is also possible by a simplified method which utilizes the ease with which molecular rotational transitions can be saturated at low microwave power densities. This article describes a computer-controlled microwave spectrometer which is used to demonstrate for the first time a totally automated analysis of a complex gas mixture. Examples are shown for a complete qualitative and quantitative analysis, in which a search of over 100 different compounds is made in less than 7 min, with sensitivity for most compounds in the 10 to 100 ppm range. This technique is expected to find increased use in view of the reduced complexity and increased reliability of microwave spectrometers and because of new energy-related applications for the analysis of mixtures of small molecules.

  18. HAMOC: a computer program for fluid hammer analysis

    International Nuclear Information System (INIS)

    Johnson, H.G.

    1975-12-01

    A computer program has been developed for fluid hammer analysis of piping systems attached to a vessel which has undergone a known rapid pressure transient. The program is based on the characteristics method for solution of the partial differential equations of motion and continuity. Column separation logic is included for situations in which pressures fall to saturation values

  19. Multicenter study of quantitative computed tomography analysis using a computer-aided three-dimensional system in patients with idiopathic pulmonary fibrosis.

    Science.gov (United States)

    Iwasawa, Tae; Kanauchi, Tetsu; Hoshi, Toshiko; Ogura, Takashi; Baba, Tomohisa; Gotoh, Toshiyuki; Oba, Mari S

    2016-01-01

    To evaluate the feasibility of automated quantitative analysis with a three-dimensional (3D) computer-aided system (i.e., Gaussian histogram normalized correlation, GHNC) of computed tomography (CT) images from different scanners. Each institution's review board approved the research protocol. Informed patient consent was not required. The participants in this multicenter prospective study were 80 patients (65 men, 15 women) with idiopathic pulmonary fibrosis. Their mean age was 70.6 years. Computed tomography (CT) images were obtained by four different scanners set at different exposures. We measured the extent of fibrosis using GHNC, and used Pearson's correlation analysis, Bland-Altman plots, and kappa analysis to directly compare the GHNC results with manual scoring by radiologists. Multiple linear regression analysis was performed to determine the association between the CT data and forced vital capacity (FVC). For each scanner, the extent of fibrosis as determined by GHNC was significantly correlated with the radiologists' score. In multivariate analysis, the extent of fibrosis as determined by GHNC was significantly correlated with FVC (p < 0.001). There was no significant difference between the results obtained using different CT scanners. Gaussian histogram normalized correlation was feasible, irrespective of the type of CT scanner used.
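
    The agreement statistics named in this record (Pearson correlation and Bland-Altman analysis) can be sketched briefly; the scores below are invented placeholders, not study data:

```python
import numpy as np
from scipy import stats

# Hypothetical fibrosis-extent scores; not data from the study above.
auto = np.array([12.0, 25.5, 8.2, 40.1, 33.3])    # automated (GHNC-style)
manual = np.array([11.0, 27.0, 9.5, 38.0, 35.0])  # radiologists' scores

r, p = stats.pearsonr(auto, manual)   # correlation between the two methods

# Bland-Altman quantities: bias and 95% limits of agreement
diff = auto - manual
bias = diff.mean()
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)
print(f"r={r:.3f}, p={p:.3f}, bias={bias:.2f}, LoA={loa}")
```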

  20. Internet marketing directed at children on food and restaurant websites in two policy environments.

    Science.gov (United States)

    Kent, M Potvin; Dubois, L; Kent, E A; Wanless, A J

    2013-04-01

    Food and beverage marketing has been associated with childhood obesity, yet little research has examined the influence of advertising policy on children's exposure to food/beverage marketing on the Internet. The purpose of this study was to assess the influence of Quebec's Consumer Protection Act and the self-regulatory Canadian Children's Food and Beverage Advertising Initiative (CAI) on food manufacturer and restaurant websites in Canada. A content analysis of 147 French and English language food and restaurant websites was undertaken. The presence of child-directed content was assessed, and an analysis of marketing features, games and activities, child protection features, and the promotion of healthy lifestyle messages was then examined on those sites with child-directed content. The number of French language websites with child-directed content (n = 22) was not statistically lower than the number of English language websites (n = 27). There were no statistically significant differences in the number of the various marketing features, or in the average number of marketing features, between the English and French websites. The number of CAI websites with child-directed content (n = 14) was not lower than the number of non-CAI websites (n = 13). The CAI sites had more healthy lifestyle messages and child protection features compared to the non-CAI sites. Systematic surveillance of the Consumer Protection Act in Quebec is recommended. In the rest of Canada, the CAI needs to be significantly expanded or replaced by regulatory measures to adequately protect children from the marketing of foods/beverages high in fat, sugar, and sodium on the Internet. Copyright © 2012 The Obesity Society.

  1. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    Science.gov (United States)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  2. [Computers in biomedical research: I. Analysis of bioelectrical signals].

    Science.gov (United States)

    Vivaldi, E A; Maldonado, P

    2001-08-01

    A personal computer equipped with an analog-to-digital conversion card is able to input, store and display signals of biomedical interest. These signals can additionally be submitted to ad-hoc software for analysis and diagnosis. Data acquisition is based on the sampling of a signal at a given rate and amplitude resolution. The automation of signal processing involves syntactic aspects (data transduction, conditioning and reduction) and semantic aspects (feature extraction to describe and characterize the signal, and diagnostic classification). The analytical approach that is at the basis of computer programming allows for the successful resolution of apparently complex tasks. Two basic principles involved are the definition of simple fundamental functions that are then iterated and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.
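
    Since this record describes acquisition as sampling a signal at a given rate and amplitude resolution, a minimal sketch of that step may help; the rate, bit depth, and voltage range below are assumptions, not values from the article:

```python
import numpy as np

fs = 250.0                    # sampling rate in Hz (assumed)
bits = 12                     # ADC amplitude resolution (assumed)
v_lo, v_hi = -5.0, 5.0        # input voltage range in volts (assumed)

t = np.arange(0, 2.0, 1.0 / fs)              # 2 s of sample instants
analog = 1.5 * np.sin(2 * np.pi * 10 * t)    # stand-in for a biosignal

# Quantization: map each voltage onto one of 2**bits discrete levels
step = (v_hi - v_lo) / 2**bits
digital = np.round((analog - v_lo) / step).astype(int)

# A simple "syntactic" data-reduction step: decimate by a factor of 4
reduced = digital[::4]
```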

  3. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    Science.gov (United States)

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  4. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  5. Improvement of the computing speed of the FBR fuel pin bundle deformation analysis code 'BAMBOO'

    International Nuclear Information System (INIS)

    Ito, Masahiro; Uwaba, Tomoyuki

    2005-04-01

    JNC has developed a coupled analysis system comprising a fuel pin bundle deformation analysis code, 'BAMBOO', and a thermal hydraulics analysis code, 'ASFRE-IV', for the purpose of evaluating the integrity of a subassembly under the BDI condition. This coupled analysis took much computation time because it needs convergent calculations to obtain numerically stationary solutions for thermal and mechanical behaviors. We improved the computation time of the BAMBOO code analysis to make the coupled analysis practicable. BAMBOO is a FEM code, and as such its matrix calculations consume a large memory area to temporarily store intermediate results in the solution of simultaneous linear equations. The code used the Hard Disk Drive (HDD) as a virtual memory area to save Random Access Memory (RAM) of the computer. However, the use of the HDD increased the computation time because Input/Output (I/O) processing with the HDD took much time in data accesses. We improved the code so that it conducts I/O processing only with the RAM in matrix calculations and runs in high-performance computers. This improvement considerably increased the CPU occupation rate during the simulation and reduced the total simulation time of the BAMBOO code to about one-seventh of that before the improvement. (author)

  6. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using an automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
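
    The stochastic perturbation technique mentioned in this record lends itself to a compact symbolic sketch. The following uses Python's SymPy rather than MAPLE, and the response u and random parameter b are generic placeholders, not objects from the paper:

```python
import sympy as sp

# Generic response u(b) of a random parameter b with mean b0, variance sigma_b2
b, b0, var_b = sp.symbols('b b0 sigma_b2')
u = sp.Function('u')

du = u(b).diff(b).subs(b, b0)      # first derivative at the mean b0
d2u = u(b).diff(b, 2).subs(b, b0)  # second derivative at the mean b0

# Second-order perturbation: E[b - b0] = 0 and E[(b - b0)^2] = Var(b), so
mean_u = u(b0) + sp.Rational(1, 2) * d2u * var_b   # E[u(b)] to 2nd order
var_u = du**2 * var_b                              # Var[u(b)] to 1st order
sp.pprint(mean_u)
sp.pprint(var_u)
```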

  7. Computer-Aided Model Based Analysis for Design and Operation of a Copolymerization Process

    DEFF Research Database (Denmark)

    Lopez-Arenas, Maria Teresa; Sales-Cruz, Alfonso Mauricio; Gani, Rafiqul

    2006-01-01

    The advances in computer science and computational algorithms for process modelling, process simulation, numerical methods and design/synthesis algorithms make it advantageous and helpful to employ computer-aided modelling systems and tools for integrated process analysis. This is illustrated in this work, where, through the computer-aided modeling system ICAS-MoT, two first-principles models have been investigated with respect to design and operational issues for solution copolymerization reactors in general, and for the methyl methacrylate/vinyl acetate system in particular. Model 1 is taken from the literature and is commonly used for the low conversion region, while Model 2 has... This will allow analysis of the process behaviour, contribute to a better understanding of the polymerization process, help to avoid unsafe conditions of operation, and support the development of operational and optimizing control strategies.

  8. GPU-based acceleration of computations in nonlinear finite element deformation analysis.

    Science.gov (United States)

    Mafi, Ramin; Sirouspour, Shahin

    2014-03-01

    The physics of deformation for biological soft-tissue is best described by nonlinear continuum mechanics-based models, which can then be discretized by the FEM for a numerical solution. However, the computational complexity of such models has limited their use in applications requiring real-time or fast response. In this work, we propose a graphics processing unit-based implementation of the FEM using implicit time integration for dynamic nonlinear deformation analysis. This is the most general formulation of the deformation analysis. It is valid for large deformations and strains and can account for material nonlinearities. The data-parallel nature and the intense arithmetic computations of nonlinear FEM equations make them particularly suitable for implementation on a parallel computing platform such as a graphics processing unit. We present and compare two different designs based on the matrix-free and conventional preconditioned conjugate gradients algorithms for solving the FEM equations arising in deformation analysis. The speedup achieved with the proposed parallel implementations of the algorithms will be instrumental in the development of advanced surgical simulators and medical image registration methods involving soft-tissue deformation. Copyright © 2013 John Wiley & Sons, Ltd.
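
    The matrix-free variant of conjugate gradients mentioned above can be sketched compactly: the solver only needs a function that applies the operator, never the assembled matrix. This is a minimal CPU illustration of the idea, not the paper's GPU implementation; the 1D Laplacian is an invented stand-in for an FEM stiffness operator:

```python
import numpy as np

def cg_matrix_free(apply_A, b, tol=1e-8, max_iter=1000):
    """Solve A x = b for SPD A, given only apply_A(x) = A @ x."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = apply_A(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Example: a 1D Laplacian applied without ever assembling the matrix
n = 100
def laplacian(v):
    out = 2.0 * v
    out[:-1] -= v[1:]
    out[1:] -= v[:-1]
    return out

x = cg_matrix_free(laplacian, np.ones(n))
```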

  9. INFN-Pisa scientific computation environment (GRID, HPC and Interactive Analysis)

    International Nuclear Information System (INIS)

    Arezzini, S; Carboni, A; Caruso, G; Ciampa, A; Coscetti, S; Mazzoni, E; Piras, S

    2014-01-01

    The INFN-Pisa Tier2 infrastructure is described, optimized not only for GRID CPU and Storage access, but also for a more interactive use of the resources in order to provide good solutions for the final data analysis step. The Data Center, equipped with about 6700 production cores, permits the use of modern analysis techniques realized via advanced statistical tools (like RooFit and RooStat) implemented in multicore systems. In particular, a POSIX file storage access integrated with standard SRM access is provided. Therefore the unified storage infrastructure is described, based on GPFS and Xrootd, used both for the SRM data repository and interactive POSIX access. Such a common infrastructure allows transparent access to the Tier2 data for the users for their interactive analysis. The organization of a specialized many-core CPU facility devoted to interactive analysis is also described, along with the login mechanism integrated with the INFN-AAI (National INFN Infrastructure) to extend the site access and use to a geographically distributed community. Such infrastructure is also used for a national computing facility serving the INFN theoretical community; it enables a synergic use of computing and storage resources. Our Center, initially developed for the HEP community, is now growing and also includes fully integrated HPC resources. In recent years a cluster facility (1000 cores, parallel use via InfiniBand connection) has been installed and managed, and we are now updating this facility so that it will provide resources for all the intermediate-level HPC computing needs of the INFN theoretical national community.

  10. 16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT)

    CERN Document Server

    Lokajicek, M; Tumova, N

    2015-01-01

    16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT). The ACAT workshop series, formerly AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), was created back in 1990. Its main purpose is to gather researchers related with computing in physics research together, from both physics and computer science sides, and bring them a chance to communicate with each other. It has established bridges between physics and computer science research, facilitating the advances in our understanding of the Universe at its smallest and largest scales. With the Large Hadron Collider and many astronomy and astrophysics experiments collecting larger and larger amounts of data, such bridges are needed now more than ever. The 16th edition of ACAT aims to bring related researchers together, once more, to explore and confront the boundaries of computing, automatic data analysis and theoretical calculation technologies. It will create a forum for exchanging ideas among the fields an...

  11. Application of computational aerodynamics methods to the design and analysis of transport aircraft

    Science.gov (United States)

    Da Costa, A. L.

    1978-01-01

    The application and validation of several computational aerodynamic methods in the design and analysis of transport aircraft are established. An assessment is made concerning more recently developed methods that solve three-dimensional transonic flow and boundary layers on wings. Capabilities of subsonic aerodynamic methods are demonstrated by several design and analysis efforts. Among the examples cited are the B747 Space Shuttle Carrier Aircraft analysis, nacelle integration for transport aircraft, and winglet optimization. The accuracy and applicability of a new three-dimensional viscous transonic method is demonstrated by comparison of computed results to experimental data.

  12. MULGRES: a computer program for stepwise multiple regression analysis

    Science.gov (United States)

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
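
    A minimal sketch of the stepwise-deletion idea behind MULGRES follows, assuming a backward-elimination scheme driven by t-statistics; the threshold and helper names are illustrative, not taken from the program:

```python
import numpy as np

def stepwise_deletion(X, y, t_min=2.0):
    """Backward elimination: repeatedly drop the least significant variable."""
    cols = list(range(X.shape[1]))
    while True:
        Xc = X[:, cols]
        beta, res, *_ = np.linalg.lstsq(Xc, y, rcond=None)
        dof = len(y) - len(cols)
        sigma2 = (res[0] / dof) if res.size else 1e-12
        cov = sigma2 * np.linalg.inv(Xc.T @ Xc)
        t = np.abs(beta) / np.sqrt(np.diag(cov))
        worst = int(np.argmin(t))
        if t[worst] >= t_min or len(cols) == 1:
            return cols, beta    # all remaining variables significant
        cols.pop(worst)          # delete the least significant variable

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))
y = 3 * X[:, 0] - 2 * X[:, 2] + 0.1 * rng.standard_normal(50)
kept, coef = stepwise_deletion(X, y)
print(kept)   # likely retains columns 0 and 2
```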

  13. Investigation on structural analysis computer program of spent nuclear fuel shipping cask, (2)

    International Nuclear Information System (INIS)

    Yagawa, Ganki; Ikushima, Takeshi.

    1987-10-01

    This report describes the results (II) obtained by the Sub-Committee of the Research Cooperation Committee (RC-62) of the Japan Society of Mechanical Engineers under the trust of the Japan Atomic Energy Research Institute. The principal fulfilments and accomplishments are summarized as follows: (1) Regarding the survey of structural analysis methods for spent fuel shipping casks, several documents, which explain the features and applications of the exclusive computer programs for impact analysis on the basis of 2- or 3-dimensional finite element or difference methods, were reviewed. (2) For comparative evaluation of the existing computer programs, common benchmark test problems for drop impact of an axisymmetric cylinder and plate were adopted, and calculational evaluations taking into account the strain rate effect of material properties, the effect of artificial viscosity and the effect of time integration step size were carried out. (3) Evaluation of the impact analysis algorithms of the computer programs was conducted, and the requirements for computer programs to be developed in future and an index for further studies have been clarified. (author)

  14. SALP-3: A computer program for fault-tree analysis. Description and how-to-use. (Sensitivity analysis by list processing)

    International Nuclear Information System (INIS)

    Contini, S.; Astolfi, M.; Muysenberg, C.L. van den; Volta, G.

    1979-01-01

    The main characteristics and usage of the computer program SALP-3 for the analysis of coherent systems are described. The program is written in PL/1 for the IBM/370-165. A syntactic analysis is made of the input (fault-tree and data), and appropriate messages are supplied should an error take place. The significant minimal cut sets (MCS) are searched for by the use of algorithms based on the direct manipulation of the tree. The MCS, of whatever order, are supplied in output in order of importance with reference to a given probability threshold. The computer program SALP-3 represents only the intermediate result of a project whose objective is the implementation of a computer program for the analysis of both coherent and non-coherent structure functions and, finally, for automatic event tree analysis. The last part of the report illustrates the developments regarding the improvements in progress.
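
    The MCS search by direct manipulation of the tree can be illustrated on a toy fault tree. The sketch below expands OR gates into alternative cut sets and AND gates into combinations, then minimizes; the gate structure and event names are invented, not from SALP-3:

```python
from itertools import product

# Toy fault tree: TOP fails if (E1 AND E2) or E3 occurs.
tree = {
    "TOP": ("OR",  ["G1", "E3"]),
    "G1":  ("AND", ["E1", "E2"]),
}

def cut_sets(node):
    if node not in tree:                     # basic event
        return [frozenset([node])]
    kind, children = tree[node]
    child_sets = [cut_sets(c) for c in children]
    if kind == "OR":                         # any child's cut set suffices
        return [cs for sets in child_sets for cs in sets]
    # AND gate: combine one cut set from each child
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimize(sets):
    return [s for s in sets if not any(t < s for t in sets)]

print(minimize(cut_sets("TOP")))   # [{'E1', 'E2'}, {'E3'}]
```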

  15. Evaluation of Cloud Computing Hidden Benefits by Using Real Options Analysis

    Directory of Open Access Journals (Sweden)

    Pavel Náplava

    2016-12-01

    Full Text Available Cloud computing technologies have brought new attributes to the IT world. One of them is the flexibility of IT resources. It enables the capacity of IT resources to be effectively downsized and upsized in real time. The requirements for changes in IT size are defined by business strategy and the actual market state. IT costs are not stable but dynamic in this case. Standard investment valuation methods (both static and dynamic) are not able to include the flexibility attribute in the evaluation of IT projects. This article describes the application of the Real Options Analysis method for the valuation of cloud computing flexibility. The method compares the costs of on-premise and cloud computing solutions by combining put and call option valuation. Cloud computing providers can use the method as an advanced tool that explains the hidden benefits of cloud computing. Inexperienced cloud computing customers can simulate market behavior and better plan necessary IT investments.
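
    The combination of put and call option valuation described here can be sketched with standard Black-Scholes formulas; the figures below (capacity values, volatility, rate) are invented inputs, not the article's case data:

```python
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes(S, K, T, r, sigma, kind="call"):
    """Standard Black-Scholes price of a European call or put."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    if kind == "call":
        return S * N(d1) - K * exp(-r * T) * N(d2)
    return K * exp(-r * T) * N(-d2) - S * N(-d1)

# Call ~ option to expand cloud capacity; put ~ option to scale it down.
expand = black_scholes(S=100_000, K=110_000, T=1.0, r=0.02, sigma=0.35)
shrink = black_scholes(S=100_000, K=90_000, T=1.0, r=0.02, sigma=0.35,
                       kind="put")
print(expand, shrink)   # value of flexibility in both directions
```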

  16. Strategic Analysis of Autodesk and the Move to Cloud Computing

    OpenAIRE

    Kewley, Kathleen

    2012-01-01

    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  17. Current topics in pure and computational complex analysis

    CERN Document Server

    Dorff, Michael; Lahiri, Indrajit

    2014-01-01

    The book contains 13 articles, some of which are survey articles and others research papers. Written by eminent mathematicians, these articles were presented at the International Workshop on Complex Analysis and Its Applications held at Walchand College of Engineering, Sangli. All the contributing authors are actively engaged in research fields related to the topic of the book. The workshop offered a comprehensive exposition of the recent developments in geometric functions theory, planar harmonic mappings, entire and meromorphic functions and their applications, both theoretical and computational. The recent developments in complex analysis and its applications play a crucial role in research in many disciplines.

  18. Analysis and computation of microstructure in finite plasticity

    CERN Document Server

    Hackl, Klaus

    2015-01-01

    This book addresses the need for a fundamental understanding of the physical origin, the mathematical behavior, and the numerical treatment of models which include microstructure. Leading scientists present their efforts involving mathematical analysis, numerical analysis, computational mechanics, material modelling and experiment. The mathematical analyses are based on methods from the calculus of variations, while in the numerical implementation global optimization algorithms play a central role. The modeling covers all length scales, from the atomic structure up to macroscopic samples. The development of the models was guided by experiments on single crystals and polycrystals, and the results were checked against experimental data.

  19. Parallel computation of aerodynamic influence coefficients for aeroelastic analysis on a transputer network

    Science.gov (United States)

    Janetzke, D. C.; Murthy, D. V.

    1991-01-01

    Aeroelastic analysis is multi-disciplinary and computationally expensive. Hence, it can greatly benefit from parallel processing. As part of an effort to develop an aeroelastic analysis capability on a distributed-memory transputer network, a parallel algorithm for the computation of aerodynamic influence coefficients is implemented on a network of 32 transputers. The aerodynamic influence coefficients are calculated using a three-dimensional unsteady aerodynamic model and a panel discretization. Efficiencies up to 85 percent are demonstrated using 32 processors. The effects of subtask ordering, problem size and network topology are presented. A comparison to results on a shared-memory computer indicates that higher speedup is achieved on the distributed-memory system.

  20. A SURVEY ON DOCUMENT CLUSTERING APPROACH FOR COMPUTER FORENSIC ANALYSIS

    OpenAIRE

    Monika Raghuvanshi*, Rahul Patel

    2016-01-01

    In a forensic analysis, large numbers of files are examined. Much of the information is in unstructured format, so it is quite a difficult task for computer forensics to perform such analysis. That is why the forensic analysis of documents within a limited period of time requires a special approach such as document clustering. This paper reviews different document clustering algorithms and methodologies, for example K-means, K-medoids, single link, complete link and average link, in accordance...
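
    One of the surveyed approaches, K-means over TF-IDF document vectors, can be sketched in a few lines; the documents below are placeholders rather than forensic data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Placeholder documents standing in for files from a forensic image
docs = [
    "invoice payment transfer account",
    "meeting schedule project deadline",
    "transfer funds offshore account",
    "project milestone review meeting",
]

# Vectorize text, then group similar documents into k clusters
X = TfidfVectorizer(stop_words="english").fit_transform(docs)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)   # cluster id per document
```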

  1. Intra-articular calcaneal fractures: Computed tomographic analysis

    International Nuclear Information System (INIS)

    Rosenberg, Z.S.; Feldman, F.; Singson, R.D.

    1987-01-01

    Computed tomography (CT) analysis of 21 intra-articular calcaneal fractures categorized according to the Essex-Lopresti classification revealed the following distribution: joint depression-type 57%, comminuted type 43%, tongue-type 0%. The posterior calcaneal facet was fractured and/or depressed in 100% of the cases, while the medial facet was involved in only 25% of the cases. CT proved superior to plain films by consistently demonstrating additional fracture components within each major category, suggesting subclassifications which have potential prognostic value. CT allowed more expeditious handling of acutely injured patients, and improved preoperative planning, postoperative follow-up, and detailed analysis of the causes of chronic residual pain. CT further identified significant soft tissue injuries, such as peroneal tendon displacement, which cannot be delineated on plain films. (orig.)

  2. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache - CEA, France)

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  3. Frequency modulation television analysis: Threshold impulse analysis. [with computer program

    Science.gov (United States)

    Hodge, W. H.

    1973-01-01

    A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.

  4. Linear stability analysis of detonations via numerical computation and dynamic mode decomposition

    KAUST Repository

    Kabanov, Dmitry I.

    2017-12-08

    We introduce a new method to investigate linear stability of gaseous detonations that is based on an accurate shock-fitting numerical integration of the linearized reactive Euler equations with a subsequent analysis of the computed solution via the dynamic mode decomposition. The method is applied to the detonation models based on both the standard one-step Arrhenius kinetics and two-step exothermic-endothermic reaction kinetics. Stability spectra for all cases are computed and analyzed. The new approach is shown to be a viable alternative to the traditional normal-mode analysis used in detonation theory.
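
    The dynamic mode decomposition step described here admits a compact sketch (exact DMD on snapshot pairs); this illustrates the generic technique, not the authors' shock-fitting code:

```python
import numpy as np

def dmd(X, Y, rank):
    """Exact DMD: given snapshot pairs X -> Y, return eigenvalues and modes."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    # Low-rank approximation of the linear operator A with Y = A X
    A_tilde = U.conj().T @ Y @ Vh.conj().T / s
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Y @ Vh.conj().T / s @ W / eigvals
    return eigvals, modes

# Usage on a toy linear system x_{k+1} = A x_k
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1], [0.0, 0.8]])
snaps = [rng.standard_normal(2)]
for _ in range(20):
    snaps.append(A @ snaps[-1])
S = np.array(snaps).T
lam, phi = dmd(S[:, :-1], S[:, 1:], rank=2)
print(np.sort(lam))   # should recover eigenvalues near 0.8 and 0.9
```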

  7. A computer-controlled system for rapid soil analysis of 226Ra

    International Nuclear Information System (INIS)

    Doane, R.W.; Berven, B.A.; Blair, M.S.

    1984-01-01

    A computer-controlled multichannel analysis system has been developed by the Radiological Survey Activities (RASA) Group at Oak Ridge National Laboratory (ORNL) for the Department of Energy (DOE) in support of the DOE's remedial action programs. The purpose of this system is to provide a rapid estimate of the 226Ra concentration in soil samples using a 6 x 9 inch NaI(Tl) crystal containing a 3.25 inch deep by 3.5 inch diameter well. This gamma detection system is controlled by a minicomputer with a dual floppy disk storage medium, line printer, and optional X-Y plotter. A two-chip interface was also designed at ORNL which handles all control signals generated from the computer keyboard. These computer-generated control signals are processed in machine language for rapid data transfer, and BASIC language is used for data processing. The computer system is a Commodore Business Machines (CBM) Model 8032 personal computer with CBM peripherals. Control and data signals are utilized via the parallel user's port to the interface unit. The analog-to-digital converter (ADC) is controlled in machine language, bootstrapped to high memory, and is addressed through the BASIC program. The BASIC program is designed to be ''user friendly'' and provides the operator with several modes of operation such as background and analysis acquisition. Any number of energy regions-of-interest (ROI) may be analyzed, with automatic background subtraction. Also employed in the BASIC program are the 226Ra algorithms, which utilize linear and polynomial regression equations for data conversion and look-up tables for radon equilibrating coefficients. The optional X-Y plotter may be used with two- or three-dimensional curve programs to enhance data analysis and presentation. A description of the system is presented and typical applications are discussed.

  8. First results from a combined analysis of CERN computing infrastructure metrics

    Science.gov (United States)

    Duellmann, Dirk; Nieke, Christian

    2017-10-01

    The IT Analysis Working Group (AWG) has been formed at CERN across individual computing units and the experiments to attempt a cross-cutting analysis of computing infrastructure and application metrics. In this presentation we will describe the first results obtained using medium/long term data (1 month to 1 year) correlating box level metrics, job level metrics from LSF and HTCondor, IO metrics from the physics analysis disk pools (EOS) and networking and application level metrics from the experiment dashboards. We will cover in particular the measurement of hardware performance and prediction of job duration, the latency sensitivity of different job types and a search for bottlenecks with the production job mix in the current infrastructure. The presentation will conclude with the proposal of a small set of metrics to simplify drawing conclusions also in the more constrained environment of public cloud deployments.

  9. Computers in activation analysis and gamma-ray spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Carpenter, B. S.; D'Agostino, M. D.; Yule, H. P. [eds.]

    1979-01-01

    Seventy-three papers are included under the following session headings: analytical and mathematical methods for data analysis; software systems for gamma-ray and x-ray spectrometry; gamma-ray spectra treatment, peak evaluation; least squares; IAEA intercomparison of methods for processing spectra; computer and calculator utilization in spectrometer systems; and applications in safeguards, fuel scanning, and environmental monitoring. Separate abstracts were prepared for 72 of those papers. (DLC)

  10. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  11. Available computer codes and data for radiation transport analysis

    International Nuclear Information System (INIS)

    Trubey, D.K.; Maskewitz, B.F.; Roussin, R.W.

    1975-01-01

    The Radiation Shielding Information Center (RSIC), sponsored and supported by the Energy Research and Development Administration (ERDA) and the Defense Nuclear Agency (DNA), is a technical institute serving the radiation transport and shielding community. It acquires, selects, stores, retrieves, evaluates, analyzes, synthesizes, and disseminates information on shielding and ionizing radiation transport. The major activities include: (1) operating a computer-based information system and answering inquiries on radiation analysis, (2) collecting, checking out, packaging, and distributing large computer codes, and evaluated and processed data libraries. The data packages include multigroup coupled neutron-gamma-ray cross sections and kerma coefficients, other nuclear data, and radiation transport benchmark problem results

  12. Computational analysis in support of the SSTO flowpath test

    Science.gov (United States)

    Duncan, Beverly S.; Trefny, Charles J.

    1994-10-01

    A synergistic approach of combining computational methods and experimental measurements is used in the analysis of a hypersonic inlet. There are four major focal points within this study which examine the boundary layer growth on a compression ramp upstream of the cowl lip of a scramjet inlet. Initially, the boundary layer growth on the NASP Concept Demonstrator Engine (CDE) is examined. The follow-up study determines the optimum diverter height required by the SSTO Flowpath test to best duplicate the CDE results. These flow field computations are then compared to the experimental measurements and the mass average Mach number is determined for this inlet.

  13. Use of electronic computers for processing of spectrometric data in instrument neutron activation analysis

    International Nuclear Information System (INIS)

    Vyropaev, V.Ya.; Zlokazov, V.B.; Kul'kina, L.I.; Maslov, O.D.; Fefilov, B.V.

    1977-01-01

    A computer program is described for processing gamma spectra in the instrumental activation analysis of multicomponent objects. Structural diagrams of various variants of connection with the computer are presented. The possibility of using a mini-computer as an analyser and for preliminary processing of gamma spectra is considered

  14. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.

  15. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classifications in the contents of social sciences. It also covers various real-life examples such as t

  16. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box, " you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white.This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  17. Framework for generating expert systems to perform computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1985-01-01

    At Los Alamos we are developing a framework to generate knowledge-based expert systems for performing automated risk analyses upon a subject system. The expert system is a computer program that models experts' knowledge about a topic, including facts, assumptions, insights, and decision rationale. The subject system, defined as the collection of information, procedures, devices, and real property upon which the risk analysis is to be performed, is a member of the class of systems that have three identifying characteristics: a set of desirable assets (or targets), a set of adversaries (or threats) desiring to obtain or to do harm to the assets, and a set of protective mechanisms to safeguard the assets from the adversaries. Risk analysis evaluates both vulnerability to and the impact of successful threats against the targets by determining the overall effectiveness of the subject system safeguards, identifying vulnerabilities in that set of safeguards, and determining cost-effective improvements to the safeguards. As a testbed, we evaluate the inherent vulnerabilities and risks in a system of computer security safeguards. The method considers safeguards protecting four generic targets (physical plant of the computer installation, its hardware, its software, and its documents and displays) against three generic threats (natural hazards, direct human actions requiring the presence of the adversary, and indirect human actions wherein the adversary is not on the premises-perhaps using such access tools as wiretaps, dialup lines, and so forth). Our automated procedure to assess the effectiveness of computer security safeguards differs from traditional risk analysis methods

  18. A full automatic system controlled with IBM-PC/XT micro-computer for neutron activation analysis

    International Nuclear Information System (INIS)

    Song Quanxun

    1992-01-01

    A fully automatic system controlled with micro-computers for NAA is described. All processes are automatically completed with an IBM-PC/XT micro-computer. The device is stable, reliable, flexible and convenient to use, and has many functions and applications in the automatic analysis of long-, medium- and short-lived nuclides. Due to the high working efficiency of the instrument and micro-computers, both time and power can be saved. This method can be applied to other nuclear analysis techniques.

  19. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    Science.gov (United States)

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual based analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual based approach is effective in identifying trends and anomalies of the systems.

  20. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Maragni, M.G.

    1992-01-01

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at the IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce the computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and the KENO-IV code shows conservative results when the generalized geometry option is not used. (author)

  1. Use of Emerging Grid Computing Technologies for the Analysis of LIGO Data

    Science.gov (United States)

    Koranda, Scott

    2004-03-01

    The LIGO Scientific Collaboration (LSC) today faces the challenge of enabling analysis of terabytes of LIGO data by hundreds of scientists from institutions all around the world. To meet this challenge the LSC is developing tools, infrastructure, applications, and expertise leveraging Grid Computing technologies available today, and making available to LSC scientists compute resources at sites across the United States and Europe. We use digital credentials for strong and secure authentication and authorization to compute resources and data. Building on top of products from the Globus project for high-speed data transfer and information discovery we have created the Lightweight Data Replicator (LDR) to securely and robustly replicate data to resource sites. We have deployed at our computing sites the Virtual Data Toolkit (VDT) Server and Client packages, developed in collaboration with our partners in the GriPhyN and iVDGL projects, providing uniform access to distributed resources for users and their applications. Taken together these Grid Computing technologies and infrastructure have formed the LSC DataGrid--a coherent and uniform environment across two continents for the analysis of gravitational-wave detector data. Much work, however, remains in order to scale current analyses, and recent lessons learned need to be integrated into the next generation of Grid middleware.

  2. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  3. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013) which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in computer, Computing in Earth Sciences, multivariate data analysis, automated computation in Quantum Field Theory as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source, knowledge sharing and scientific collaboration stimulated us to think over the issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), National Natural Science Foundation of China (NFSC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for the enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF

  4. Forest value orientations in Australia: an application of computer content analysis

    Science.gov (United States)

    Trevor J. Webb; David N. Bengston; David P. Fan

    2008-01-01

    This article explores the expression of three forest value orientations that emerged from an analysis of Australian news media discourse about the management of Australian native forests from August 1, 1997 through December 31, 2004. Computer-coded content analysis was used to measure and track the relative importance of commodity, ecological and moral/spiritual/...

  5. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    Science.gov (United States)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  6. A compendium of computer codes in fault tree analysis

    International Nuclear Information System (INIS)

    Lydell, B.

    1981-03-01

    In the past ten years principles and methods for a unified system reliability and safety analysis have been developed. Fault tree techniques serve as a central feature of unified system analysis, and there exists a specific discipline within system reliability concerned with the theoretical aspects of fault tree evaluation. Ever since the fault tree concept was established, computer codes have been developed for qualitative and quantitative analyses. In particular the presentation of the kinetic tree theory and the PREP-KITT code package has influenced the present use of fault trees and the development of new computer codes. This report is a compilation of some of the better known fault tree codes in use in system reliability. Numerous codes are available and new codes are continuously being developed. The report is designed to address the specific characteristics of each code listed. A review of the theoretical aspects of fault tree evaluation is presented in an introductory chapter, the purpose of which is to give a framework for the validity of the different codes. (Auth.)

  7. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components, and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
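
    The ordering of components by inlet dependency described above is, in effect, a topological sort. A minimal sketch of the idea follows (in Python for brevity, although PCTAP itself is C++; the component names are invented):

```python
from graphlib import TopologicalSorter

# Map each component to the upstream components feeding its inlet
inlet_deps = {
    "pump":      [],             # no upstream dependency
    "tube_a":    ["pump"],
    "coldplate": ["tube_a"],
    "tube_b":    ["coldplate"],
}

# The "solution vector": components ordered so each follows its upstream
solution_vector = list(TopologicalSorter(inlet_deps).static_order())
print(solution_vector)   # ['pump', 'tube_a', 'coldplate', 'tube_b']

# At each time step, update components in this order
for step in range(3):
    for comp in solution_vector:
        pass   # update component state from its inlet conditions here
```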

  8. A New Computationally Frugal Method For Sensitivity Analysis Of Environmental Models

    Science.gov (United States)

    Rakovec, O.; Hill, M. C.; Clark, M. P.; Weerts, A.; Teuling, R.; Borgonovo, E.; Uijlenhoet, R.

    2013-12-01

    Effective and efficient parameter sensitivity analysis methods are crucial for understanding the behaviour of complex environmental models and for using models in risk assessment. This paper proposes a new computationally frugal method for analyzing parameter sensitivity: the Distributed Evaluation of Local Sensitivity Analysis (DELSA). The DELSA method can be considered a hybrid of local and global methods, and focuses explicitly on multiscale evaluation of parameter sensitivity across the parameter space. Results of the DELSA method are compared with the popular global, variance-based Sobol' method and the delta method. We assess the parameter sensitivity of both (1) a simple non-linear reservoir model with only two parameters, and (2) five different "bucket-style" hydrologic models applied to a medium-sized catchment (200 km2) in the Belgian Ardennes. Results show that in both the synthetic and real-world examples, the global Sobol' method and the DELSA method provide similar sensitivities, with the DELSA method providing more detailed insight at much lower computational cost. The ability to understand how sensitivity measures vary through parameter space with modest computational requirements provides exciting new opportunities.
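    As a rough illustration of the hybrid local/global idea, the hedged sketch below evaluates first-order local sensitivity indices at many points sampled across the parameter space; the toy two-parameter model, uniform priors, and plain random sampling are assumptions standing in for the paper's actual models and sampling design.

```python
# DELSA-style multiscale sensitivity sketch: local gradients are computed
# at many sample points and converted to normalized first-order variance
# contributions, giving a *distribution* of sensitivities over the space.
import numpy as np

def model(theta):
    a, b = theta
    return a * np.exp(-b) + a * b**2     # toy non-linear reservoir stand-in

bounds = np.array([[0.5, 2.0], [0.1, 1.0]])          # assumed prior ranges
prior_var = (bounds[:, 1] - bounds[:, 0]) ** 2 / 12  # uniform-prior variance

def local_indices(theta, h=1e-6):
    grad = np.zeros(len(theta))
    for j in range(len(theta)):          # central-difference local gradient
        tp, tm = theta.copy(), theta.copy()
        tp[j] += h
        tm[j] -= h
        grad[j] = (model(tp) - model(tm)) / (2 * h)
    contrib = grad**2 * prior_var        # first-order variance contribution
    return contrib / contrib.sum()       # normalized local sensitivity index

rng = np.random.default_rng(0)
samples = rng.uniform(bounds[:, 0], bounds[:, 1], size=(1000, 2))
indices = np.array([local_indices(t) for t in samples])
print("median index per parameter:", np.median(indices, axis=0))
print("index spread per parameter:", indices.min(axis=0), indices.max(axis=0))
```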

  9. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
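    The near-linear speedup is plausible because each contingency is an independent solve that can be farmed out to its own core. The sketch below illustrates that embarrassingly parallel pattern with a small linear system standing in for the production power-flow solver; it is not the paper's software.

```python
# Parallel N-1 contingency screening sketch: each case is an independent
# solve, so cases map cleanly onto worker processes. The matrix solve below
# is a toy stand-in for a real power-flow solution.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

N_BUS = 50

def solve_contingency(outage_id):
    rng = np.random.default_rng(0)                    # same base case for all
    A = rng.random((N_BUS, N_BUS))
    A = A + A.T + N_BUS * np.eye(N_BUS)               # well-conditioned matrix
    A[outage_id % N_BUS, outage_id % N_BUS] *= 0.5    # apply the "outage"
    b = rng.random(N_BUS)
    x = np.linalg.solve(A, b)                         # stand-in power flow
    return outage_id, float(np.abs(x).max())          # crude severity metric

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:               # one case per worker
        results = list(pool.map(solve_contingency, range(200)))
    worst = max(results, key=lambda r: r[1])
    print("most severe contingency:", worst)
```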

  10. Deviations in gait metrics in patients with chronic ankle instability: a case control study.

    Science.gov (United States)

    Gigi, Roy; Haim, Amir; Luger, Elchanan; Segal, Ganit; Melamed, Eyal; Beer, Yiftah; Nof, Matityahu; Nyska, Meir; Elbaz, Avi

    2015-01-01

    Gait metric alterations have been previously reported in patients suffering from chronic ankle instability (CAI). Previous studies of gait in this population have comprised relatively small cohorts, and the findings of these studies are not uniform. The objective of the present study was to examine spatiotemporal gait metrics in patients with CAI and examine the relationship between self-reported disease severity and the magnitude of gait abnormalities. Forty-four patients with CAI were identified and compared to 53 healthy controls. Patients were evaluated with spatiotemporal gait analysis via a computerized mat and with the Short Form (SF)-36 health survey. Patients with CAI were found to walk with an approximately 16% slower walking velocity, a 9% lower cadence and an approximately 7% shorter step length. Furthermore, the base of support during walking in the CAI group was approximately 43% wider, and the single limb support phase was 3.5% shorter compared to the control group. All eight SF-36 subscales, as well as the SF-36 physical component summary and SF-36 mental component summary, were significantly lower in patients with CAI compared to the control group. Finally, significant correlations were found between most of the objective gait measures and the SF-36 mental component summary and SF-36 physical component summary. The results outline a gait profile for patients suffering from CAI. Significant differences were found in most spatiotemporal gait metrics. An important finding was a significantly wider base of support. It may be speculated that these gait alterations reflect a strategy to deal with imbalance and pain. These findings suggest the usefulness of gait metrics, alongside self-evaluation questionnaires, in assessing disease severity of patients with CAI.

  11. Critical Data Analysis Precedes Soft Computing Of Medical Data

    DEFF Research Database (Denmark)

    Keyserlingk, Diedrich Graf von; Jantzen, Jan; Berks, G.

    2000-01-01

    extracted. The factors had different relationships (loadings) to the symptoms. Although the factors were gained only by computations, they seemed to express some modular features of the language disturbances. This phenomenon, that factors represent superior aspects of data, is well known in factor analysis...... the deficits in communication. Sets of symptoms corresponding to the traditional symptoms in Broca and Wernicke aphasia may be represented in the factors, but the factor itself does not represent a syndrome. It is assumed that this kind of data analysis shows a new approach to the understanding of language...

  12. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    Science.gov (United States)

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing.

  13. Predicting Manual Therapy Treatment Success in Patients With Chronic Ankle Instability: Improving Self-Reported Function.

    Science.gov (United States)

    Wikstrom, Erik A; McKeon, Patrick O

    2017-04-01

    Therapeutic modalities that stimulate sensory receptors around the foot-ankle complex improve chronic ankle instability (CAI)-associated impairments. However, not all patients have equal responses to these modalities. Identifying predictors of treatment success could improve clinician efficiency when treating patients with CAI.   To conduct a response analysis on existing data to identify predictors of improved self-reported function in patients with CAI.   Secondary analysis of a randomized controlled clinical trial.   Sports medicine research laboratories.   Fifty-nine patients with CAI, which was defined in accordance with the International Ankle Consortium recommendations.   Participants were randomized into 3 treatment groups (plantar massage [PM], ankle-joint mobilization [AJM], or calf stretching [CS]) that received six 5-minute treatments over 2 weeks.   Treatment success, defined as a patient exceeding the minimal clinically important difference of the Foot and Ankle Ability Measure-Sport (FAAM-S).   Patients with ≤5 recurrent sprains and ≤82.73% on the Foot and Ankle Ability Measure had a 98% probability of having a meaningful FAAM-S improvement after AJM. Similarly, patients with ≥5 balance errors had a 98% probability of a meaningful FAAM-S improvement from AJM. Patients <22 years old and with ≤9.9 cm of dorsiflexion had a 99% probability of a meaningful FAAM-S improvement after PM. Also, those who made ≥2 single-limb-stance errors had a 98% probability of a meaningful FAAM-S improvement from PM. Patients with ≤53.1% on the FAAM-S had an 83% probability of a meaningful FAAM-S improvement after CS.   Each sensory-targeted ankle-rehabilitation strategy resulted in a unique combination of predictors of success for patients with CAI. Specific indicators of success with AJM were deficits in self-reported function, single-limb balance, and <5 previous sprains. Age, weight-bearing-dorsiflexion restrictions, and single-limb balance

  14. MultiSIMNRA: A computational tool for self-consistent ion beam analysis using SIMNRA

    International Nuclear Information System (INIS)

    Silva, T.F.; Rodrigues, C.L.; Mayer, M.; Moro, M.V.; Trindade, G.F.; Aguirre, F.R.; Added, N.; Rizzutto, M.A.; Tabacniks, M.H.

    2016-01-01

    Highlights: • MultiSIMNRA enables the self-consistent analysis of multiple ion beam techniques. • Self-consistent analysis enables unequivocal and reliable modeling of the sample. • Four different computational algorithms available for model optimizations. • Definition of constraints enables to include prior knowledge into the analysis. - Abstract: SIMNRA is widely adopted by the scientific community of ion beam analysis for the simulation and interpretation of nuclear scattering techniques for material characterization. Taking advantage of its recognized reliability and quality of the simulations, we developed a computer program that uses multiple parallel sessions of SIMNRA to perform self-consistent analysis of data obtained by different ion beam techniques or in different experimental conditions of a given sample. In this paper, we present a result using MultiSIMNRA for a self-consistent multi-elemental analysis of a thin film produced by magnetron sputtering. The results demonstrate the potentialities of the self-consistent analysis and its feasibility using MultiSIMNRA.

  15. Computational Analysis on Performance of Thermal Energy Storage (TES) Diffuser

    Science.gov (United States)

    Adib, M. A. H. M.; Adnan, F.; Ismail, A. R.; Kardigama, K.; Salaam, H. A.; Ahmad, Z.; Johari, N. H.; Anuar, Z.; Azmi, N. S. N.

    2012-09-01

    Application of a thermal energy storage (TES) system reduces cost and energy consumption. The performance of the overall operation is affected by diffuser design. In this study, computational analysis is used to determine the thermocline thickness. Three-dimensional simulations with different tank height-to-diameter ratios (HD), diffuser openings and numbers of diffuser holes are investigated. Simulations of medium-HD tanks with a double ring octagonal diffuser show good thermocline behavior and a clear distinction between warm and cold water. The results show that the best thermocline-thickness performance at 50% charging time occurs in the medium tank with a height-to-diameter ratio of 4.0 and a double ring octagonal diffuser with 48 holes (9 mm opening, ~60%), compared with the 6 mm (~40%) and 12 mm (~80%) openings. In conclusion, computational analysis methods are very useful in studying the performance of thermal energy storage (TES).
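    For readers unfamiliar with the metric, thermocline thickness is commonly extracted from a simulated vertical temperature profile as the span over which the dimensionless temperature lies between cutoffs such as 0.1 and 0.9. The sketch below illustrates that post-processing step on an assumed sigmoid profile; the profile shape, cutoffs, and temperatures are illustrative, not the study's data.

```python
# Thermocline thickness from a vertical temperature profile: find the band
# where the dimensionless temperature lies between 0.1 and 0.9.
import numpy as np

z = np.linspace(0.0, 4.0, 401)              # tank height [m], HD ~ 4
T_cold, T_hot = 7.0, 14.0                   # chilled / warm water [deg C]
T = T_cold + (T_hot - T_cold) / (1 + np.exp(-(z - 2.0) / 0.15))  # assumed

theta = (T - T_cold) / (T_hot - T_cold)     # dimensionless temperature
band = (theta >= 0.1) & (theta <= 0.9)      # the thermocline region
thickness = z[band].max() - z[band].min()
print(f"thermocline thickness: {thickness:.2f} m")
```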

  16. Information Infrastructures for Integrated Enterprises

    Science.gov (United States)

    1993-05-01

    Companies might consider franchising some facets of indirect labor, such as selected functions of administration, finance, and human resources. Glossary excerpts: CAFE, Corporate Average Fuel Economy; CAD, Computer-Aided Design; CAE, Computer-Aided Engineering; CAIS, Common Ada Programming Support Environment.

  17. TPASS: a gamma-ray spectrum analysis and isotope identification computer code

    International Nuclear Information System (INIS)

    Dickens, J.K.

    1981-03-01

    The gamma-ray spectral data-reduction and analysis computer code TPASS is described. This computer code is used to analyze complex Ge(Li) gamma-ray spectra to obtain peak areas corrected for detector efficiencies, from which gamma-ray yields are determined. These yields are compared with an isotope gamma-ray data file to determine the contributions to the observed spectrum from the decay of specific radionuclides. A complete FORTRAN listing of the code and a complex test case are given.

  18. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer based system for data acquisition, analysis and graphic displays relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  19. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Science.gov (United States)

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
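    The percent-difference-with-95%-CI figures quoted above follow a standard resampling pattern. The sketch below reproduces that pattern on invented timing data with a bootstrap confidence interval; only the calculation, not the study's numbers, is illustrated.

```python
# Bootstrap comparison of two providers' wall-clock times (made-up data).
import numpy as np

rng = np.random.default_rng(1)
emr = rng.normal(100.0, 8.0, 20)    # hypothetical EMR run times (minutes)
gce = rng.normal(65.0, 6.0, 20)     # hypothetical GCE run times (minutes)

def pct_diff(a, b):
    # Percent by which a's mean exceeds b's mean, relative to a.
    return 100.0 * (a.mean() - b.mean()) / a.mean()

boot = [pct_diff(rng.choice(emr, emr.size), rng.choice(gce, gce.size))
        for _ in range(5000)]        # resample both groups with replacement
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"difference: {pct_diff(emr, gce):.1f}% (95% CI {lo:.1f}-{hi:.1f})")
```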

  20. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Directory of Open Access Journals (Sweden)

    Seyhan Yazar

    Full Text Available A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  1. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  2. G-computation demonstration in causal mediation analysis

    International Nuclear Information System (INIS)

    Wang, Aolin; Arah, Onyebuchi A.

    2015-01-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings
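    The recipe summarized above (fit parametric models for mediator and outcome, simulate nested potential outcomes, contrast their means) can be written compactly. Below is a minimal, hedged sketch for natural direct and indirect effects on simulated data; the data-generating process and linear model forms are assumptions, and a real analysis would bootstrap the whole procedure for confidence intervals, as the authors describe.

```python
# Parametric g-computation sketch for natural direct/indirect effects.
# Simulated data and linear model forms are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 5000
a = rng.binomial(1, 0.5, n).astype(float)       # exposure
m = 0.5 * a + rng.normal(0, 1, n)               # mediator
y = 1.0 * a + 0.8 * m + rng.normal(0, 1, n)     # outcome

ones = np.ones(n)
m_mod = sm.OLS(m, np.column_stack([ones, a])).fit()
y_mod = sm.OLS(y, np.column_stack([ones, a, m])).fit()

def mean_po(a_set, a_med):
    # Mediator drawn as it would be under exposure a_med ...
    m_sim = (m_mod.predict(np.column_stack([ones, np.full(n, a_med)]))
             + rng.normal(0, np.sqrt(m_mod.scale), n))
    # ... then the outcome predicted under exposure a_set with that mediator.
    return y_mod.predict(
        np.column_stack([ones, np.full(n, a_set), m_sim])).mean()

y11, y10, y00 = mean_po(1, 1), mean_po(1, 0), mean_po(0, 0)
print("natural direct effect:  ", round(y10 - y00, 3))   # ~1.0
print("natural indirect effect:", round(y11 - y10, 3))   # ~0.5 * 0.8 = 0.4
```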

  3. Sequential designs for sensitivity analysis of functional inputs in computer experiments

    International Nuclear Information System (INIS)

    Fruth, J.; Roustant, O.; Kuhnt, S.

    2015-01-01

    Computer experiments are nowadays commonly used to analyze industrial processes aimed at achieving a desired outcome. Sensitivity analysis plays an important role in exploring the actual impact of adjustable parameters on the response variable. In this work we focus on sensitivity analysis of a scalar-valued output of a time-consuming computer code depending on scalar and functional input parameters. We investigate a sequential methodology, based on piecewise constant functions and sequential bifurcation, which is both economical and fully interpretable. The new approach is applied to a sheet metal forming problem in three sequential steps, resulting in new insights into the behavior of the forming process over time. - Highlights: • Sensitivity analysis method for functional and scalar inputs is presented. • We focus on the discovery of most influential parts of the functional domain. • We investigate economical sequential methodology based on piecewise constant functions. • Normalized sensitivity indices are introduced and investigated theoretically. • Successful application to sheet metal forming on two functional inputs
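    To make the piecewise-constant and sequential-bifurcation idea concrete, the hedged sketch below discretizes a functional input into pieces and recursively splits any piece group whose joint perturbation changes the output, homing in on the influential part of the functional domain. The toy code and threshold are assumptions, not the paper's method in detail.

```python
# Sequential bifurcation over a piecewise-constant functional input.
import numpy as np

T = 64                                     # time grid of the functional input

def code_output(f):
    # Toy "computer code": only the segment t in [20, 30) actually matters.
    return float(np.sum(f[20:30]))

def group_effect(lo, hi):
    # Effect of switching pieces lo..hi-1 from low (0) to high (1) level.
    pert = np.zeros(T)
    pert[lo:hi] = 1.0
    return abs(code_output(pert) - code_output(np.zeros(T)))

stack, influential = [(0, T)], []
while stack:
    lo, hi = stack.pop()
    if group_effect(lo, hi) < 1e-12:
        continue                           # whole group inert: discard it
    if hi - lo == 1:
        influential.append(lo)             # single influential piece found
    else:
        mid = (lo + hi) // 2               # bifurcate and test both halves
        stack += [(lo, mid), (mid, hi)]

print("influential pieces:", sorted(influential))   # -> 20 ... 29
```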

  4. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  5. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  6. An Analysis of Creative Process Learning in Computer Game Activities through Player Experiences

    Science.gov (United States)

    Inchamnan, Wilawan

    2016-01-01

    This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning…

  7. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2006-12-01

    The development of a program to compute probabilistic seismic hazard, based on a graphical user interface (GUI), has been completed. The main program consists of three parts: the data input process, the probabilistic seismic hazard analysis, and the result output process. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, a seismic zoning map, and an earthquake event catalog. The text-based input procedures of previous programs took much time to prepare the data, and the data could not be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically, minimizing such input errors as far as possible.
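    At its core, such a program evaluates hazard curves: for each ground-motion level, the annual exceedance rate is obtained by integrating the exceedance probability from an attenuation relation over the magnitude (and distance) distribution of each source zone, weighted by the zone's activity rate. The single-source sketch below shows the shape of that computation; the attenuation coefficients, recurrence parameters, and distance are invented for illustration.

```python
# Single-source probabilistic seismic hazard sketch (invented parameters).
import numpy as np
from scipy import stats

mags = np.linspace(5.0, 8.0, 61)
dm = mags[1] - mags[0]
b = 1.0                                   # Gutenberg-Richter b-value
pdf = 10.0 ** (-b * mags)                 # truncated G-R magnitude density
pdf /= pdf.sum() * dm                     # normalize on the magnitude grid
nu = 0.05                                 # annual rate of M >= 5 in the zone
dist_km = 30.0                            # site-to-source distance

def p_exceed(pga_g, m):
    # Toy attenuation relation: ln PGA ~ Normal(mean(m, r), 0.6).
    ln_mean = -3.5 + 1.0 * m - 1.2 * np.log(dist_km + 10.0)
    return stats.norm.sf(np.log(pga_g), loc=ln_mean, scale=0.6)

pga_levels = np.logspace(-2, 0, 30)       # 0.01 g to 1 g
rate = [nu * np.sum(p_exceed(x, mags) * pdf) * dm for x in pga_levels]
for x, lam in list(zip(pga_levels, rate))[::10]:
    print(f"PGA {x:.3f} g: annual exceedance rate {lam:.2e}")
```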

  8. AN INTERACTIVE WEB-BASED ANALYSIS FRAMEWORK FOR REMOTE SENSING CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    X. Z. Wang

    2015-07-01

    Full Text Available Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agriculture, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data in the web clients becomes an urgent issue. In this paper, we proposed a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users’ private data is constructed based on open source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, which is a technology of open-source lightweight cloud computing container in the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and Grass GIS etc., are deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts will be submitted to IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker container, KVM virtual machines and physical machines respectively, we can conclude that the cloud computing environment built by Docker makes the greatest use of the host system resources, and can handle more concurrent spatial-temporal computing tasks. Docker technology provides resource isolation mechanism in aspects of IO, CPU, and memory etc., which offers security guarantee when processing remote sensing data in the IPython Notebook

  9. An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing

    Science.gov (United States)

    Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.

    2015-07-01

    Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agriculture, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data in the web clients becomes an urgent issue. In this paper, we proposed a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed based on open source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, which is a technology of open-source lightweight cloud computing container in the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and Grass GIS etc., are deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts will be submitted to IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker container, KVM virtual machines and physical machines respectively, we can conclude that the cloud computing environment built by Docker makes the greatest use of the host system resources, and can handle more concurrent spatial-temporal computing tasks. Docker technology provides resource isolation mechanism in aspects of IO, CPU, and memory etc., which offers security guarantee when processing remote sensing data in the IPython Notebook. Users can write

  10. Maintaining SCALE as a reliable computational system for criticality safety analysis

    International Nuclear Information System (INIS)

    Bowmann, S.M.; Parks, C.V.; Martin, S.K.

    1995-01-01

    Accurate and reliable computational methods are essential for nuclear criticality safety analyses. The SCALE (Standardized Computer Analyses for Licensing Evaluation) computer code system was originally developed at Oak Ridge National Laboratory (ORNL) to enable users to easily set up and perform criticality safety analyses, as well as shielding, depletion, and heat transfer analyses. Over the fifteen-year life of SCALE, the mainstay of the system has been the criticality safety analysis sequences that have featured the KENO-IV and KENO-V.A Monte Carlo codes and the XSDRNPM one-dimensional discrete-ordinates code. The criticality safety analysis sequences provide automated material and problem-dependent resonance processing for each criticality calculation. This report details configuration management, which is essential because SCALE consists of more than 25 computer codes (referred to as modules) that share libraries of commonly used subroutines. Changes to a single subroutine in some cases affect almost every module in SCALE! Controlled access to program source and executables and accurate documentation of modifications are essential to maintaining SCALE as a reliable code system. The modules and subroutine libraries in SCALE are programmed by a staff of approximately ten Code Managers. The SCALE Software Coordinator maintains the SCALE system and is the only person who modifies the production source, executables, and data libraries. All modifications must be authorized by the SCALE Project Leader prior to implementation.

  11. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    Full Text Available We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in Matlab, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to Matlab's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  12. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
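    The time dependence between steps is the crux: controller state, such as a regulator tap position, carries over from one power flow to the next, so the year of 1-second steps cannot simply be solved independently or out of order. A toy sketch of that sequential structure, with an invented voltage/regulator model standing in for a real unbalanced feeder solver:

```python
# One simulated day at 1-second resolution with a stateful tap regulator.
import numpy as np

rng = np.random.default_rng(7)
steps = 24 * 3600
load = (1.0 + 0.2 * np.sin(np.arange(steps) * 2 * np.pi / steps)
        + 0.02 * rng.standard_normal(steps))

tap, tap_ops = 0, 0
v_set, deadband = 1.0, 0.01
for k in range(steps):
    # Stand-in for the power flow: voltage sags with load, rises with tap.
    v = 1.05 - 0.08 * load[k] + 0.00625 * tap
    # The controller's state persists between steps, coupling them in time.
    if v < v_set - deadband and tap < 16:
        tap += 1
        tap_ops += 1
    elif v > v_set + deadband and tap > -16:
        tap -= 1
        tap_ops += 1

print("regulator tap operations over the day:", tap_ops)
```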

  13. Enhanced computational infrastructure for data analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; McHarg, B.B.; Meyer, W.H.; Parker, C.T.

    2000-01-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available in between pulses giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator since the DIII-D National Team consists of scientists from nine national laboratories, 19 foreign laboratories, 16 universities, and five industrial partnerships. As a result of this work, DIII-D data is available on a 24x7 basis from a set of viewing and analysis tools that can be run on either the collaborators' or DIII-D's computer systems. Additionally, a web based data and code documentation system has been created to aid the novice and expert user alike

  14. Enhanced Computational Infrastructure for Data Analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; Meyer, W.H.; Parker, C.T.; McCharg, B.B.

    1999-01-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available in between pulses giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator since the DIII-D National Team consists of scientists from 9 national laboratories, 19 foreign laboratories, 16 universities, and 5 industrial partnerships. As a result of this work, DIII-D data is available on a 24x7 basis from a set of viewing and analysis tools that can be run either on the collaborators' or DIII-D's computer systems. Additionally, a Web based data and code documentation system has been created to aid the novice and expert user alike.

  15. Use of personal computers in performing a linear modal analysis of a large finite-element model

    International Nuclear Information System (INIS)

    Wagenblast, G.R.

    1991-01-01

    This paper presents the use of personal computers in performing a dynamic frequency analysis of a large (2,801 degrees of freedom) finite-element model. Large model linear time history dynamic evaluations of safety related structures were previously restricted to mainframe computers using direct integration analysis methods. This restriction was a result of the limited memory and speed of personal computers. With the advances in memory capacity and speed of the personal computers, large finite-element problems now can be solved in the office in a timely and cost effective manner. Presented in three sections, this paper describes the procedure used to perform the dynamic frequency analysis of the large (2,801 degrees of freedom) finite-element model on a personal computer. Section 2.0 describes the structure and the finite-element model that was developed to represent the structure for use in the dynamic evaluation. Section 3.0 addresses the hardware and software used to perform the evaluation and the optimization of the hardware and software operating configuration to minimize the time required to perform the analysis. Section 4.0 explains the analysis techniques used to reduce the problem to a size compatible with the hardware and software memory capacity and configuration

  16. The impact of changing computing technology on EPRI [Electric Power Research Institute] nuclear analysis codes

    International Nuclear Information System (INIS)

    Breen, R.J.

    1988-01-01

    The Nuclear Reload Management Program of the Nuclear Power Division (NPD) of the Electric Power Research Institute (EPRI) has the responsibility for initiating and managing applied research in selected nuclear engineering analysis functions for nuclear utilities. The computer systems that result from the research projects consist of large FORTRAN programs containing elaborate computational algorithms used to assess such areas as core physics, fuel performance, thermal hydraulics, and transient analysis. This paper summarizes a study of computing technology trends sponsored by the NPD. The approach taken was to interview hardware and software vendors, industry observers, and utility personnel, focusing on expected changes that will occur in the computing industry over the next 3 to 5 yr. Particular emphasis was placed on how these changes will impact engineering/scientific computer code development, maintenance, and use. In addition to the interviews, a workshop was held with attendees from EPRI, Power Computing Company, industry, and utilities. The workshop provided a forum for discussing issues and providing input into EPRI's long-term computer code planning process.

  17. Markov analysis of different standby computer based systems

    International Nuclear Information System (INIS)

    Srinivas, G.; Guptan, Rajee; Mohan, Nalini; Ghadge, S.G.; Bajaj, S.S.

    2006-01-01

    As against the conventional triplicated systems of hardware and the generation of control signals for the actuator elements by means of redundant hardwired median circuits, employed in the early Indian PHWR's, a new approach of generating control signals based on software by a redundant system of computers is introduced in the advanced/current generation of Indian PHWR's. Reliability is increased by fault diagnostics and automatic switch over of all the loads to one computer in case of total failure of the other computer. Independent processing by a redundant CPU in each system enables inter-comparison to quickly identify system failure, in addition to the other self-diagnostic features provided. Combinatorial models such as reliability block diagrams and fault trees are frequently used to predict the reliability, maintainability and safety of complex systems. Unfortunately, these methods cannot accurately model dynamic system behavior. Because of its unique ability to handle dynamic cases, Markov analysis can be a powerful tool in the reliability, maintainability and safety (RMS) analyses of dynamic systems. A Markov model breaks the system configuration into a number of states. Each of these states is connected to all other states by transition rates. It then utilizes transition matrices to evaluate the reliability and safety of the systems, either through matrix manipulation or other analytical solution methods, such as Laplace transforms. Thus, Markov analysis is a powerful reliability, maintainability and safety analysis tool. It allows the analyst to model complex, dynamic, highly distributed, fault tolerant systems that would otherwise be very difficult to model using classical techniques like the fault tree method. The Dual Processor Hot Standby Process Control System (DPHS-PCS) and the Computerized Channel Temperature Monitoring System (CCTM) are typical examples of hot standby systems in the Indian PHWR's. While such systems currently in use in Indian PHWR
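    As a concrete illustration of the state-based approach, the hedged sketch below solves a three-state continuous-time Markov model of a dual hot-standby computer system (both units up, one up, none up) for its steady-state availability. The failure and repair rates are invented, not DPHS-PCS or CCTM data.

```python
# Steady-state availability of a dual hot-standby system via a Markov model.
import numpy as np

lam, mu = 1e-4, 1e-1       # assumed per-hour failure and repair rates

# Generator matrix Q over states [2 up, 1 up, 0 up].
Q = np.array([
    [-2 * lam,      2 * lam,  0.0],   # either of the two units fails
    [      mu, -(mu + lam),   lam],   # repair, or the surviving unit fails
    [     0.0,          mu,   -mu],   # repair brings one unit back
])

# Solve pi @ Q = 0 with the probabilities constrained to sum to 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0] + pi[1]          # system is up if at least one unit is
print(f"steady-state availability: {availability:.6f}")
```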

  18. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow; Michael Schatz; William Kalies; Thomas Wanner

    2010-05-24

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields from complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure

  19. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow, Rutgers University/Georgia Institute of Technology; Michael Schatz, Georgia Institute of Technology; William Kalies, Florida Atlantic University; Thomas Wanner, George Mason University

    2010-05-19

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields from complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure

  20. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Oblow, E.M.; Pin, F.G.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.; Lucius, J.L.

    1987-01-01

    The fields of sensitivity and uncertainty analysis are dominated by statistical techniques when large-scale modeling codes are being analyzed. This paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole. The sample problem is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. The DUA method gives a more accurate result based upon only two model executions compared to fifty executions in the statistical case
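    The borehole comparison lends itself to a compact illustration. The hedged sketch below propagates parameter uncertainty through the widely used borehole test function in both ways: first-order derivative-based propagation (the DUA idea, with finite differences standing in for the computer-calculus derivatives that GRESS and ADGEN generate) and brute-force Monte Carlo. The parameter means and the uniform 5% standard deviations are assumptions, not the report's distributions.

```python
# Derivative-based uncertainty propagation vs. Monte Carlo on the borehole
# test function. Parameter values and spreads are illustrative assumptions.
import numpy as np

def borehole(x):
    rw, r, Tu, Hu, Tl, Hl, L, Kw = x
    ln_r = np.log(r / rw)
    return (2 * np.pi * Tu * (Hu - Hl)
            / (ln_r * (1 + 2 * L * Tu / (ln_r * rw**2 * Kw) + Tu / Tl)))

mean = np.array([0.10, 25050.0, 89335.0, 1050.0, 89.55, 760.0, 1400.0, 10950.0])
sd = 0.05 * mean                        # assumed 5% standard deviations

# First-order propagation: sigma_y^2 = sum_i (dy/dx_i)^2 * sigma_i^2.
grad = np.zeros(8)
for i in range(8):
    h = 1e-6 * mean[i]
    xp, xm = mean.copy(), mean.copy()
    xp[i] += h
    xm[i] -= h
    grad[i] = (borehole(xp) - borehole(xm)) / (2 * h)
sigma_dua = np.sqrt(np.sum((grad * sd) ** 2))

# Brute-force Monte Carlo reference (many model runs instead of a few).
rng = np.random.default_rng(3)
samples = rng.normal(mean, sd, size=(50_000, 8))
sigma_mc = borehole(samples.T).std()

print(f"flow-rate sigma, derivative-based: {sigma_dua:.2f}")
print(f"flow-rate sigma, Monte Carlo:      {sigma_mc:.2f}")
```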

  1. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    Science.gov (United States)

    2011-01-01

    Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing. PMID:21878105

  2. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    Science.gov (United States)

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site (http://proteomics.mcw.edu/vipdac).

  3. Computer-aided System of Semantic Text Analysis of a Technical Specification

    OpenAIRE

    Zaboleeva-Zotova, Alla; Orlova, Yulia

    2008-01-01

    This work is devoted to the development of a computer-aided system for semantic text analysis of a technical specification. Its purpose is to increase the efficiency of software engineering by automating the semantic analysis of specification text. The work proposes and investigates a model for analyzing the text of a technical project, together with an attribute grammar of a technical specification intended for the formalization of limited Ru...

  4. Paradoxical effects of KB-R7943 on arrhythmogenicity in a chronic myocardial infarction rabbit model.

    Science.gov (United States)

    Chang, Po-Cheng; Wo, Hung-Ta; Lee, Hui-Ling; Wen, Ming-Shien; Chou, Chung-Chuan

    2015-07-01

    Na(+)/Ca(2+) exchanger blockade has been reported to be anti-arrhythmic in different models. The effects of KB-R7943, a Na(+)/Ca(2+) exchanger blocker, on arrhythmogenesis in hearts with chronic myocardial infarction (MI) remain unclear. Dual voltage and intracellular Ca(2+) (Cai) optical mapping was performed in nine rabbit hearts with chronic MI and four control hearts. Electrophysiology studies including inducibility of ventricular tachyarrhythmias, ventricular fibrillation dominant frequency, action potential, Cai alternans, Cai decay, and conduction velocity were performed. The same protocol was repeated in the presence of KB-R7943 (0.5, 1, and 5μM) after the baseline studies. KB-R7943 was effective in suppressing afterdepolarizations and spontaneous ventricular tachyarrhythmias in hearts with chronic MI. Surprisingly, KB-R7943 increased the inducibility of ventricular tachyarrhythmias in a dose-dependent manner (11%, 11%, 22%, and 56% at baseline and with 0.5, 1, and 5μM KB-R7943, respectively, p=0.02). Optical mapping analysis revealed that the underlying mechanisms of the induced ventricular tachyarrhythmias were probably spatially discordant alternans with wave breaks and rotors. Further analysis showed that KB-R7943 significantly enhanced both action potential (p=0.033) and Cai (p=0.001) alternans, prolonged Cai decay (tau value) in a dose-dependent manner (p=0.004), and caused heterogeneous conduction delay especially at peri-infarct zones during rapid burst pacing. In contrast, KB-R7943 had insignificant effects in control hearts. In this chronic MI rabbit model, KB-R7943 has contrasting effects on arrhythmogenesis, suppressing afterdepolarizations and spontaneous ventricular tachyarrhythmias, but enhancing the inducibility of tachyarrhythmias. The mechanism is probably the enhanced spatially discordant alternans because of prolonged Cai decay and heterogeneous conduction delay. Copyright © 2014 Japanese College of Cardiology. Published by Elsevier

  5. Computational Analysis on Performance of Thermal Energy Storage (TES) Diffuser

    International Nuclear Information System (INIS)

    Adib, M A H M; Ismail, A R; Kardigama, K; Salaam, H A; Ahmad, Z; Johari, N H; Anuar, Z; Azmi, N S N; Adnan, F

    2012-01-01

    Application of a thermal energy storage (TES) system reduces cost and energy consumption, and the performance of the overall operation is affected by the diffuser design. In this study, computational analysis is used to determine the thermocline thickness. Three-dimensional simulations with different tank height-to-diameter ratios (HD), diffuser openings and different numbers of diffuser holes are investigated. Simulations of medium-HD tanks with a double-ring octagonal diffuser show good thermocline behavior and a clear distinction between warm and cold water. The results show that the best thermocline thickness during 50% charging time occurs in the medium tank with a height-to-diameter ratio of 4.0 and a double-ring octagonal diffuser with 48 holes; the 9 mm opening (∼60%) performs acceptably compared with the 6 mm (∼40%) and 12 mm (∼80%) openings. The conclusion is that computational analysis is a very useful method for studying the performance of thermal energy storage (TES).

  6. Compendium of computer codes for the safety analysis of LMFBR's

    International Nuclear Information System (INIS)

    1975-06-01

    A high level of mathematical sophistication is required in the safety analysis of LMFBR's to adequately meet the demands for realism and confidence in all areas of accident consequence evaluation. The numerical solution procedures associated with these analyses are generally so complex and time consuming as to necessitate their programming into computer codes. These computer codes have become extremely powerful tools for safety analysis, combining unique advantages in accuracy, speed and cost. The number, diversity and complexity of LMFBR safety codes in the U. S. has grown rapidly in recent years. It is estimated that over 100 such codes exist in various stages of development throughout the country. It is inevitable that such a large assortment of codes will require rigorous cataloguing and abstracting to aid individuals in identifying what is available. It is the purpose of this compendium to provide such a service through the compilation of code summaries which describe and clarify the status of domestic LMFBR safety codes. (U.S.)

  7. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    Science.gov (United States)

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.

    2017-04-01

    Computational singular perturbation (CSP) is a useful method for analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at the micro- or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurate and efficient numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.

  8. POST-CASKETSS: a graphic computer program for thermal and structural analysis of nuclear fuel shipping casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-12-01

    A computer program, POST-CASKETSS, has been developed for the representation of calculation results from the thermal and structural analysis computer code system CASKETSS (a modular code system for CASK Evaluation for Thermal and Structural Safety). The main features of POST-CASKETSS are as follows: (1) representation of calculation results for thermal and structural analysis computer programs is provided; (2) two- and three-dimensional graphic representation for finite element and finite difference programs is available; (3) graphics of geometry, temperature contours and temperature-time curves are provided for thermal analysis; (4) graphics of geometry, deformation, stress contours, displacement-time, velocity-time, acceleration-time, stress-time, force-time and moment-time curves are provided for structural analysis; (5) the program operates both in the time-sharing system and in the batch system. In the paper, a brief illustration of the calculation method, input data and sample calculations are presented. (author)

  9. Cluster Analysis of Flow Cytometric List Mode Data on a Personal Computer

    NARCIS (Netherlands)

    Bakker Schut, Tom C.; Bakker schut, T.C.; de Grooth, B.G.; Greve, Jan

    1993-01-01

    A cluster analysis algorithm dedicated to the analysis of flow cytometric data is described. The algorithm is written in Pascal and implemented on an MS-DOS personal computer. It uses k-means, initialized with a large number of seed points, followed by a modified nearest neighbor technique to reduce the number of clusters.
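
    A compact analogue of this two-stage procedure, assuming scikit-learn in place of the original Pascal implementation: k-means initialized with many seed points, followed by single-linkage (nearest neighbor) merging of the resulting centers. The toy data stand in for two-parameter list mode events:

      # Two-stage clustering: over-seeded k-means, then nearest-neighbor
      # merging of the k-means centers (an analogue of the method above).
      import numpy as np
      from sklearn.cluster import KMeans, AgglomerativeClustering

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(loc=c, scale=0.3, size=(300, 2))
                     for c in ([0, 0], [3, 0], [0, 3])])  # toy events

      km = KMeans(n_clusters=40, n_init=5, random_state=0).fit(X)  # many seeds
      merge = AgglomerativeClustering(n_clusters=3, linkage="single")
      center_labels = merge.fit_predict(km.cluster_centers_)  # merge centers
      labels = center_labels[km.labels_]   # map each event to a merged cluster
      print(np.bincount(labels))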

  10. SCDAP: a light water reactor computer code for severe core damage analysis

    International Nuclear Information System (INIS)

    Marino, G.P.; Allison, C.M.; Majumdar, D.

    1982-01-01

    Development of the first code version (MODO) of the Severe Core Damage Analysis Package (SCDAP) computer code is described, and calculations made with SCDAP/MODO are presented. The objective of this computer code development program is to develop a capability for analyzing severe disruption of a light water reactor core, including fuel and cladding liquefaction, flow, and freezing; fission product release; hydrogen generation; quenched-induced fragmentation; coolability of the resulting geometry; and ultimately vessel failure due to vessel-melt interaction. SCDAP will be used to identify the phenomena which control core behavior during a severe accident, to help quantify uncertainties in risk assessment analysis, and to support planning and evaluation of severe fuel damage experiments and data. SCDAP/MODO addresses the behavior of a single fuel bundle. Future versions will be developed with capabilities for core-wide and vessel-melt interaction analysis

  11. Computer-controlled on-line gamma analysis for krypton-85

    International Nuclear Information System (INIS)

    Canuette, R.P.

    1980-03-01

    85Kr will be evolved from spent nuclear fuel during both the voloxidation and dissolution processes, so a reliable method for on-line analysis of 85Kr in the off-gas system is needed. Tritium, 14C, and 129I were trapped, and the activity of 85Kr was then measured using a Li-drifted Ge detector. The equipment used to carry out this analysis is described; the PET computer is used. The 85Kr evolution rate was correlated with the fuel dissolution rate; the close correlation permits one to monitor the fuel dissolution process. 11 figures

  12. A visual interface to computer programs for linkage analysis.

    Science.gov (United States)

    Chapman, C J

    1990-06-01

    This paper describes a visual approach to the input of information about human families into computer data bases, making use of the GEM graphic interface on the Atari ST. Similar approaches could be used on the Apple Macintosh or on the IBM PC AT (to which it has been transferred). For occasional users of pedigree analysis programs, this approach has considerable advantages in ease of use and accessibility. An example of such use might be the analysis of risk in families with Huntington disease using linked RFLPs. However, graphic interfaces do make much greater demands on the programmers of these systems.

  13. Introducing computational thinking through hands-on projects using R with applications to calculus, probability and data analysis

    Science.gov (United States)

    Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand

    2017-04-01

    The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology, and they engage computational thinking through simulations, visualizations and data analysis. We present nine computer experiments, and suggest a few more, with applications to calculus, probability and data analysis. We are using the free (open-source) statistical programming language R. Our goal is to give a taste of what R offers rather than to present a comprehensive tutorial on the R language. In our experience, these kinds of interactive computer activities can be easily integrated into a smart classroom. Furthermore, these activities do tend to keep students motivated and actively engaged in the process of learning, problem solving and developing a better intuition for understanding complex mathematical concepts.

  14. Modulation of the Fibularis Longus Hoffmann Reflex and Postural Instability Associated With Chronic Ankle Instability

    Science.gov (United States)

    Kim, Kyung-Min; Hart, Joseph M.; Saliba, Susan A.; Hertel, Jay

    2016-01-01

    Context: Individuals with chronic ankle instability (CAI) present with decreased modulation of the Hoffmann reflex (H-reflex) from a simple to a more challenging task. The neural alteration is associated with impaired postural control, but the relationship has not been investigated in individuals with CAI. Objective: To determine differences in H-reflex modulation and postural control between individuals with or without CAI and to identify if they are correlated in individuals with CAI. Design: Descriptive laboratory study. Setting: Laboratory. Patients or Other Participants: A total of 15 volunteers with CAI (9 males, 6 females; age = 22.6 ± 5.8 years, height = 174.7 ± 8.1 cm, mass = 74.9 ± 12.8 kg) and 15 healthy sex-matched volunteers serving as controls (9 males, 6 females; age = 23.8 ± 5.8 years, height = 171.9 ± 9.9 cm, mass = 68.9 ± 15.5 kg) participated. Intervention(s): Maximum H-reflex (Hmax) and motor wave (Mmax) from the soleus and fibularis longus were recorded while participants lay prone and then stood in unipedal stance. We assessed postural tasks of unipedal stance with participants' eyes closed for 10 seconds using a forceplate. Main Outcome Measure(s): We normalized Hmax to Mmax to obtain Hmax : Mmax ratios for the 2 positions. For each muscle, H-reflex modulation was quantified using the percentage change scores in Hmax : Mmax ratios calculated from prone position to unipedal stance. Center-of-pressure data were used to compute 4 time-to-boundary variables. Separate independent-samples t tests were performed to determine group differences. Pearson product moment correlation coefficients were calculated between the modulation and balance measures in the CAI group. Results: The CAI group presented less H-reflex modulation in the soleus (t26 = −3.77, P = .001) and fibularis longus (t25 = −2.59, P = .02). The mean of the time-to-boundary minima in the anteroposterior direction was lower in the CAI group (t28 = −2.06, P = .048

  15. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other...

  16. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)]

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  17. Computational design analysis for deployment of cardiovascular stents

    International Nuclear Information System (INIS)

    Tammareddi, Sriram; Sun Guangyong; Li Qing

    2010-01-01

    Cardiovascular disease has become a major global healthcare problem. As one of the relatively new medical devices, stents offer a minimally-invasive surgical strategy to improve the quality of life for numerous cardiovascular disease patients. One of the key issues has been to understand the effect of stent structure on its deployment behaviour. This paper aims to develop a computational model for exploring the biomechanical responses to changes in stent geometrical parameters, namely the strut thickness and cross-link width of the Palmaz-Schatz stent. Explicit 3D dynamic finite element analysis was carried out to explore the sensitivity of these geometrical parameters on deployment performance, such as dog-boning, fore-shortening, and stent deformation over the load cycle. It was found that an increase in stent thickness causes a sizeable rise in the load required to deform the stent to its target diameter, whilst reducing maximum dog-boning in the stent. Increasing the cross-link width did not change the load required to deform the stent to its target diameter and showed no apparent correlation with dog-boning, but fore-shortening increased with increasing cross-link width. The computational modelling and analysis presented herein provide an effective way to refine or optimise the design of stent structures.

  18. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    Science.gov (United States)

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work on the mathematical analysis and simulation of these processes is reported in the available literature, including patents. In the present work, software packages have been developed for computer-aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels, and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system and operating parameters such as the mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate-to-steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics and specifications on system performance has also been analysed. The multiparameter computer-aided design (CAD) software packages prepared are thus highly versatile, and they permit the optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, thereby ascertaining the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed using the most updated design correlations and computer software.

  19. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, the computational forensics domain is characterized by data that are simultaneously large-scale and uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper, a methodology and an automatic procedure based on fuzzy set theory and designed to infer precis...

  20. Computer assessment of interview data using latent semantic analysis.

    Science.gov (United States)

    Dam, Gregory; Kaufmann, Stefan

    2008-02-01

    Clinical interviews are a powerful method for assessing students' knowledge and conceptual development. However, the analysis of the resulting data is time-consuming and can create a "bottleneck" in large-scale studies. This article demonstrates the utility of computational methods in supporting such an analysis. Thirty-four 7th-grade student explanations of the causes of Earth's seasons were assessed using latent semantic analysis (LSA). Analyses were performed on transcriptions of student responses during interviews administered prior to (n = 21) and after (n = 13) receiving earth science instruction. An instrument that uses LSA technology was developed to identify misconceptions and assess conceptual change in students' thinking. Its accuracy, as determined by comparing its classifications to the independent coding performed by four human raters, reached 90%. Techniques for adapting LSA technology to support the analysis of interview data, as well as some limitations, are discussed.
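
    A minimal sketch of the kind of LSA pipeline such an instrument rests on, assuming scikit-learn and three invented one-sentence transcripts; the study's actual corpus, dimensionality, and scoring details are not reproduced here:

      # Classify a student explanation by cosine similarity in an LSA space
      # built with TF-IDF + truncated SVD (illustrative data only).
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.metrics.pairwise import cosine_similarity

      docs = [
          "the earth's tilted axis causes the seasons",              # expert model
          "seasons happen because the earth is closer to the sun",   # misconception
          "the tilt of the axis changes how direct sunlight is",     # student answer
      ]
      tfidf = TfidfVectorizer().fit_transform(docs)
      lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)
      # similarity of the student answer to the expert model vs. misconception
      print(cosine_similarity(lsa[2:3], lsa[:2]))

    In a real instrument, the semantic space is trained on a large reference corpus and a response is assigned to whichever reference explanation it is most similar to.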

  1. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Statistical-techniques-based computer-aided diagnosis (CAD) using texture feature analysis: application in computed tomography (CT) imaging to fatty liver disease

    Science.gov (United States)

    Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae

    2012-09-01

    This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 × 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver (p < 0.05), and the recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.
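
    The six parameters listed above are standard first-order (histogram-based) texture descriptors; a sketch of their computation for a single 8-bit grayscale ROA follows, with one common set of normalizations that may differ from those used in the paper:

      # First-order statistical texture features of a region of analysis (ROA),
      # following common histogram-based definitions (illustrative sketch).
      import numpy as np

      def texture_features(roa):
          hist = np.bincount(roa.ravel(), minlength=256).astype(float)
          p = hist / hist.sum()                    # gray-level probabilities
          z = np.arange(256, dtype=float)
          mean = (z * p).sum()                     # average gray level
          var = ((z - mean) ** 2 * p).sum()
          contrast = np.sqrt(var)                  # average contrast (std. dev.)
          smoothness = 1.0 - 1.0 / (1.0 + var / 255.0 ** 2)  # relative smoothness
          skew = ((z - mean) ** 3 * p).sum() / 255.0 ** 3    # normalized skewness
          uniformity = (p ** 2).sum()
          entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()
          return mean, contrast, smoothness, skew, uniformity, entropy

      roa = np.random.default_rng(0).integers(0, 256, size=(50, 50), dtype=np.uint8)
      print(texture_features(roa))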

  3. AGS - The ISR computer program for synchrotron design, orbit analysis and insertion matching

    International Nuclear Information System (INIS)

    Keil, E.; Marti, Y.; Montague, B.W.; Sudboe, A.

    1975-01-01

    This is a detailed guide to the use of the current version of a FORTRAN program for carrying out computations required in the design or modification of alternating-gradient synchrotrons and storage rings. The program, which runs on the CDC 7600 computer at CERN, computes linear transformation functions, and modifications of parameters to achieve specified properties; it tracks sets of particle trajectories, finds closed orbits when elements of the structure are displaced, computes the equilibrium orbit, designs closed-orbit bumps, tracks betatron functions through the structure, and matches insertions in the structure to specified betatron and dispersion functions. The report supersedes CERN 69-5 (AGS - The ISR computer system for synchrotron design and orbit analysis, by E. Keil and P. Strolin). (Author)

  4. Computer-Aided Sustainable Process Synthesis-Design and Analysis

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan

    Process synthesis involves the investigation of chemical reactions needed to produce the desired product, selection of the separation techniques needed for downstream processing, as well as taking decisions on sequencing the involved separation operations. For an effective, efficient and flexible... this work focuses on the development and application of a computer-aided framework for sustainable synthesis-design and analysis of process flowsheets by generating feasible alternatives covering the entire search space, and includes analysis tools for sustainability, LCA and economics. The synthesis method is based... An advantage of process-groups is that the performance of the entire process can be evaluated from the contributions of the individual process-groups towards the selected flowsheet property (for example, energy consumed). The developed flowsheet property models include energy consumption, carbon footprint, product recovery, product...

  5. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well-established classical methods... into the projection pursuit is presented. Examples from remote sensing are given. The ACE algorithm for computing non-linear transformations for maximizing correlation is extended and applied to obtain a non-linear transformation that maximizes autocorrelation or 'signal' in a multivariate image.... This is a generalization of the minimum/maximum autocorrelation factors (MAFs), which is a linear method. The non-linear method is compared to the linear method when analyzing a multivariate TM image from Greenland. The ACE method is shown to give a more detailed decomposition of the image than the MAF transformation...

  6. ADAM: analysis of discrete models of biological systems using computer algebra.

    Science.gov (United States)

    Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard

    2011-07-20

    Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity of analyzing the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users, as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems, and there is a need in the biological community for accessible, efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool.
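
    For intuition, attractor identification for a tiny synchronous Boolean network can be done by brute-force state enumeration, as sketched below with an invented 3-node network. ADAM itself avoids this exponential enumeration by working algebraically with polynomial dynamical systems, so the sketch only illustrates what is being computed, not how ADAM computes it:

      # Find attractors (fixed points and limit cycles) of a small synchronous
      # Boolean network by exhaustive state enumeration (toy network).
      from itertools import product

      def update(state):
          x1, x2, x3 = state           # an invented 3-node regulatory network
          return (x2 and not x3,       # x1' = x2 AND NOT x3
                  x1,                  # x2' = x1
                  x1 or x2)            # x3' = x1 OR x2

      attractors = set()
      for start in product([False, True], repeat=3):
          seen, state, t = {}, start, 0
          while state not in seen:     # iterate until some state repeats
              seen[state] = t
              state, t = update(state), t + 1
          # recover the cycle and store it in a canonical rotation
          cycle = [s for s, i in sorted(seen.items(), key=lambda kv: kv[1])
                   if i >= seen[state]]
          k = cycle.index(min(cycle))
          attractors.add(tuple(cycle[k:] + cycle[:k]))

      print(attractors)                # each tuple is one attractor cycle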

  7. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1990-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems

  8. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab

  9. Implementation and evaluation of nonparametric regression procedures for sensitivity analysis of computationally demanding models

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Swiler, Laura P.; Helton, Jon C.; Sallaberry, Cedric J.

    2009-01-01

    The analysis of many physical and engineering problems involves running complex computational models (simulation models, computer codes). With problems of this type, it is important to understand the relationships between the input variables (whose values are often imprecisely known) and the output. The goal of sensitivity analysis (SA) is to study this relationship and identify the most significant factors or variables affecting the results of the model. In this presentation, an improvement on existing methods for SA of complex computer models is described for use when the model is too computationally expensive for a standard Monte-Carlo analysis. In these situations, a meta-model or surrogate model can be used to estimate the necessary sensitivity index for each input. A sensitivity index is a measure of the variance in the response that is due to the uncertainty in an input. Most existing approaches to this problem either do not work well with a large number of input variables or ignore the error involved in estimating a sensitivity index. Here, a new approach to sensitivity index estimation using meta-models and bootstrap confidence intervals is described that provides solutions to these drawbacks. Further, an efficient yet effective approach to incorporating this methodology into an actual SA is presented. Several simulated and real examples illustrate the utility of this approach. This framework can be extended to uncertainty analysis as well.
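
    A minimal sketch of the surrogate-plus-bootstrap idea, assuming a toy response function, a random-forest meta-model, and the Sobol' pick-freeze estimator for the first-order index; all three are illustrative choices, not necessarily those of the paper:

      # First-order sensitivity indices from a cheap surrogate of an
      # expensive model, with bootstrap confidence intervals (sketch).
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(1)

      def expensive_model(X):          # stand-in for the costly simulation
          return np.sin(X[:, 0]) + 5.0 * X[:, 1] ** 2 + 0.1 * X[:, 2]

      X_train = rng.uniform(-1, 1, size=(300, 3))
      surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
      surrogate.fit(X_train, expensive_model(X_train))

      n = 4096
      for i in range(3):
          A = rng.uniform(-1, 1, size=(n, 3))
          B = rng.uniform(-1, 1, size=(n, 3))
          Bi = B.copy(); Bi[:, i] = A[:, i]        # "pick-freeze" column swap
          yA, yBi = surrogate.predict(A), surrogate.predict(Bi)
          S = (np.mean(yA * yBi) - np.mean(yA) ** 2) / np.var(yA)
          # bootstrap a 95% interval for the estimated index
          idx = rng.integers(0, n, size=(500, n))
          boot = (np.mean(yA[idx] * yBi[idx], axis=1)
                  - np.mean(yA[idx], axis=1) ** 2) / np.var(yA[idx], axis=1)
          lo, hi = np.percentile(boot, [2.5, 97.5])
          print(f"S_{i} = {S:.2f}  [{lo:.2f}, {hi:.2f}]")

    The surrogate is evaluated, not the expensive model, so the large pick-freeze sample costs almost nothing, and the bootstrap interval exposes the estimation error that simpler approaches ignore.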

  10. Test-retest reliability of computer-based video analysis of general movements in healthy term-born infants.

    Science.gov (United States)

    Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars

    2015-10-01

    A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the amount of motion and the variability of the spatial center of motion of the infant, respectively. In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively, and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs (p < .05). These results demonstrate high test-retest reliability of computer-based video analysis of GMs, and a significant association between our computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
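
    The reported ICC(3,1) can be computed from the classical two-way mean-square decomposition (Shrout and Fleiss); a sketch for test-retest data arranged as one row per infant and one column per recording, with synthetic numbers standing in for the study's data:

      # ICC(3,1): two-way mixed effects, single measure, from the classical
      # ANOVA mean squares (synthetic test-retest data for illustration).
      import numpy as np

      def icc_3_1(data):
          n, k = data.shape                        # subjects x sessions
          grand = data.mean()
          ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()
          ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()
          ss_total = ((data - grand) ** 2).sum()
          ms_rows = ss_rows / (n - 1)              # between-subjects MS
          ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
          return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

      rng = np.random.default_rng(0)
      true_scores = rng.normal(size=75)            # 75 infants
      data = np.column_stack([true_scores + 0.3 * rng.normal(size=75)
                              for _ in range(2)])  # two recordings each
      print(round(icc_3_1(data), 2))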

  11. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Виктор Иванович Нардюжев

    2018-12-01

    The article is devoted to the statistical analysis of the results of computer-based testing for the evaluation of students' educational achievements. The issue is relevant because computer-based testing in Russian universities has become an important method for evaluating both the educational achievements of students and the quality of the modern educational process. The use of modern methods and programs for statistical analysis of computer-based testing results and assessment of the quality of developed tests is a pressing problem for every university teacher. The article shows how the authors solve this problem using their own program, "StatInfo". For several years the program has been successfully applied in a credit system of education at such technological stages as loading computer-based testing protocols into a database, forming queries, and generating reports, lists, and matrices of answers for the statistical analysis of test item quality. The methodology, experience and some results of its use by university teachers are described in the article. Related topics of test development (models, algorithms, technologies, and software for large-scale computer-based testing) have been discussed by the authors in their previous publications, which are presented in the reference list.

  12. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…

  13. Analysis on the security of cloud computing

    Science.gov (United States)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology arising from the fusion of computer technology and Internet development, and it is leading a revolution in the IT and information fields. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy. The resulting safety problems are the key difficulty in improving the quality of cloud services. This paper briefly introduces the concept of cloud computing and, considering its characteristics, constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.

  14. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  15. ESP and NOAH: computer programs for flood-risk analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Montague, D.F.; Rooney, J.J.; Fussell, J.B.; Baker, L.S.

    1982-06-01

    This report describes a computer program package that aids in assessing the impact of floods on risk from nuclear power plants. The package consists of two distinct computer programs: ESP and NOAH. The ESP program improves the efficiency of a flood analysis by screening accident sequences and identifying accident sequences that are potentially significant contributors to risk in the event of a flood. Input to ESP includes accident sequences from an existing risk assessment and flood screening criteria. The NOAH program provides detailed qualitative analysis of the plant systems identified by ESP. NOAH performs a qualitative flood simulation of the fault tree

  16. Climate Change Discourse in Mass Media: Application of Computer-Assisted Content Analysis

    Science.gov (United States)

    Kirilenko, Andrei P.; Stepchenkova, Svetlana O.

    2012-01-01

    Content analysis of mass media publications has become a major scientific method used to analyze public discourse on climate change. We propose a computer-assisted content analysis method to extract prevalent themes and analyze discourse changes over an extended period in an objective and quantifiable manner. The method includes the following: (1)…

  17. Effects of feedback in a computer-based learning environment on students’ learning outcomes: a meta-analysis

    NARCIS (Netherlands)

    van der Kleij, Fabienne; Feskens, Remco C.W.; Eggen, Theodorus Johannes Hendrikus Maria

    2015-01-01

    In this meta-analysis, we investigated the effects of methods for providing item-based feedback in a computer-based environment on students’ learning outcomes. From 40 studies, 70 effect sizes were computed, which ranged from −0.78 to 2.29. A mixed model was used for the data analysis. The results

  18. Computer performance evaluation of FACOM 230-75 computer system, (2)

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1980-08-01

    This report describes computer performance evaluations for the FACOM 230-75 computers in JAERI. The evaluations cover the following items: (1) cost/benefit analysis of timesharing terminals, (2) analysis of the response time of timesharing terminals, (3) analysis of throughput time for batch job processing, (4) estimation of current potential demands for computer time, and (5) determination of the appropriate number of card readers and line printers. These evaluations are done mainly from the standpoint of cost reduction of computing facilities. The techniques adopted are very practical ones. This report will be useful for those who are concerned with the management of a computing installation. (author)

  19. BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.

    1981-06-01

    This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given

  20. BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.

    1981-06-01

    This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given.

  1. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala...

  2. Incremental ALARA cost/benefit computer analysis

    International Nuclear Information System (INIS)

    Hamby, P.

    1987-01-01

    Commonwealth Edison Company has developed and is testing an enhanced Fortran computer program to be used for cost/benefit analysis of Radiation Reduction Projects at its six nuclear power facilities and Corporate Technical Support Groups. This paper describes a macro-driven IBM mainframe program comprising two different types of analyses: an Abbreviated Program with fixed costs and base values, and an extended Engineering Version for a detailed, more thorough and time-consuming approach. The extended Engineering Version breaks radiation exposure costs down into two components: Health-Related Costs and Replacement Labor Costs. According to user input, the program automatically adjusts these two cost components and applies the derivation to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment cost. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations

  3. Nuclear analysis via internet based on robocom, a robotic computing environment

    International Nuclear Information System (INIS)

    Kim, J.H.; Liu, J.; Foung, R.

    1997-01-01

    Nuclear engineers can resume the pioneering role the authors once played in network computing. The authors can transform the Internet World Wide Web from offering information to offering analysis capabilities, in the form of software robots that are capable of processing information of their own. Pooling such transmittable robots across the nuclear analysis community would represent a collaborative effort to automate nuclear engineering analysis. Samples of HTML files for a web site, a robot and its corresponding schematics are included to demonstrate the ease of implementation

  4. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of an electric power system. The program uses the symmetrical components method to compute all phase and sequence quantities for any bus or branch of a given power network ...
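
    The symmetrical components step such a program performs is the Fortescue transform; a minimal sketch for one set of unbalanced phase voltages (the numerical values are invented for illustration):

      # Fortescue transform: phase quantities -> zero/positive/negative sequence.
      import numpy as np

      a = np.exp(2j * np.pi / 3)                   # 120-degree rotation operator
      A_inv = np.array([[1, 1,      1     ],
                        [1, a,      a ** 2],
                        [1, a ** 2, a     ]]) / 3.0

      # example unbalanced phase voltages (illustrative per-unit values)
      V_abc = np.array([1.0 + 0j,
                        0.9 * np.exp(-1j * np.deg2rad(125)),
                        1.1 * np.exp( 1j * np.deg2rad(118))])

      V0, V1, V2 = A_inv @ V_abc                   # sequence components
      for name, v in zip(("zero", "positive", "negative"), (V0, V1, V2)):
          print(f"{name:8s}: |V| = {abs(v):.3f}, "
                f"angle = {np.degrees(np.angle(v)):7.2f} deg")

    For a balanced set, the zero- and negative-sequence components vanish, which is what makes the transform useful for characterizing unbalanced faults.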

  5. PSYCHE: An Object-Oriented Approach to Simulating Medical Education

    Science.gov (United States)

    Mullen, Jamie A.

    1990-01-01

    Traditional approaches to computer-assisted instruction (CAI) do not provide realistic simulations of medical education, in part because they do not utilize heterogeneous knowledge bases for their source of domain knowledge. PSYCHE, a CAI program designed to teach hypothetico-deductive psychiatric decision-making to medical students, uses an object-oriented implementation of an intelligent tutoring system (ITS) to model the student, domain expert, and tutor. It models the transactions between the participants in complex transaction chains, and uses heterogeneous knowledge bases to represent both domain and procedural knowledge in clinical medicine. This object-oriented approach is a flexible and dynamic approach to modeling, and represents a potentially valuable tool for the investigation of medical education and decision-making.

  6. Analysis of plutonium gamma-ray spectra by small portable computers

    International Nuclear Information System (INIS)

    Ruhter, W.; Gunnink, R.; Camp, D.; DeCarolis, M.

    1985-01-01

    A sophisticated program for isotopic analysis of plutonium gamma-ray spectra using small computers has been developed. It is implemented on a DEC LSI-11/2 configured in a portable unit without a mass storage device, for use by IAEA inspectors in the field. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are needed as input. Analysis is completed in 90 seconds by fitting isotopic component response functions to peak multiplets. 9 refs., 2 figs., 1 tab

  7. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    Science.gov (United States)

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
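
    A toy illustration of the core idea, that queries become an integral part of the analysis itself, using the standard-library sqlite3 module in place of the production database systems discussed; the schema and values are invented:

      # Store voxel time-series observations relationally, then answer an
      # analysis question with a query (sqlite3 stands in for a real DBMS).
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("""CREATE TABLE bold (subject TEXT, voxel INTEGER,
                                        t INTEGER, condition TEXT, signal REAL)""")
      rows = [("s01", 7, t, "speech" if t % 2 else "rest", 100 + t * 0.1)
              for t in range(10)]
      con.executemany("INSERT INTO bold VALUES (?,?,?,?,?)", rows)

      # mean signal per condition for one voxel, expressed as a query
      for cond, mean in con.execute(
              """SELECT condition, AVG(signal) FROM bold
                 WHERE subject='s01' AND voxel=7 GROUP BY condition"""):
          print(cond, round(mean, 2))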

  8. SciServer Compute brings Analysis to Big Data in the Cloud

    Science.gov (United States)

    Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara

    2016-06-01

    SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment. SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines. Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally - but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data. SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer's new Single Sign-on Portal. We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts

  9. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    Science.gov (United States)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks in micro- to tens of nano-meter resolution and is quickly turning into a new technology for studying petrophysical properties of materials. An important step is the upscaling of these properties as micron or sub-micron resolution can only be done on the sample-scale of millimeters or even less than a millimeter. We present here a recently developed computational workflow for the analysis of microstructures including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at micro to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, biological and food science materials. We have also applied the technique on high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big data problem of analyzing petrophysical properties and its subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore-scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore-scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big data tool for multi-scale scientific and engineering problems.

  10. On the convergence of nanotechnology and Big Data analysis for computer-aided diagnosis.

    Science.gov (United States)

    Rodrigues, Jose F; Paulovich, Fernando V; de Oliveira, Maria Cf; de Oliveira, Osvaldo N

    2016-04-01

    An overview is provided of the challenges involved in building computer-aided diagnosis systems capable of precise medical diagnostics based on integration and interpretation of data from different sources and formats. The availability of massive amounts of data and computational methods associated with the Big Data paradigm has brought hope that such systems may soon be available in routine clinical practices, which is not the case today. We focus on visual and machine learning analysis of medical data acquired with varied nanotech-based techniques and on methods for Big Data infrastructure. Because diagnosis is essentially a classification task, we address the machine learning techniques with supervised and unsupervised classification, making a critical assessment of the progress already made in the medical field and the prospects for the near future. We also advocate that successful computer-aided diagnosis requires a merge of methods and concepts from nanotechnology and Big Data analysis.

  11. Blended Learning: An Innovative Approach

    Science.gov (United States)

    Lalima; Dangwal, Kiran Lata

    2017-01-01

    Blended learning is an innovative concept that embraces the advantages of both traditional teaching in the classroom and ICT-supported learning, including both offline learning and online learning. It has scope for collaborative learning, constructive learning and computer-assisted instruction (CAI). Blended learning needs rigorous efforts, right…

  12. Using Virtual Reality with and without Gaming Attributes for Academic Achievement

    Science.gov (United States)

    Vogel, Jennifer J.; Greenwood-Ericksen, Adams; Cannon-Bowers, Jan; Bowers, Clint A.

    2006-01-01

    A subcategory of computer-assisted instruction (CAI), games have additional attributes such as motivation, reward, interactivity, score, and challenge. This study used a quasi-experimental design to determine if previous findings generalize to non simulation-based game designs. Researchers observed significant improvement in the overall population…

  13. Cepstrum analysis and applications to computational fluid dynamic solutions

    Science.gov (United States)

    Meadows, Kristine R.

    1990-04-01

    A novel approach to the problem of spurious reflections introduced by artificial boundary conditions in computational fluid dynamic (CFD) solutions is proposed. Instead of attempting to derive non-reflecting boundary conditions, the approach is to accept the fact that spurious reflections occur, but to remove these reflections with cepstrum analysis, a signal processing technique which has been successfully used to remove echoes from experimental data. First, the theory of the cepstrum method is presented. This includes presentation of two types of cepstra: the power cepstrum and the complex cepstrum. The definitions of the cepstrum methods are applied theoretically and numerically to the analytical solution of sinusoidal plane wave propagation in a duct. 1-D and 3-D time-dependent solutions to the Euler equations are computed, and hard-wall conditions are prescribed at the numerical boundaries. The cepstrum method is applied, and the reflections from the boundaries are removed from the solutions. 1-D and 3-D solutions are computed with so-called nonreflecting boundary conditions, and these solutions are compared to those obtained by prescribing hard-wall conditions and processing with the cepstrum.
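
    A minimal numerical sketch of the power cepstrum on which this approach rests, assuming a synthetic one-dimensional signal with a single artificial echo (NumPy only; the CFD-specific processing is not reproduced). An echo at delay d appears in the cepstrum as a peak at quefrency d, which is what makes it identifiable and removable:

      # Power cepstrum of a signal with an echo: the echo shows up as a
      # peak at its delay in quefrency.
      import numpy as np

      fs = 1000
      t = np.arange(2048) / fs
      clean = np.sin(2 * np.pi * 40 * t) * np.exp(-3 * t)  # decaying tone
      delay = 300                                          # echo delay, samples
      signal = clean.copy()
      signal[delay:] += 0.5 * clean[:-delay]               # add a scaled echo

      spectrum = np.fft.rfft(signal)
      power_cepstrum = np.fft.irfft(np.log(np.abs(spectrum) ** 2 + 1e-12))
      peak = np.argmax(power_cepstrum[50:1024]) + 50       # skip low quefrencies
      print("echo detected at", peak, "samples")           # expect ~300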

  14. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2005-12-01

    The first stage of development of a program to compute probabilistic seismic hazard has been completed, based on a Graphical User Interface (GUI). The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The first part has been developed and the others are under development. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, the seismic zoning map, and the earthquake event catalog. The input procedures of previous programs, based on a text interface, took much time to prepare the data, and the data could not be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial error as far as possible.

  15. A review of small canned computer programs for survey research and demographic analysis.

    Science.gov (United States)

    Sinquefield, J C

    1976-12-01

    A variety of small canned computer programs for survey research and demographic analysis appropriate for use in developing countries are reviewed in this article. The programs discussed are SPSS (Statistical Package for the Social Sciences); CENTS, CO-CENTS, CENTS-AID, and CENTS-AID II; MINI-TAB EDIT, FREQUENCIES, TABLES, REGRESSION, CLIENT RECORD, DATES, MULT, LIFE, and PREGNANCY HISTORY; FIVFIV and SINSIN; DCL (Demographic Computer Library); and MINI-TAB Population Projection, Functional Population Projection, and Family Planning Target Projection. For each program, a description and evaluation of its uses, instruction manuals, computer requirements, and procedures for obtaining the manuals and programs are provided. This information is intended to facilitate and encourage the use of the computer by data processors in developing countries.

  16. Analysis of molten salt thermal-hydraulics using computational fluid dynamics

    International Nuclear Information System (INIS)

    Yamaji, B.; Csom, G.; Aszodi, A.

    2003-01-01

    Partitioning and transmutation is expected to be a promising option for solving the problem of high-level radioactive waste, and application of this technology could also extend the possibilities of nuclear energy. A large number of liquid-fuelled reactor concepts and accelerator-driven subcritical systems have been proposed as transmuters. Several of these consider fluoride-based molten salts as the liquid fuel and coolant medium. The thermal-hydraulic behaviour of these systems is expected to be fundamentally different from that of the widely used water-cooled reactors with solid fuel. For the large flow domains involved, three-dimensional thermal-hydraulic analysis appears to be the applicable method. Since the fuel is also the coolant medium, a strong coupling between neutronics and thermal-hydraulics can be expected as well. In the present paper the application of Computational Fluid Dynamics to three-dimensional thermal-hydraulic simulations of molten salt reactor concepts is introduced. In our past and recent work, several calculations were carried out to investigate the capabilities of Computational Fluid Dynamics through the analysis of different molten salt reactor concepts. A homogeneous single-region molten salt reactor concept is studied and optimised. Another single-region reactor concept is also introduced; this concept has internal heat exchangers in the flow domain, and the molten salt is circulated by natural convection. The analysis of the MSRE experiment is also part of our work, since it may form a good background from the validation point of view. The results of the Computational Fluid Dynamics calculations for these concepts are presented in the paper. In further work, our objective is to investigate the thermal-hydraulics of the multi-region molten salt reactor (Authors)
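
    As a back-of-the-envelope companion to the natural-convection concept above, the steady circulation rate of such a loop can be estimated by balancing the buoyancy head against the loop pressure losses. All geometry and salt properties below are assumed, FLiBe-like illustrative values, not taken from the paper, which resolves the flow in full 3-D CFD.

    ```python
    # Steady natural circulation in a molten salt loop -- order-of-magnitude sketch.
    import numpy as np

    g = 9.81        # m/s^2
    beta = 2.5e-4   # 1/K, thermal expansion coefficient (assumed)
    rho = 2000.0    # kg/m^3, salt density (assumed)
    cp = 2400.0     # J/(kg K), specific heat (assumed)
    dT = 100.0      # K, core-to-heat-exchanger temperature rise (assumed)
    H = 3.0         # m, elevation difference between thermal centres (assumed)
    K = 20.0        # total loop loss coefficient, friction + form (assumed)
    A = 0.05        # m^2, flow cross-section (assumed)

    # Steady state: buoyancy head rho*g*beta*dT*H balances losses K*rho*v^2/2
    v = np.sqrt(2.0 * g * beta * dT * H / K)
    m_dot = rho * v * A
    power = m_dot * cp * dT     # heat removable at this flow and temperature rise

    print(f"loop velocity:   {v:.2f} m/s")
    print(f"mass flow rate:  {m_dot:.1f} kg/s")
    print(f"removable power: {power / 1e6:.1f} MW")
    ```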

  17. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide

    2002-03-01

    The SHEAT code, developed at the Japan Atomic Energy Research Institute, performs probabilistic seismic hazard analysis, one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. SHEAT was first developed as a large-sized (mainframe) computer version; a personal computer version was added in 2001 to improve the operational efficiency and generality of the code. Earthquake hazard analysis, display, and print functions are all available through the Graphical User Interface. With the SHEAT for PC code, seismic hazard, defined as the annual exceedance frequency of earthquake ground motions at various levels of intensity at a given site, is calculated in the same two steps as in the large-sized computer version. The first is the modeling of earthquake generation around the site: future earthquake generation (locations, magnitudes, and frequencies of postulated earthquakes) is modeled based on historical earthquake records, active fault data, and expert judgment. The second is the calculation of the probabilistic seismic hazard at the site: an earthquake ground motion is calculated for each postulated earthquake using an attenuation model that takes its standard deviation into account, and the seismic hazard at the site is then obtained by summing the frequencies of ground motions over all the earthquakes. This document is the user's manual for the SHEAT for PC code. It includes: (1) an outline of the code, covering the overall concept, logical process, code structure, data files used, and special characteristics; (2) the functions of the subprograms and the analytical models within them; (3) guidance on input and output data; (4) a sample run result; and (5) an operational manual. (author)
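
    The two-step summation described above is easy to illustrate. The sketch below is not SHEAT's implementation: the attenuation model, its coefficients, and the three postulated events are hypothetical placeholders, but the structure (median ground motion per event, log-normal scatter, rate-weighted sum of exceedance probabilities) follows the text.

    ```python
    # Minimal probabilistic seismic hazard curve -- illustrative sketch only.
    import numpy as np
    from scipy.stats import norm

    # Step 1: postulated earthquakes around the site
    # (magnitude, distance in km, annual occurrence frequency) -- hypothetical.
    events = [
        (6.0, 30.0, 0.02),
        (6.5, 50.0, 0.008),
        (7.2, 80.0, 0.001),
    ]

    # Hypothetical attenuation model: median ln(PGA in g), with a constant
    # log-normal standard deviation SIGMA taken into account per the text.
    def ln_median_pga(magnitude, distance_km):
        return -3.5 + 0.9 * magnitude - 1.2 * np.log(distance_km + 10.0)

    SIGMA = 0.6

    # Step 2: annual exceedance frequency at each ground-motion level,
    # summed over all postulated earthquakes.
    pga_levels = np.logspace(-2, 0, 50)          # 0.01 g .. 1 g
    hazard = np.zeros_like(pga_levels)
    for magnitude, distance, annual_rate in events:
        ln_med = ln_median_pga(magnitude, distance)
        prob_exceed = 1.0 - norm.cdf((np.log(pga_levels) - ln_med) / SIGMA)
        hazard += annual_rate * prob_exceed

    for a, h in zip(pga_levels[::10], hazard[::10]):
        print(f"PGA > {a:.3f} g: {h:.2e} /yr")
    ```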

  18. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Florea, S.; Pavelescu, M.

    2004-01-01

    Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subjected to harsh conditions, in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working under harsh conditions, in which, besides corrosive and thermal aggression, long-term irradiation interferes, with implicit consequences for the evolution of material properties. This inevitably leads to scatter in the time-dependent material properties, whose dynamic evolution is subject to a great degree of uncertainty. These are the reasons for developing, alongside deterministic evaluations with computer codes, probabilistic and statistical methods to predict the structural component response. This work initiates the possibility of extending the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting from the deterministic analysis performed with the CANTUP computer code, a code developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose, the structure of the deterministic CANTUP computer code was reviewed, and the code was adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran PowerStation platform. In order to perform probabilistic evaluations, a part was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran PowerStation platform, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value, with a 5% standard deviation, was simulated for the Young's modulus material property in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subject to probabilistic evaluation. All the values of these properties obtained for all the values for…
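
    The Monte Carlo scheme described above (the original wraps a Fortran code via an IMSL random-number subroutine) reduces to: sample Young's modulus from a normal distribution around its deterministic value with 5% standard deviation, re-run the deterministic model per sample, and collect statistics of the response. In the sketch below, the deterministic model is a hypothetical stand-in for CANTUP's creep evaluation, and all numerical values are illustrative assumptions.

    ```python
    # Monte Carlo propagation of Young's modulus uncertainty -- illustrative sketch.
    import numpy as np

    rng = np.random.default_rng(seed=42)

    E_DETERMINISTIC = 97.0e9   # Pa, assumed deterministic Young's modulus
    REL_STD = 0.05             # 5% standard deviation, as in the text
    N_SAMPLES = 10_000

    def tube_deflection(young_modulus):
        """Hypothetical stand-in for the deterministic code: a simply supported
        tube under uniform load, where deflection scales with 1/E."""
        w, length, inertia = 1.0e3, 6.0, 1.0e-5   # N/m, m, m^4 (illustrative)
        return 5.0 * w * length**4 / (384.0 * young_modulus * inertia)

    # Normal distribution around the deterministic value, 5% standard deviation
    samples = rng.normal(E_DETERMINISTIC, REL_STD * E_DETERMINISTIC, N_SAMPLES)
    deflections = tube_deflection(samples)

    print(f"deterministic deflection: {tube_deflection(E_DETERMINISTIC)*1e3:.2f} mm")
    print(f"mean +/- std over {N_SAMPLES} samples: "
          f"{deflections.mean()*1e3:.2f} +/- {deflections.std()*1e3:.2f} mm")
    ```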

  19. Complexity and Intensionality in a Type-1 Framework for Computable Analysis

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2005-01-01

    This paper describes a type-1 framework for computable analysis designed to facilitate efficient implementations and discusses properties that have not been well studied before for type-1 approaches: the introduction of complexity measures for type-1 representations of real functions, and ways...

  20. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous: they can be found in biology, finance, the humanities, management sciences, medicine, physics, and similar fields. For many problems in these fields there are no conventional ways to solve them completely, mathematically or analytically, at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model, and analyze complex systems so that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation, and fuzzy systems. Like only a few other researchers in the field, Rudolf Kruse has contributed in many important ways to the understanding, modeling, and application of computational intelligence methods. On the occasion of his 60th birthday, a collection of original papers by leading researchers in the field of computational intell...