WorldWideScience

Sample records for computational analyses suggest

  1. Proteomics computational analyses suggest that the bornavirus glycoprotein is a class III viral fusion protein (γ penetrene)

    Directory of Open Access Journals (Sweden)

    Garry Robert F

    2009-09-01

    Full Text Available Abstract Background Borna disease virus (BDV) is the type member of the Bornaviridae, a family of viruses that induce often fatal neurological diseases in horses, sheep and other animals, and have been proposed to have roles in certain psychiatric diseases of humans. The BDV glycoprotein (G) is an extensively glycosylated protein that migrates with an apparent molecular mass of 84 to 94 kilodaltons (kDa). BDV G is post-translationally cleaved by the cellular subtilisin-like protease furin into two subunits, a 41 kDa amino terminal protein GP1 and a 43 kDa carboxyl terminal protein GP2. Results Class III viral fusion proteins (VFP) encoded by members of the Rhabdoviridae, Herpesviridae and Baculoviridae have an internal fusion domain comprised of beta sheets, other beta sheet domains, an extended alpha helical domain, a membrane proximal stem domain and a carboxyl terminal anchor. Proteomics computational analyses suggest that the structural/functional motifs that characterize class III VFP are located collinearly in BDV G. Structural models were established for BDV G based on the post-fusion structure of a prototypic class III VFP, vesicular stomatitis virus glycoprotein (VSV G). Conclusion These results suggest that G encoded by members of the Bornaviridae are class III VFPs (gamma-penetrenes).

  2. The role of CFD computer analyses in hydrogen safety management

    International Nuclear Information System (INIS)

    Komen, E.M.J; Visser, D.C; Roelofs, F.; Te Lintelo, J.G.T

    2014-01-01

    The risks of hydrogen release and combustion during a severe accident in a light water reactor have attracted considerable attention after the Fukushima accident in Japan. Reliable computer analyses are needed for the optimal design of hydrogen mitigation systems, e.g. passive autocatalytic recombiners (PARs), and for the assessment of the associated residual risk of hydrogen combustion. Traditionally, so-called Lumped Parameter (LP) computer codes are being used for these purposes. In the last decade, significant progress has been made in the development, validation, and application of more detailed, three-dimensional Computational Fluid Dynamics (CFD) simulations for hydrogen safety analyses. The objective of the current paper is to address the following questions: - When are CFD computer analyses needed complementary to the traditional LP code analyses for hydrogen safety management? - What is the validation status of CFD computer codes for hydrogen distribution, mitigation, and combustion analyses? - Can CFD computer analyses nowadays be executed in a practical and reliable way for full-scale containments? The validation status and reliability of CFD code simulations will be illustrated by validation analyses performed for experiments executed in the PANDA, THAI, and ENACCEF facilities. (authors)

  3. Computer- and Suggestion-based Cognitive Rehabilitation following Acquired Brain Injury

    DEFF Research Database (Denmark)

    Lindeløv, Jonas Kristoffer

    . That is, training does not cause cognitive transfer and thus does not constitute “brain training” or “brain exercise” of any clinical relevance. A larger study found more promising results for a suggestion-based treatment in a hypnotic procedure. Patients improved to above population average in a matter...... of 4-8 hours, making this by far the most effective treatment compared to computer-based training, physical exercise, pharmaceuticals, meditation, and attention process training. The contrast between computer-based methods and the hypnotic suggestion treatment may reflect a more general discrepancy...

  4. Data Processing: Fifteen Suggestions for Computer Training in Your Business Education Classes.

    Science.gov (United States)

    Barr, Lowell L.

    1980-01-01

    Presents 15 suggestions for training business education students in the use of computers. Suggestions involve computer language, method of presentation, laboratory time, programming assignments, instructions and handouts, problem solving, deadlines, reviews, programming concepts, programming logic, documentation, and defensive programming. (CT)

  5. A computer program for multiple decrement life table analyses.

    Science.gov (United States)

    Poole, W K; Cooley, P C

    1977-06-01

    Life table analysis has traditionally been the tool of choice in analyzing distribution of "survival" times when a parametric form for the survival curve could not be reasonably assumed. Chiang, in two papers [1,2], formalized the theory of life table analyses in a Markov chain framework and derived maximum likelihood estimates of the relevant parameters for the analyses. He also discussed how the techniques could be generalized to consider competing risks and follow-up studies. Although various computer programs exist for doing different types of life table analysis [3], to date there has not been a generally available, well documented computer program to carry out multiple decrement analyses, either by Chiang's or any other method. This paper describes such a program developed by Research Triangle Institute. A user's manual, available at printing cost, supplements the contents of this paper with a discussion of the formulas used and a program listing.
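
    As a rough illustration of the multiple decrement idea described in this record (not the RTI program or Chiang's maximum likelihood estimator), the following sketch computes crude cause-specific probabilities of decrement for a hypothetical cohort with complete follow-up in each interval.

      # Minimal multiple-decrement life table sketch (illustrative, not the RTI program).
      # Assumes complete follow-up in each interval; deaths_by_cause[i][k] = deaths from
      # cause k in interval i.

      def multiple_decrement_table(l0, deaths_by_cause):
          """Return per-interval survivors and cause-specific probabilities of decrement."""
          rows = []
          alive = l0
          for deaths in deaths_by_cause:          # one list of cause-specific deaths per interval
              total = sum(deaths)
              q_all = total / alive               # overall probability of decrement in the interval
              q_cause = [d / alive for d in deaths]  # crude cause-specific probabilities
              rows.append({"l": alive, "q_all": q_all, "q_cause": q_cause})
              alive -= total                      # survivors entering the next interval
          return rows

      if __name__ == "__main__":
          # Hypothetical cohort of 1000 with two competing causes over three intervals.
          table = multiple_decrement_table(1000, [[30, 10], [25, 15], [20, 20]])
          for i, row in enumerate(table):
              print(i, row["l"], round(row["q_all"], 4), [round(q, 4) for q in row["q_cause"]])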

  6. Development of the computer code system for the analyses of PWR core

    International Nuclear Information System (INIS)

    Tsujimoto, Iwao; Naito, Yoshitaka.

    1992-11-01

    This report is one of the materials for the work titled 'Development of the computer code system for the analyses of PWR core phenomena', which is performed under contracts between Shikoku Electric Power Company and JAERI. In this report, the numerical methods adopted in our computer code system are described, that is, 'The basic course and the summary of the analysing method', 'Numerical method for solving the Boltzmann equation', 'Numerical method for solving the thermo-hydraulic equations' and 'Description on the computer code system'. (author)

  7. Computer-based medical education in Benha University, Egypt: knowledge, attitude, limitations, and suggestions.

    Science.gov (United States)

    Bayomy, Hanaa; El Awadi, Mona; El Araby, Eman; Abed, Hala A

    2016-12-01

    Computer-assisted medical education has been developed to enhance learning and enable high-quality medical care. This study aimed to assess computer knowledge and attitude toward the inclusion of computers in medical education among second-year medical students in Benha Faculty of Medicine, Egypt, to identify limitations, and to obtain suggestions for successful computer-based learning. This was a one-group pre-post-test study carried out on second-year students in Benha Faculty of Medicine. A structured self-administered questionnaire was used to compare students' knowledge, attitude, limitations, and suggestions toward computer usage in medical education before and after the computer course, to evaluate the change in students' responses. The majority of students were familiar with use of the mouse and keyboard, basic word processing, internet and web searching, and e-mail both before and after the computer course. The proportion of students who were familiar with software programs other than word processing and with trouble-shooting software/hardware was significantly higher after the course (P…) … the computer (P=0.008), the inclusion of a computer skills course in medical education, downloading lecture handouts, and computer-based exams (P…) … computers limited the inclusion of computers in medical education (P…) … computer labs, lack of Information Technology staff mentoring, large number of students, unclear course outline, and lack of internet access were more frequently reported before the course (P…) … computer labs, inviting Information Technology staff to support computer teaching, and the availability of free Wi-Fi internet access covering several areas in the university campus; all would support computer-assisted medical education. Medical students in Benha University are computer literate, which allows for computer-based medical education. Staff training, provision of computer labs, and internet access are essential requirements for enhancing computer usage in medical education.

  8. Extending and Applying Spartan to Perform Temporal Sensitivity Analyses for Predicting Changes in Influential Biological Pathways in Computational Models.

    Science.gov (United States)

    Alden, Kieran; Timmis, Jon; Andrews, Paul S; Veiga-Fernandes, Henrique; Coles, Mark

    2017-01-01

    Through integrating real time imaging, computational modelling, and statistical analysis approaches, previous work has suggested that the induction of and response to cell adhesion factors is the key initiating pathway in early lymphoid tissue development, in contrast to the previously accepted view that the process is triggered by chemokine mediated cell recruitment. These model derived hypotheses were developed using spartan, an open-source sensitivity analysis toolkit designed to establish and understand the relationship between a computational model and the biological system that model captures. Here, we extend the functionality available in spartan to permit the production of statistical analyses that contrast the behavior exhibited by a computational model at various simulated time-points, enabling a temporal analysis that could suggest whether the influence of biological mechanisms changes over time. We exemplify this extended functionality by using the computational model of lymphoid tissue development as a time-lapse tool. By generating results at twelve-hour intervals, we show how the extensions to spartan have been used to suggest that lymphoid tissue development could be biphasic, and predict the time-point when a switch in the influence of biological mechanisms might occur.
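
    Spartan itself is an R toolkit, so the sketch below is not its API; it only illustrates, with a toy model and hypothetical parameter names, the kind of temporal sensitivity analysis this record describes: rank correlations between sampled parameters and a simulated response computed at several simulated time points, so that a shift in a parameter's influence over time becomes visible.

      # Illustrative sketch only (not the spartan R package): Spearman rank correlations
      # between sampled parameters and a simulated response at several time points.
      # Parameter names and the toy model are invented.
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(0)
      n_runs = 200
      params = {
          "adhesionFactorExpression": rng.uniform(0.0, 1.0, n_runs),
          "chemokineExpression": rng.uniform(0.0, 1.0, n_runs),
      }
      time_points = [12, 24, 36, 48, 60, 72]  # simulated hours

      def toy_model(p, t):
          """Stand-in for a simulation output whose main driver changes over time."""
          early = p["adhesionFactorExpression"] * np.exp(-t / 24.0)
          late = p["chemokineExpression"] * (1.0 - np.exp(-t / 24.0))
          return early + late + rng.normal(0.0, 0.05, n_runs)

      for t in time_points:
          response = toy_model(params, t)
          line = [f"t={t:3d}h"]
          for name, values in params.items():
              rho, _ = spearmanr(values, response)
              line.append(f"{name}: rho={rho:+.2f}")
          print("  ".join(line))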

  9. Inhalation toxicity of indoor air pollutants in Drosophila melanogaster using integrated transcriptomics and computational behavior analyses

    Science.gov (United States)

    Eom, Hyun-Jeong; Liu, Yuedan; Kwak, Gyu-Suk; Heo, Muyoung; Song, Kyung Seuk; Chung, Yun Doo; Chon, Tae-Soo; Choi, Jinhee

    2017-06-01

    We conducted an inhalation toxicity test on the alternative animal model, Drosophila melanogaster, to investigate potential hazards of indoor air pollution. The inhalation toxicity of toluene and formaldehyde was investigated using comprehensive transcriptomics and computational behavior analyses. The ingenuity pathway analysis (IPA) based on microarray data suggests the involvement of pathways related to immune response, stress response, and metabolism in formaldehyde and toluene exposure based on hub molecules. We conducted a toxicity test using mutants of the representative genes in these pathways to explore the toxicological consequences of alterations of these pathways. Furthermore, extensive computational behavior analysis showed that exposure to either toluene or formaldehyde reduced most of the behavioral parameters of both wild-type and mutants. Interestingly, behavioral alteration caused by toluene or formaldehyde exposure was most severe in the p38b mutant, suggesting that the defects in the p38 pathway underlie behavioral alteration. Overall, the results indicate that exposure to toluene and formaldehyde via inhalation causes severe toxicity in Drosophila, by inducing significant alterations in gene expression and behavior, suggesting that Drosophila can be used as a potential alternative model in inhalation toxicity screening.

  10. Dispensing processes impact apparent biological activity as determined by computational and statistical analyses.

    Directory of Open Access Journals (Sweden)

    Sean Ekins

    Full Text Available Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research.

  11. Suggested Approaches to the Measurement of Computer Anxiety.

    Science.gov (United States)

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  12. Different perspectives on the use of personal computers for technical analyses

    International Nuclear Information System (INIS)

    Libby, R.A.; Doherty, A.L.

    1986-01-01

    Personal computers (PCs) have widespread availability and use in many technical environments. The machines may have initially been justified for use as word processors or for data base management, but many technical applications are being performed, and often the computer codes used in these technical analyses have been moved from large mainframe machines. The general feeling in the user community is that the free computer time on these machines justifies moving as many applications as possible from the large computer systems. Many of these PC applications cannot be justified if the total cost of using microcomputers is considered. A Hanford-wide local area network (LAN) is being established which allows individual PCs to be used as terminals to connect to mainframe computers at high data transfer rates (9600 baud). This system allows fast, easy connection to a variety of different computers with a few keystrokes. The LAN eliminates the problem of low-speed communication with mainframe computers and makes operation on the mainframes as simple as operation on the host PC itself.

  13. Analysing the doctor-patient-computer relationship: the use of video data

    Directory of Open Access Journals (Sweden)

    Christopher Pearce

    2006-12-01

    Full Text Available This paper examines the utility of using digital video data in observational studies involving doctors' and patients' use of computers in the consultation. Previous observational studies have used either direct observations or analogue videotapes. We describe a method currently in use in a study examining how doctors, patients and computers interact in the consultation. The study is set in general practice as this is the most clinically computerised section of the Australian healthcare system. Computers are now used for clinical functions in 90% of doctors' surgeries. With this rapid rise of computerisation, concerns have been expressed as to how the computer will affect the doctor-patient relationship. To assess how doctors, patients and computers interact, we have chosen an observational technique, namely to make digital videotapes of actual consultations. This analysis is based on a theoretical framework derived from dramaturgical analysis. Data are gathered from general practitioners who are high-level users of computers, as defined by their use of progress notes, as well as prescribing and test ordering. The subsequent digital data is then transferred onto computer and analysed according to our conceptual framework, making use of video-tagging software.

  14. Advanced computational tools and methods for nuclear analyses of fusion technology systems

    International Nuclear Information System (INIS)

    Fischer, U.; Chen, Y.; Pereslavtsev, P.; Simakov, S.P.; Tsige-Tamirat, H.; Loughlin, M.; Perel, R.L.; Petrizzi, L.; Tautges, T.J.; Wilson, P.P.H.

    2005-01-01

    An overview is presented of advanced computational tools and methods developed recently for nuclear analyses of Fusion Technology systems such as the experimental device ITER ('International Thermonuclear Experimental Reactor') and the intense neutron source IFMIF ('International Fusion Material Irradiation Facility'). These include Monte Carlo based computational schemes for the calculation of three-dimensional shut-down dose rate distributions, methods, codes and interfaces for the use of CAD geometry models in Monte Carlo transport calculations, algorithms for Monte Carlo based sensitivity/uncertainty calculations, as well as computational techniques and data for IFMIF neutronics and activation calculations. (author)

  15. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses

    Science.gov (United States)

    Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-01

    Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however

  16. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.

    Science.gov (United States)

    Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-26

    Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than
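
    As a rough illustration of the pooling step described in these two records (the study numbers below are made up, and the review may have used different estimators and software), this sketch computes standardized mean differences as Hedges' g and combines them with fixed-effect inverse-variance weights.

      # Minimal meta-analysis sketch (hypothetical study data, not from the review):
      # standardized mean differences as Hedges' g, pooled with fixed-effect
      # inverse-variance weights.
      import math

      def hedges_g(m1, sd1, n1, m2, sd2, n2):
          """SMD between intervention (1) and control (2) with small-sample correction."""
          df = n1 + n2 - 2
          s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
          d = (m1 - m2) / s_pooled
          j = 1.0 - 3.0 / (4.0 * df - 1.0)       # Hedges' small-sample correction
          g = j * d
          var_g = (n1 + n2) / (n1 * n2) + g**2 / (2.0 * (n1 + n2))
          return g, var_g

      # (knowledge-score mean, SD, n) for intervention and control in three invented RCTs
      studies = [
          (78.0, 12.0, 60, 70.0, 13.0, 58),
          (64.0, 15.0, 45, 61.0, 14.0, 47),
          (82.0, 10.0, 70, 73.0, 11.0, 72),
      ]

      weights_sum, weighted_sum = 0.0, 0.0
      for m1, sd1, n1, m2, sd2, n2 in studies:
          g, var_g = hedges_g(m1, sd1, n1, m2, sd2, n2)
          w = 1.0 / var_g
          weighted_sum += w * g
          weights_sum += w

      pooled_smd = weighted_sum / weights_sum
      se = math.sqrt(1.0 / weights_sum)
      print(f"pooled SMD = {pooled_smd:.2f} "
            f"(95% CI {pooled_smd - 1.96*se:.2f} to {pooled_smd + 1.96*se:.2f})")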

  17. Computer Learner Corpora: Analysing Interlanguage Errors in Synchronous and Asynchronous Communication

    Science.gov (United States)

    MacDonald, Penny; Garcia-Carbonell, Amparo; Carot Sierra, Jose Miguel

    2013-01-01

    This study focuses on the computer-aided analysis of interlanguage errors made by the participants in the telematic simulation IDEELS (Intercultural Dynamics in European Education through on-Line Simulation). The synchronous and asynchronous communication analysed was part of the MiLC Corpus, a multilingual learner corpus of texts written by…

  18. Computational and experimental analyses of the wave propagation through a bar structure including liquid-solid interface

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Jin [UST Graduate School, Daejeon (Korea, Republic of); Rhee, Hui Nam [Division of Mechanical and Aerospace Engineering, Sunchon National University, Sunchon (Korea, Republic of); Yoon, Doo Byung; Park, Jin Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-08-15

    In this research, we study the propagation of longitudinal and transverse waves through a metal rod including a liquid layer using computational and experimental analyses. The propagation characteristics of longitudinal and transverse waves obtained by the computational and experimental analyses were consistent with wave propagation theory for both cases, that is, the homogeneous metal rod and the metal rod including a liquid layer. The fluid-structure interaction modeling technique developed for the computational wave propagation analysis in this research can be applied to more complex structures including solid-liquid interfaces.
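
    As a reminder of the theory the measured propagation characteristics are compared against, the short sketch below computes longitudinal and shear (transverse) wave speeds from assumed generic steel properties; the values are illustrative, not those of the test specimen in this study.

      # Illustrative wave speeds in a metal rod (assumed generic steel properties,
      # not the specimen used in the study):
      #   bulk longitudinal: c_L = sqrt(E(1-nu) / (rho(1+nu)(1-2nu)))
      #   shear/transverse:  c_T = sqrt(G / rho), with G = E / (2(1+nu))
      #   thin-rod (bar):    c_bar = sqrt(E / rho)
      import math

      E = 200e9      # Young's modulus, Pa (assumed)
      nu = 0.30      # Poisson's ratio (assumed)
      rho = 7850.0   # density, kg/m^3 (assumed)

      G = E / (2.0 * (1.0 + nu))                  # shear modulus
      c_longitudinal = math.sqrt(E * (1.0 - nu) / (rho * (1.0 + nu) * (1.0 - 2.0 * nu)))
      c_transverse = math.sqrt(G / rho)
      c_bar = math.sqrt(E / rho)

      print(f"bulk longitudinal speed ~ {c_longitudinal:.0f} m/s")
      print(f"transverse (shear) speed ~ {c_transverse:.0f} m/s")
      print(f"thin-rod longitudinal speed ~ {c_bar:.0f} m/s")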

  19. Development of a system of computer codes for severe accident analyses and its applications

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Soon Hong; Cheon, Moon Heon; Cho, Nam jin; No, Hui Cheon; Chang, Hyeon Seop; Moon, Sang Kee; Park, Seok Jeong; Chung, Jee Hwan [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1991-12-15

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in Nuclear Power Plants. This system of codes is necessary to conduct individual plant examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, extract the plant-specific vulnerabilities for severe accidents, and at the same time gather ideas for enhancing overall accident resistance. The scope and contents of this study are as follows: development of a system of computer codes for severe accident analyses, and development of a severe accident management strategy.

  20. Development of a system of computer codes for severe accident analyses and its applications

    International Nuclear Information System (INIS)

    Chang, Soon Hong; Cheon, Moon Heon; Cho, Nam jin; No, Hui Cheon; Chang, Hyeon Seop; Moon, Sang Kee; Park, Seok Jeong; Chung, Jee Hwan

    1991-12-01

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in Nuclear Power Plants. This system of codes is necessary to conduct individual plant examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, extract the plant-specific vulnerabilities for severe accidents, and at the same time gather ideas for enhancing overall accident resistance. The scope and contents of this study are as follows: development of a system of computer codes for severe accident analyses, and development of a severe accident management strategy.

  1. Proteomics computational analyses suggest that baculovirus GP64 superfamily proteins are class III penetrenes

    Directory of Open Access Journals (Sweden)

    Garry Robert F

    2008-02-01

    Full Text Available Abstract Background Members of the Baculoviridae encode two types of proteins that mediate virus:cell membrane fusion and penetration into the host cell. Alignments of primary amino acid sequences indicate that baculovirus fusion proteins of group I nucleopolyhedroviruses (NPV) form the GP64 superfamily. The structure of these viral penetrenes has not been determined. The GP64 superfamily includes the glycoprotein (GP) encoded by members of the Thogotovirus genus of the Orthomyxoviridae. The entry proteins of other baculoviruses, group II NPV and granuloviruses, are class I penetrenes. Results Class III penetrenes encoded by members of the Rhabdoviridae and Herpesviridae have an internal fusion domain comprised of beta sheets, other beta sheet domains, an extended alpha helical domain, a membrane proximal stem domain and a carboxyl terminal anchor. Similar sequences and structural/functional motifs that characterize class III penetrenes are located collinearly in GP64 of group I baculoviruses and related glycoproteins encoded by thogotoviruses. Structural models based on a prototypic class III penetrene, vesicular stomatitis virus glycoprotein (VSV G), were established for Thogoto virus (THOV) GP and Autographa californica multiple NPV (AcMNPV) GP64, demonstrating feasible cysteine linkages. Glycosylation sites in THOV GP and AcMNPV GP64 appear in similar model locations to the two glycosylation sites of VSV G. Conclusion These results suggest that proteins in the GP64 superfamily are class III penetrenes.

  2. Representational constraints on children's suggestibility.

    Science.gov (United States)

    Ceci, Stephen J; Papierno, Paul B; Kulkofsky, Sarah

    2007-06-01

    In a multistage experiment, twelve 4- and 9-year-old children participated in a triad rating task. Their ratings were mapped with multidimensional scaling, from which Euclidean distances were computed to operationalize semantic distance between items in target pairs. These children and age-mates then participated in an experiment that employed these target pairs in a story, which was followed by a misinformation manipulation. Analyses linked individual and developmental differences in suggestibility to children's representations of the target items. Semantic proximity was a strong predictor of differences in suggestibility: The closer a suggested distractor was to the original item's representation, the greater was the distractor's suggestive influence. The triad participants' semantic proximity subsequently served as the basis for correctly predicting memory performance in the larger group. Semantic proximity enabled a priori counterintuitive predictions of reverse age-related trends to be confirmed whenever the distance between representations of items in a target pair was greater for younger than for older children.
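
    The distance computation described above can be illustrated with a small sketch (toy dissimilarities, not the children's triad ratings): dissimilarities are embedded with classical multidimensional scaling, and Euclidean distances between the embedded items then serve as the semantic-proximity measure.

      # Toy illustration of the analysis step described above (invented dissimilarities,
      # not the study's ratings): classical multidimensional scaling followed by
      # Euclidean distances between the embedded items.
      import numpy as np

      items = ["dog", "cat", "car", "truck"]
      # Symmetric dissimilarity matrix (0 = identical); values are made up.
      D = np.array([
          [0.0, 1.0, 4.0, 4.5],
          [1.0, 0.0, 4.2, 4.6],
          [4.0, 4.2, 0.0, 0.8],
          [4.5, 4.6, 0.8, 0.0],
      ])

      # Classical (Torgerson) MDS: double-center the squared dissimilarities.
      n = D.shape[0]
      J = np.eye(n) - np.ones((n, n)) / n
      B = -0.5 * J @ (D ** 2) @ J
      eigvals, eigvecs = np.linalg.eigh(B)
      order = np.argsort(eigvals)[::-1][:2]              # keep the two largest dimensions
      coords = eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 0.0))

      # Euclidean distance between embedded items operationalizes semantic proximity.
      def distance(a, b):
          i, j = items.index(a), items.index(b)
          return float(np.linalg.norm(coords[i] - coords[j]))

      print("dog-cat :", round(distance("dog", "cat"), 2))
      print("dog-car :", round(distance("dog", "car"), 2))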

  3. A review of experiments and computer analyses on RIAs

    International Nuclear Information System (INIS)

    Jernkvist, L.O.; Massih, A.R.; In de Betou, J.

    2010-01-01

    Reactivity initiated accidents (RIAs) are nuclear reactor accidents that involve an unwanted increase in fission rate and reactor power. Reactivity initiated accidents in power reactors may occur as a result of reactor control system failures, control element ejections or events caused by rapid changes in temperature or pressure of the coolant/moderator. Our current understanding of reactivity initiated accidents and their consequences is based largely on three sources of information: 1) best-estimate computer analyses of the reactor response to postulated accident scenarios, 2) pulse-irradiation tests on instrumented fuel rodlets, carried out in research reactors, and 3) out-of-pile separate effect tests, targeted to explore key phenomena under RIA conditions. In recent years, we have reviewed, compiled and analysed these three categories of data. The result is a state-of-the-art report on fuel behaviour under RIA conditions, which is currently being published by the OECD Nuclear Energy Agency. The purpose of this paper is to give a brief summary of this report.

  4. Computer aided plant engineering: An analysis and suggestions for computer use

    International Nuclear Information System (INIS)

    Leinemann, K.

    1979-09-01

    To get indications of and boundary conditions for computer use in plant engineering, an analysis of the engineering process was done. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for integration of CAD subsystems in plant engineering should be a central database, which is described by characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer aided plant engineering are shortly illustrated by two examples. The analysis leads to the conclusion that an interactive graphic system for manipulation of net-like structured data, usable for various subtasks, should be the base for computer aided plant engineering. (orig.) [de

  5. The Incorporation of Micro-Computer Technology into School Mathematics: Some Suggestions for Middle and Senior Mathematics Courses.

    Science.gov (United States)

    Newton, Bill

    1987-01-01

    Argues that the use of computer technologies in secondary schools should change the nature of mathematics education. Urges the rethinking of the uses of traditional paper-and-pencil computations. Suggests some computer applications for elementary algebra and for problem solving in arithmetic and calculus. (TW)

  6. Insulin mimetics in Urtica dioica: structural and computational analyses of Urtica dioica extracts.

    Science.gov (United States)

    Domola, Masoud Shabani; Vu, Vivian; Robson-Doucette, Christine A; Sweeney, Gary; Wheeler, Michael B

    2010-06-01

    Urtica dioica (UD) is a plant shown to reduce blood glucose levels upon oral ingestion; however, neither its active component nor its mechanism of action has been identified. One active fraction of this extract, termed UD-1, was separated by molecular sieve column chromatography and purified by high performance liquid chromatography (HPLC). While UD-1 did not stimulate insulin secretion in glucose-responsive MIN6 clonal beta-cells, chronic exposure (24 h) significantly enhanced glucose uptake (approximately 1.5-fold) in L6-GLUT4myc myoblast cells. Using HPLC and MALDI-TOF, we further purified the UD-1 fraction into two fractions termed UD-1A and UD-1B. Computational and structural analyses strongly suggested that the antidiabetic activity of UD-1 was due to one or more structurally related cyclical peptides that facilitate glucose uptake by forming unique glucose-permeable pores. The structure and function of these glucose-conducting pores are discussed herein.

  7. Types of suggestibility: Relationships among compliance, indirect, and direct suggestibility.

    Science.gov (United States)

    Polczyk, Romuald; Pasek, Tomasz

    2006-10-01

    It is commonly believed that direct suggestibility, referring to overt influence, and indirect suggestibility, in which the intention to influence is hidden, correlate poorly. This study demonstrates that they are substantially related, provided that they tap similar areas of influence. Test results from 103 students, 55 women and 48 men, were entered into regression analyses. Indirect suggestibility, as measured by the Sensory Suggestibility Scale for Groups, and compliance, measured by the Gudjonsson Compliance Scale, were predictors of direct suggestibility, assessed with the Barber Suggestibility Scale. Spectral analyses showed that indirect suggestibility is more related to difficult tasks on the BSS, but compliance is more related to easy tasks on this scale.
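
    As a toy illustration of the regression step described above (simulated scores, not the 103 students' data), the sketch below fits an ordinary least-squares model predicting a direct-suggestibility score from indirect-suggestibility and compliance scores.

      # Illustrative OLS sketch with simulated data (not the study's scores):
      # predict direct suggestibility (BSS) from indirect suggestibility (SSSG)
      # and compliance (GCS).
      import numpy as np

      rng = np.random.default_rng(42)
      n = 103
      sssg = rng.normal(50, 10, n)                          # indirect suggestibility (simulated)
      gcs = rng.normal(10, 3, n)                            # compliance (simulated)
      bss = 0.4 * sssg + 0.8 * gcs + rng.normal(0, 5, n)    # direct suggestibility (simulated)

      X = np.column_stack([np.ones(n), sssg, gcs])          # design matrix with intercept
      beta, residuals, rank, _ = np.linalg.lstsq(X, bss, rcond=None)

      fitted = X @ beta
      r2 = 1.0 - np.sum((bss - fitted) ** 2) / np.sum((bss - bss.mean()) ** 2)
      print("intercept, b_SSSG, b_GCS:", np.round(beta, 3))
      print("R^2:", round(r2, 3))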

  8. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  9. Experimental and Computational Modal Analyses for Launch Vehicle Models considering Liquid Propellant and Flange Joints

    Directory of Open Access Journals (Sweden)

    Chang-Hoon Sim

    2018-01-01

    Full Text Available In this research, modal tests and analyses are performed for a simplified and scaled first-stage model of a space launch vehicle using liquid propellant. This study aims to establish finite element modeling techniques for computational modal analyses by considering the liquid propellant and flange joints of launch vehicles. The modal tests measure the natural frequencies and mode shapes in the first and second lateral bending modes. As the liquid filling ratio increases, the measured frequencies decrease. In addition, as the number of flange joints increases, the measured natural frequencies increase. Computational modal analyses using the finite element method are conducted. The liquid is modeled by the virtual mass method, and the flange joints are modeled using one-dimensional spring elements along with the node-to-node connection. Comparison of the modal test results and predicted natural frequencies shows good or moderate agreement. The correlation between the modal tests and analyses establishes finite element modeling techniques for modeling the liquid propellant and flange joints of space launch vehicles.
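
    After finite element assembly, the computational modal analysis described here reduces to a generalized eigenvalue problem K φ = ω² M φ. The sketch below shows that step on an arbitrary small lumped-mass chain in which one spring loosely stands in for a flange joint; the numbers are invented, not the scaled launch-vehicle model.

      # Minimal generalized-eigenvalue sketch of a modal analysis: a 3-DOF spring-mass
      # chain (ground - k - m1 - k_joint - m2 - k - m3) in which k_joint loosely stands
      # in for a flange-joint spring element. Values are arbitrary and illustrative.
      import numpy as np
      from scipy.linalg import eigh

      m = 2.0            # kg per lumped mass (assumed)
      k = 1.0e5          # N/m for the shell segments (assumed)
      k_joint = 5.0e4    # N/m for the flange-joint spring element (assumed)

      M = np.diag([m, m, m])
      K = np.array([
          [k + k_joint, -k_joint,      0.0],
          [-k_joint,     k_joint + k, -k  ],
          [0.0,         -k,            k  ],
      ])

      # Generalized eigenvalue problem K*phi = w^2 * M*phi
      eigvals, eigvecs = eigh(K, M)
      freqs_hz = np.sqrt(np.maximum(eigvals, 0.0)) / (2.0 * np.pi)
      print("natural frequencies [Hz]:", np.round(freqs_hz, 2))
      print("first mode shape:", np.round(eigvecs[:, 0], 3))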

  10. Functional and Expression Analyses of the Pneumocystis MAT Genes Suggest Obligate Sexuality through Primary Homothallism within Host Lungs

    Directory of Open Access Journals (Sweden)

    S. Richard

    2018-02-01

    Full Text Available Fungi of the genus Pneumocystis are obligate parasites that colonize mammals’ lungs and are host species specific. Pneumocystis jirovecii and Pneumocystis carinii infect, respectively, humans and rats. They can turn into opportunistic pathogens in immunosuppressed hosts, causing severe pneumonia. Their cell cycle is poorly known, mainly because of the absence of an established method of culture in vitro. It is thought to include both asexual and sexual phases. Comparative genomic analysis suggested that their mode of sexual reproduction is primary homothallism involving a single mating type (MAT) locus encompassing plus and minus genes (matMc, matMi, and matPi; Almeida et al., mBio 6:e02250-14, 2015). Thus, each strain would be capable of sexual reproduction alone (self-fertility). However, this is a working hypothesis derived from computational analyses that is, in addition, based on the genome sequences of single isolates. Here, we tested this hypothesis in the wet laboratory. The function of the P. jirovecii and P. carinii matMc genes was ascertained by restoration of sporulation in the corresponding mutant of fission yeast. Using PCR, we found the same single MAT locus in all P. jirovecii isolates and showed that all three MAT genes are often concomitantly expressed during pneumonia. Extensive homology searches did not identify other types of MAT transcription factors in the genomes or cis-acting motifs flanking the MAT locus that could have been involved in MAT switching or silencing. Our observations suggest that Pneumocystis sexuality through primary homothallism is obligate within host lungs to complete the cell cycle, i.e., produce asci necessary for airborne transmission to new hosts.

  11. Manned systems utilization analysis (study 2.1). Volume 3: LOVES computer simulations, results, and analyses

    Science.gov (United States)

    Stricker, L. T.

    1975-01-01

    The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. It is indicated that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.

  12. Computer analyses for the design, operation and safety of new isotope production reactors: A technology status review

    International Nuclear Information System (INIS)

    Wulff, W.

    1990-01-01

    A review is presented on the currently available technologies for nuclear reactor analyses by computer. The important distinction is made between traditional computer calculation and advanced computer simulation. Simulation needs are defined to support the design, operation, maintenance and safety of isotope production reactors. Existing methods of computer analyses are categorized in accordance with the type of computer involved in their execution: micro, mini, mainframe and supercomputers. Both general and special-purpose computers are discussed. Major computer codes are described, with regard to their use in analyzing isotope production reactors. It has been determined in this review that conventional systems codes (TRAC, RELAP5, RETRAN, etc.) cannot meet four essential conditions for viable reactor simulation: simulation fidelity, on-line interactive operation with convenient graphics, high simulation speed, and low cost. These conditions can be met by special-purpose computers (such as the AD100 of ADI), which are specifically designed for high-speed simulation of complex systems. The greatest shortcoming of existing systems codes (TRAC, RELAP5) is their mismatch between very high computational effort and low simulation fidelity. The drift flux formulation (HIPA) is the viable alternative to the complicated two-fluid model. No existing computer code has the capability of accommodating all important processes in the core geometry of isotope production reactors. Experiments are needed (heat transfer measurements) to provide the necessary correlations. It is important for the nuclear community, in government, industry and universities alike, to begin to take advantage of modern simulation technologies and equipment. 41 refs

  13. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files

  14. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files.

  15. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    Directory of Open Access Journals (Sweden)

    Wilson David P

    2008-02-01

    Full Text Available Abstract SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated.
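
    For readers unfamiliar with the sampling-and-sensitivity workflow this record describes, here is a minimal Python sketch (SaSAT itself is a Matlab toolbox, so none of this is its API, and the model and parameter ranges are invented): Latin hypercube samples of two hypothetical parameters are pushed through a toy epidemic-style model, and partial rank correlation coefficients are then estimated.

      # Minimal sketch of an uncertainty/sensitivity workflow of the kind SaSAT automates
      # (not its API): Latin hypercube sampling followed by partial rank correlation
      # coefficients (PRCC). Model and parameter ranges are invented.
      import numpy as np
      from scipy.stats import rankdata

      rng = np.random.default_rng(1)

      def latin_hypercube(n_samples, bounds):
          """One stratified sample per interval and per parameter, randomly permuted."""
          d = len(bounds)
          u = (rng.random((n_samples, d)) + np.arange(n_samples)[:, None]) / n_samples
          for j in range(d):
              rng.shuffle(u[:, j])
          lo = np.array([b[0] for b in bounds])
          hi = np.array([b[1] for b in bounds])
          return lo + u * (hi - lo)

      def prcc(X, y):
          """Partial rank correlation of each column of X with y."""
          R = np.column_stack([rankdata(X[:, j]) for j in range(X.shape[1])])
          ry = rankdata(y)
          out = []
          for j in range(R.shape[1]):
              others = np.column_stack([np.ones(len(ry)), np.delete(R, j, axis=1)])
              res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
              res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
              out.append(float(np.corrcoef(res_x, res_y)[0, 1]))
          return out

      # Toy epidemic-style model: the response depends strongly on beta, inversely on gamma.
      bounds = [(0.1, 0.9), (0.05, 0.5)]                 # (beta, gamma) ranges, invented
      X = latin_hypercube(500, bounds)
      y = X[:, 0] / X[:, 1] + rng.normal(0, 0.1, 500)    # crude R0-like response

      print("PRCC (beta, gamma):", np.round(prcc(X, y), 2))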

  16. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Directory of Open Access Journals (Sweden)

    Akitoshi Ogawa

    Full Text Available The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  17. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Science.gov (United States)

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.
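
    The distinction between the two analysis styles mentioned in these two records can be illustrated with a toy general linear model (simulated signal, hypothetical regressors, and no haemodynamic convolution, unlike a real fMRI analysis): a blocked condition regressor and a continuous, time-varying feature regressor are fitted to the same simulated voxel time series.

      # Toy illustration of condition-based vs computation-based regressors in a GLM
      # (simulated voxel signal; no HRF convolution, unlike a real fMRI analysis).
      import numpy as np

      rng = np.random.default_rng(7)
      n_scans = 200

      # Condition-based regressor: blocked on/off design (e.g. 3D vs 2D blocks).
      block = np.zeros(n_scans)
      block[np.arange(n_scans) % 40 < 20] = 1.0

      # Computation-based regressor: continuous time-varying stimulus feature
      # (e.g. frame-by-frame disparity), here just a smoothed random signal.
      feature = np.convolve(rng.normal(0, 1, n_scans), np.ones(10) / 10, mode="same")

      # Simulated voxel responding to both regressors, plus noise.
      signal = 2.0 * block + 1.5 * feature + rng.normal(0, 1.0, n_scans)

      X = np.column_stack([np.ones(n_scans), block, feature])   # design matrix
      beta = np.linalg.lstsq(X, signal, rcond=None)[0]
      print("estimated betas (intercept, block, feature):", np.round(beta, 2))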

  18. L'ordinateur et l'analyse grammaticale (The Computer and Grammatical Analysis). Series B-2.

    Science.gov (United States)

    Mepham, Michael S.

    This discussion of the use of computer programming in syntactic analysis covers three major points: (1) a review of basic notions in automatic grammars; (2) a description of the grammar used in a pilot project which analysed the linguistic content of methods of teaching foreign languages; and (3) proposals on the application of the same techniques…

  19. Computational transport phenomena for engineering analyses

    CERN Document Server

    Farmer, Richard C; Cheng, Gary C; Chen, Yen-Sen

    2009-01-01

    Computational Transport Phenomena: Overview; Transport Phenomena; Analyzing Transport Phenomena; A Computational Tool: The CTP Code; Verification, Validation, and Generalization; Summary; Nomenclature; References. The Equations of Change: Introduction; Derivation of the Continuity Equation; Derivation of the Species Continuity Equation; Derivation of the Equation of Motion; Derivation of the General Energy Equation; Non-Newtonian Fluids; General Property Balance; Analytical and Approximate Solutions for the Equations of Change; Summary; Nomenclature; References. Physical Properties: Overview; Real-Fluid Thermodynamics; Chemical Equilibrium ...

  20. Selective insectivory at Toro-Semliki, Uganda: comparative analyses suggest no 'savanna' chimpanzee pattern.

    Science.gov (United States)

    Webster, Timothy H; McGrew, William C; Marchant, Linda F; Payne, Charlotte L R; Hunt, Kevin D

    2014-06-01

    Chimpanzee (Pan troglodytes) insectivory across Africa is ubiquitous. Insects provide a significant nutritional payoff and may be important for chimpanzees in dry, open habitats with narrow diets. We tested this hypothesis at Semliki, Uganda, a long-term dry study site. We evaluated prospects for insectivory by measuring insect abundance along de novo transects and trails, monitoring social insect colonies, and surveying available raw materials for elementary technology. We determined the frequency and nature of insectivory through behavioral observation and fecal analysis. We then compared our results with those from 15 other long-term chimpanzee study sites using a cluster analysis. We found that Semliki chimpanzees are one of the most insectivorous populations studied to date in terms of frequency of consumption, but they are very selective in their insectivory, regularly consuming only weaver ants (Oecophylla longinoda) and honey and bees from hives of Apis mellifera. This selectivity obtains despite having a full range of typical prey species available in harvestable quantities. We suggest that Semliki chimpanzees may face ecological time constraints and therefore bias their predation toward prey taxa that can be quickly consumed. Geographical proximity correlated with the results of the cluster analysis, while rainfall, a relatively gross measure of environment, did not. Because broad taxonomic groups of insects were used in analyses, prey availability was unlikely to have a strong effect on this pattern. Instead, we suggest that transmission of cultural knowledge may play a role in determining chimpanzee prey selection across Africa. Further study is needed to test these hypotheses. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. SURE: a system of computer codes for performing sensitivity/uncertainty analyses with the RELAP code

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1983-02-01

    A package of computer codes has been developed to perform a nonlinear uncertainty analysis on transient thermal-hydraulic systems which are modeled with the RELAP computer code. The package has been applied to uncertainty analyses of experiments in the PWR-BDHT Separate Effects Program at Oak Ridge National Laboratory. The use of FORTRAN programs running interactively on the PDP-10 computer has made the system very easy to use and provided great flexibility in the choice of processing paths. Several experiments simulating a loss-of-coolant accident in a nuclear reactor have been successfully analyzed. It has been shown that the system can be automated easily to further simplify its use and that the conversion of the entire system to a base code other than RELAP is possible

  2. Computational methods, tools and data for nuclear analyses of fusion technology systems

    International Nuclear Information System (INIS)

    Fischer, U.

    2006-01-01

    An overview is presented of the Research and Development work conducted at Forschungszentrum Karlsruhe in co-operation with other associations in the framework of the European Fusion Technology Programme on the development and qualification of computational tools and data for nuclear analyses of Fusion Technology systems. The focus is on the development of advanced methods and tools based on the Monte Carlo technique for particle transport simulations, and the evaluation and qualification of dedicated nuclear data to satisfy the needs of the ITER and the IFMIF projects. (author)

  3. Using computers in early years education: What are the effects on children’s development? Some suggestions concerning beneficial computer practice

    OpenAIRE

    Theodotou, Evgenia

    2010-01-01

    Technology in education is considered in the empirical and theoretical literature as both beneficial and harmful to children's development. In the field of early years settings there is a dilemma over whether or not early childhood teachers should use technology as a teaching and learning resource. This paper has a pedagogical focus, discussing the advantages and the potential problems of computer practice for children's learning and behaviour in early years settings, and also suggests teaching ...

  4. Three-dimensional models of Mycobacterium tuberculosis proteins Rv1555, Rv1554 and their docking analyses with sildenafil, tadalafil, vardenafil drugs, suggest interference with quinol binding likely to affect protein's function.

    Science.gov (United States)

    Dash, Pallabini; Bala Divya, M; Guruprasad, Lalitha; Guruprasad, Kunchur

    2018-04-18

    Earlier, based on bioinformatics analyses, we had predicted the Mycobacterium tuberculosis (M.tb) proteins Rv1555 and Rv1554 to be among the potential new tuberculosis drug targets. According to the 'TB-drugome', the Rv1555 protein is 'druggable' with sildenafil (Viagra), tadalafil (Cialis) and vardenafil (Levitra) drugs. In the present work, we intended to understand, via computer modeling studies, how the above drugs are likely to inhibit the M.tb proteins' function. The three-dimensional computer models for the M.tb proteins Rv1555 and Rv1554, constructed on the template of the equivalent membrane anchor subunits of the homologous E.coli quinol fumarate reductase respiratory protein complex, followed by drug docking analyses, suggested that the binding of the above drugs interferes with the quinol binding sites. Also, we experimentally observed in-vitro growth inhibition by sildenafil and tadalafil drugs of E.coli bacteria containing the homologous M.tb protein sequences. The predicted binding of the drugs is likely to affect the above M.tb proteins' function, as quinol binding is known to be essential for electron transfer during anaerobic respiration in the homologous E.coli protein complex. Therefore, sildenafil and related drugs currently used in the treatment of male erectile dysfunction by targeting the human phosphodiesterase 5 enzyme may be evaluated for their plausible role as repurposed drugs to treat human tuberculosis.

  5. Cross-disorder genome-wide analyses suggest a complex genetic relationship between Tourette's syndrome and OCD.

    Science.gov (United States)

    Yu, Dongmei; Mathews, Carol A; Scharf, Jeremiah M; Neale, Benjamin M; Davis, Lea K; Gamazon, Eric R; Derks, Eske M; Evans, Patrick; Edlund, Christopher K; Crane, Jacquelyn; Fagerness, Jesen A; Osiecki, Lisa; Gallagher, Patience; Gerber, Gloria; Haddad, Stephen; Illmann, Cornelia; McGrath, Lauren M; Mayerfeld, Catherine; Arepalli, Sampath; Barlassina, Cristina; Barr, Cathy L; Bellodi, Laura; Benarroch, Fortu; Berrió, Gabriel Bedoya; Bienvenu, O Joseph; Black, Donald W; Bloch, Michael H; Brentani, Helena; Bruun, Ruth D; Budman, Cathy L; Camarena, Beatriz; Campbell, Desmond D; Cappi, Carolina; Silgado, Julio C Cardona; Cavallini, Maria C; Chavira, Denise A; Chouinard, Sylvain; Cook, Edwin H; Cookson, M R; Coric, Vladimir; Cullen, Bernadette; Cusi, Daniele; Delorme, Richard; Denys, Damiaan; Dion, Yves; Eapen, Valsama; Egberts, Karin; Falkai, Peter; Fernandez, Thomas; Fournier, Eduardo; Garrido, Helena; Geller, Daniel; Gilbert, Donald L; Girard, Simon L; Grabe, Hans J; Grados, Marco A; Greenberg, Benjamin D; Gross-Tsur, Varda; Grünblatt, Edna; Hardy, John; Heiman, Gary A; Hemmings, Sian M J; Herrera, Luis D; Hezel, Dianne M; Hoekstra, Pieter J; Jankovic, Joseph; Kennedy, James L; King, Robert A; Konkashbaev, Anuar I; Kremeyer, Barbara; Kurlan, Roger; Lanzagorta, Nuria; Leboyer, Marion; Leckman, James F; Lennertz, Leonhard; Liu, Chunyu; Lochner, Christine; Lowe, Thomas L; Lupoli, Sara; Macciardi, Fabio; Maier, Wolfgang; Manunta, Paolo; Marconi, Maurizio; McCracken, James T; Mesa Restrepo, Sandra C; Moessner, Rainald; Moorjani, Priya; Morgan, Jubel; Muller, Heike; Murphy, Dennis L; Naarden, Allan L; Nurmi, Erika; Ochoa, William Cornejo; Ophoff, Roel A; Pakstis, Andrew J; Pato, Michele T; Pato, Carlos N; Piacentini, John; Pittenger, Christopher; Pollak, Yehuda; Rauch, Scott L; Renner, Tobias; Reus, Victor I; Richter, Margaret A; Riddle, Mark A; Robertson, Mary M; Romero, Roxana; Rosário, Maria C; Rosenberg, David; Ruhrmann, Stephan; Sabatti, Chiara; Salvi, Erika; Sampaio, Aline S; Samuels, Jack; Sandor, Paul; Service, Susan K; Sheppard, Brooke; Singer, Harvey S; Smit, Jan H; Stein, Dan J; Strengman, Eric; Tischfield, Jay A; Turiel, Maurizio; Valencia Duarte, Ana V; Vallada, Homero; Veenstra-VanderWeele, Jeremy; Walitza, Susanne; Wang, Ying; Weale, Mike; Weiss, Robert; Wendland, Jens R; Westenberg, Herman G M; Shugart, Yin Yao; Hounie, Ana G; Miguel, Euripedes C; Nicolini, Humberto; Wagner, Michael; Ruiz-Linares, Andres; Cath, Danielle C; McMahon, William; Posthuma, Danielle; Oostra, Ben A; Nestadt, Gerald; Rouleau, Guy A; Purcell, Shaun; Jenike, Michael A; Heutink, Peter; Hanna, Gregory L; Conti, David V; Arnold, Paul D; Freimer, Nelson B; Stewart, S Evelyn; Knowles, James A; Cox, Nancy J; Pauls, David L

    2015-01-01

    Obsessive-compulsive disorder (OCD) and Tourette's syndrome are highly heritable neurodevelopmental disorders that are thought to share genetic risk factors. However, the identification of definitive susceptibility genes for these etiologically complex disorders remains elusive. The authors report a combined genome-wide association study (GWAS) of Tourette's syndrome and OCD. The authors conducted a GWAS in 2,723 cases (1,310 with OCD, 834 with Tourette's syndrome, 579 with OCD plus Tourette's syndrome/chronic tics), 5,667 ancestry-matched controls, and 290 OCD parent-child trios. GWAS summary statistics were examined for enrichment of functional variants associated with gene expression levels in brain regions. Polygenic score analyses were conducted to investigate the genetic architecture within and across the two disorders. Although no individual single-nucleotide polymorphisms (SNPs) achieved genome-wide significance, the GWAS signals were enriched for SNPs strongly associated with variations in brain gene expression levels (expression quantitative trait loci, or eQTLs), suggesting the presence of true functional variants that contribute to risk of these disorders. Polygenic score analyses identified a significant polygenic component for OCD (p=2×10(-4)), predicting 3.2% of the phenotypic variance in an independent data set. In contrast, Tourette's syndrome had a smaller, nonsignificant polygenic component, predicting only 0.6% of the phenotypic variance (p=0.06). No significant polygenic signal was detected across the two disorders, although the sample is likely underpowered to detect a modest shared signal. Furthermore, the OCD polygenic signal was significantly attenuated when cases with both OCD and co-occurring Tourette's syndrome/chronic tics were included in the analysis (p=0.01). Previous work has shown that Tourette's syndrome and OCD have some degree of shared genetic variation. However, the data from this study suggest that there are also distinct components to the genetic architectures of Tourette's syndrome and OCD.
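
    The polygenic score analysis mentioned in this record is, at its core, a weighted sum of risk-allele counts. The sketch below is a generic illustration in Python with synthetic effect sizes, genotypes and phenotypes; it is not the authors' pipeline, which used GWAS summary statistics and an independent target sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discovery-sample summary statistics: per-SNP effect sizes.
n_snps, n_subjects = 500, 1000
beta = rng.normal(0.0, 0.05, n_snps)               # effect sizes from a "discovery" GWAS
geno = rng.binomial(2, 0.3, (n_subjects, n_snps))  # risk-allele counts (0/1/2) in a target sample

# Polygenic score = weighted sum of risk-allele counts in the target sample.
prs = geno @ beta

# Synthetic phenotype partly driven by the same effects, to illustrate variance explained.
liability = geno @ beta + rng.normal(0.0, 1.0, n_subjects)
phenotype = (liability > np.quantile(liability, 0.8)).astype(int)  # 20% "cases"

# Variance explained on the observed scale: squared correlation of score with phenotype.
r = np.corrcoef(prs, phenotype)[0, 1]
print(f"Polygenic score explains ~{100 * r**2:.1f}% of phenotypic variance")
```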

  6. Thyrolipomas – Prevalence in Computed Tomography and Suggestions for Pragmatic Management

    International Nuclear Information System (INIS)

    Gossner, Johannes

    2015-01-01

    Thyrolipomas seem to be a rare occurrence, and until now their prevalence has only been reported in two studies. Because of the known significant geographic variation of thyroid disease, the generalizability of these findings is uncertain, as is the management of incidentally found thyrolipomas. A retrospective study of 107 computed tomography scans of the chest in a European inpatient population was performed. A literature review was carried out and considerations for pragmatic management were proposed. Thyrolipomas were found in 2.8% of patients. All thyrolipomas were smaller than 15 mm. In all patients these were incidental findings unrelated to the patients' symptoms. No secondary signs of malignancy could be detected. Thyrolipomas are a common finding on cross-sectional imaging. As in this study, they seem to be incidental findings lacking clinical relevance. Because of the rare possibility of a thyroid cancer with inclusion of mature fat, sonographic follow-up of incidentally discovered thyrolipomas larger than 15 mm should be suggested.

  7. Evaluation of the computer code system RADHEAT-V4 by analysing benchmark problems on radiation shielding

    International Nuclear Information System (INIS)

    Sakamoto, Yukio; Naito, Yoshitaka

    1990-11-01

    A computer code system RADHEAT-V4 has been developed for safety evaluation of radiation shielding in nuclear fuel facilities. To evaluate the performance of the code system, 18 benchmark problems were selected and analysed. The radiations evaluated are neutrons and gamma rays, and the benchmark problems cover penetration, streaming and skyshine. The computed results are more accurate than those obtained with the Sn codes ANISN and DOT3.5 or the Monte Carlo code MORSE. However, RADHEAT-V4 requires a large core memory and extensive I/O. (author)

  8. Computer assisted functional analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, H.A.E.; Roesler, H.

    1982-01-01

    The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.

  9. Introduction of the ASP3D Computer Program for Unsteady Aerodynamic and Aeroelastic Analyses

    Science.gov (United States)

    Batina, John T.

    2005-01-01

    A new computer program has been developed called ASP3D (Advanced Small Perturbation 3D), which solves the small perturbation potential flow equation in an advanced form including mass-consistent surface and trailing wake boundary conditions, and entropy, vorticity, and viscous effects. The purpose of the program is for unsteady aerodynamic and aeroelastic analyses, especially in the nonlinear transonic flight regime. The program exploits the simplicity of stationary Cartesian meshes with the movement or deformation of the configuration under consideration incorporated into the solution algorithm through a planar surface boundary condition. The new ASP3D code is the result of a decade of developmental work on improvements to the small perturbation formulation, performed while the author was employed as a Senior Research Scientist in the Configuration Aerodynamics Branch at the NASA Langley Research Center. The ASP3D code is a significant improvement to the state-of-the-art for transonic aeroelastic analyses over the CAP-TSD code (Computational Aeroelasticity Program Transonic Small Disturbance), which was developed principally by the author in the mid-1980s. The author is in a unique position as the developer of both computer programs to compare, contrast, and ultimately make conclusions regarding the underlying formulations and utility of each code. The paper describes the salient features of the ASP3D code including the rationale for improvements in comparison with CAP-TSD. Numerous results are presented to demonstrate the ASP3D capability. The general conclusion is that the new ASP3D capability is superior to the older CAP-TSD code because of the myriad improvements developed and incorporated.

  10. Imaging Features of Helical Computed Tomography Suggesting Advanced Urothelial Carcinoma Arising from the Pelvocalyceal System

    International Nuclear Information System (INIS)

    Kwak, Kyung Won; Park, Byung Kwan; Kim, Chan Kyo; Lee, Hyun Moo; Choi, Han Yong

    2008-01-01

    Background: Urothelial carcinoma is the most common malignant tumor arising from the pelvocalyceal system. Helical computed tomography (CT) is probably the best preoperative staging modality for the determination of treatment plan and prognosis. Purpose: To identify helical CT imaging features suggesting advanced pelvocalyceal urothelial carcinoma. Material and Methods: Preoperative CT images in 44 patients with pelvocalyceal urothelial carcinoma were retrospectively reviewed and correlated with the pathological examination to determine imaging features suggesting stage III or IV disease. Results: Pathological staging revealed stage I in 16, stage II in three, stage III in 17, and stage IV in eight patients. Seven patients had metastatic lymph nodes. CT imaging showed that renal parenchymal invasion, sinus fat invasion, and lymph node metastasis were highly suggestive of advanced urothelial cell carcinoma (P<0.05). Helical CT sensitivity, specificity, and accuracy for advanced pelvocalyceal urothelial carcinoma were 76% (19/25), 84% (16/19), and 80% (35/44), respectively. Conclusion: Preoperative helical CT may suggest imaging features of advanced urothelial carcinoma, influencing treatment plan and patient prognosis, even though its accuracy is not especially high.
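
    The reported sensitivity, specificity, and accuracy follow directly from a 2x2 comparison of CT staging against pathology (25 advanced and 19 non-advanced tumours). A minimal sketch of that arithmetic, using the counts quoted in the abstract:

```python
# Counts taken from the abstract: 25 pathologically advanced (stage III/IV) tumours,
# 19 of which CT called advanced; 19 non-advanced, 16 of which CT called non-advanced.
tp, fn = 19, 25 - 19      # advanced cases correctly / incorrectly staged by CT
tn, fp = 16, 19 - 16      # non-advanced cases correctly / incorrectly staged

sensitivity = tp / (tp + fn)                 # 19/25 = 0.76
specificity = tn / (tn + fp)                 # 16/19 ~ 0.84
accuracy = (tp + tn) / (tp + fn + tn + fp)   # 35/44 ~ 0.80

print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}, accuracy {accuracy:.2f}")
```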

  11. Fractal analyses of osseous healing using Tuned Aperture Computed Tomography images

    International Nuclear Information System (INIS)

    Nair, M.K.; Nair, U.P.; Seyedain, A.; Webber, R.L.; Piesco, N.P.; Agarwal, S.; Mooney, M.P.; Groendahl, H.G.

    2001-01-01

    The aim of this study was to evaluate osseous healing in mandibular defects using fractal analyses on conventional radiographs and tuned aperture computed tomography (TACT; OrthoTACT, Instrumentarium Imaging, Helsinki, Finland) images. Eighty test sites on the inferior margins of rabbit mandibles were subject to lesion induction and treated with one of the following: no treatment (controls); osteoblasts only; polymer matrix only; or osteoblast-polymer matrix (OPM) combination. Images were acquired using conventional radiography and TACT, including unprocessed TACT (TACT-U) and iteratively restored TACT (TACT-IR). Healing was followed up over time and images acquired at 3, 6, 9, and 12 weeks post-surgery. Fractal dimension (FD) was computed within regions of interest in the defects using the TACT workbench. Results were analyzed for effects produced by imaging modality, treatment modality, time after surgery and lesion location. Histomorphometric data were available to assess ground truth. Significant differences (p<0.0001) were noted based on imaging modality, with TACT-IR recording the highest mean fractal dimension (MFD), followed by TACT-U and conventional images, in that order. Sites treated with OPM recorded the highest MFDs among all treatment modalities (p<0.0001). The highest MFD based on time was recorded at 3 weeks and differed significantly from that at 12 weeks (p<0.035). Correlation of FD with the histomorphometric data was high (r=0.79; p<0.001). The FD computed on TACT-IR showed the highest correlation with the histomorphometric data, thus establishing that TACT is a more efficient and accurate imaging modality for quantification of osseous changes within healing bony defects. (orig.)
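
    The abstract does not state which estimator the TACT workbench uses for fractal dimension; box counting is one common estimator for thresholded images, and the sketch below illustrates that general idea on a synthetic binary image. It is not the TACT algorithm.

```python
import numpy as np

def box_counting_dimension(binary_img, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate fractal dimension as the slope of log(count) vs log(1/box size)."""
    counts = []
    n = binary_img.shape[0]
    for s in box_sizes:
        # Count boxes of side s that contain at least one foreground pixel.
        cropped = binary_img[:n - n % s, :n - n % s]
        blocks = cropped.reshape(n // s, s, n // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(1)
img = rng.random((128, 128)) > 0.7   # stand-in for a thresholded region of interest
print(f"estimated fractal dimension: {box_counting_dimension(img):.2f}")
```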

  12. Morphological analyses suggest a new taxonomic circumscription for Hymenaea courbaril L. (Leguminosae, Caesalpinioideae).

    Science.gov (United States)

    Souza, Isys Mascarenhas; Funch, Ligia Silveira; de Queiroz, Luciano Paganucci

    2014-01-01

    Hymenaea is a genus of the Resin-producing Clade of the tribe Detarieae (Leguminosae: Caesalpinioideae) with 14 species. Hymenaea courbaril is the most widespread species of the genus, ranging from southern Mexico to southeastern Brazil. As currently circumscribed, Hymenaea courbaril is a polytypic species with six varieties: var. altissima, var. courbaril, var. longifolia, var. stilbocarpa, var. subsessilis, and var. villosa. These varieties are distinguishable mostly by traits related to leaflet shape and indumentation, and calyx indumentation. We carried out morphometric analyses of 14 quantitative (continuous) leaf characters in order to assess the taxonomy of Hymenaea courbaril under the Unified Species Concept framework. Cluster analysis used the Unweighted Pair Group Method with Arithmetic Mean (UPGMA) based on Bray-Curtis dissimilarity matrices. Principal Component Analyses (PCA) were carried out based on the same morphometric matrix. Two sets of Analyses of Similarity and Non Parametric Multivariate Analysis of Variance were carried out to evaluate statistical support (1) for the major groups recovered using UPGMA and PCA, and (2) for the varieties. All analyses recovered three major groups coincident with (1) var. altissima, (2) var. longifolia, and (3) all other varieties. These results, together with geographical and habitat information, were taken as evidence of three separate metapopulation lineages recognized here as three distinct species. Nomenclatural adjustments, including reclassifying formerly misapplied types, are proposed.
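
    As a rough illustration of the clustering step described above (Bray-Curtis dissimilarities followed by UPGMA), the sketch below uses SciPy on a synthetic morphometric matrix; the specimen values are invented, not the authors' data.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import average, fcluster

rng = np.random.default_rng(2)

# Hypothetical morphometric matrix: rows = specimens, columns = 14 leaf characters.
specimens = np.vstack([
    rng.normal(10, 1, (20, 14)),   # stand-in for one putative taxon
    rng.normal(14, 1, (20, 14)),   # a second
    rng.normal(18, 1, (20, 14)),   # a third
])

# Bray-Curtis dissimilarity matrix, then UPGMA (= "average" linkage in SciPy).
dissim = pdist(specimens, metric="braycurtis")
tree = average(dissim)

# Cut the dendrogram into three groups and report their sizes.
groups = fcluster(tree, t=3, criterion="maxclust")
print({int(g): int((groups == g).sum()) for g in np.unique(groups)})
```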

  13. Morphological analyses suggest a new taxonomic circumscription for Hymenaea courbaril L. (Leguminosae, Caesalpinioideae

    Directory of Open Access Journals (Sweden)

    Isys Souza

    2014-06-01

    Full Text Available Hymenaea is a genus of the Resin-producing Clade of the tribe Detarieae (Leguminosae: Caesalpinioideae) with 14 species. Hymenaea courbaril is the most widespread species of the genus, ranging from southern Mexico to southeastern Brazil. As currently circumscribed, H. courbaril is a polytypic species with six varieties: var. altissima, var. courbaril, var. longifolia, var. stilbocarpa, var. subsessilis, and var. villosa. These varieties are distinguishable mostly by traits related to leaflet shape and indumentation, and calyx indumentation. We carried out morphometric analyses of 14 quantitative (continuous) leaf characters in order to assess the taxonomy of H. courbaril under the Unified Species Concept framework. Cluster analysis used the Unweighted Pair Group Method with Arithmetic Mean (UPGMA) based on Bray-Curtis dissimilarity matrices. Principal Component Analyses (PCA) were carried out based on the same morphometric matrix. Two sets of Analyses of Similarity and Non Parametric Multivariate Analysis of Variance were carried out to evaluate statistical support (1) for the major groups recovered using UPGMA and PCA, and (2) for the varieties. All analyses recovered three major groups coincident with (1) var. altissima, (2) var. longifolia, and (3) all other varieties. These results, together with geographical and habitat information, were taken as evidence of three separate metapopulation lineages recognized here as three distinct species. Nomenclatural adjustments, including reclassifying formerly misapplied types, are proposed.

  14. Quo vadis: Hydrologic inverse analyses using high-performance computing and a D-Wave quantum annealer

    Science.gov (United States)

    O'Malley, D.; Vesselinov, V. V.

    2017-12-01

    Classical microprocessors have had a dramatic impact on hydrology for decades, due largely to the exponential growth in computing power predicted by Moore's law. However, this growth is not expected to continue indefinitely and has already begun to slow. Quantum computing is an emerging alternative to classical microprocessors. Here, we demonstrated cutting edge inverse model analyses utilizing some of the best available resources in both worlds: high-performance classical computing and a D-Wave quantum annealer. The classical high-performance computing resources are utilized to build an advanced numerical model that assimilates data from O(10^5) observations, including water levels, drawdowns, and contaminant concentrations. The developed model accurately reproduces the hydrologic conditions at a Los Alamos National Laboratory contamination site, and can be leveraged to inform decision-making about site remediation. We demonstrate the use of a D-Wave 2X quantum annealer to solve hydrologic inverse problems. This work can be seen as an early step in quantum-computational hydrology. We compare and contrast our results with an early inverse approach in classical-computational hydrology that is comparable to the approach we use with quantum annealing. Our results show that quantum annealing can be useful for identifying regions of high and low permeability within an aquifer. While the problems we consider are small-scale compared to the problems that can be solved with modern classical computers, they are large compared to the problems that could be solved with early classical CPUs. Further, the binary nature of the high/low permeability problem makes it well-suited to quantum annealing, but challenging for classical computers.
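
    The binary high/low permeability identification problem maps naturally onto a quadratic unconstrained binary optimization (QUBO), the form a quantum annealer accepts. The sketch below is only a toy: a tiny, made-up QUBO solved by exhaustive enumeration on a classical machine, not the D-Wave workflow or the Los Alamos site model.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical QUBO for 8 binary "permeability" variables (1 = high, 0 = low).
# Diagonal terms stand in for a misfit to observations; off-diagonal terms
# encourage neighbouring cells to share the same state (smoothness).
n = 8
Q = np.diag(rng.normal(0, 1, n))
for i in range(n - 1):
    Q[i, i + 1] = -0.5

def qubo_energy(x, Q):
    return x @ Q @ x

# Exhaustive search is feasible only for tiny problems; an annealer
# (quantum or simulated) is what one would use when n is large.
best = min((np.array(bits) for bits in itertools.product((0, 1), repeat=n)),
           key=lambda x: qubo_energy(x, Q))
print("lowest-energy permeability pattern:", best, "energy:", qubo_energy(best, Q))
```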

  15. A Study for Visual Realism of Designed Pictures on Computer Screens by Investigation and Brain-Wave Analyses.

    Science.gov (United States)

    Wang, Lan-Ting; Lee, Kun-Chou

    2016-08-01

    In this article, the visual realism of designed pictures on computer screens is studied by investigation and brain-wave analyses. Practical electroencephalogram (EEG) measurements are always time-varying and fluctuating, so conventional statistical techniques are not adequate for their analysis. This study proposes a new scheme based on "fingerprinting" to analyze the EEG. Fingerprinting is a technique of probabilistic pattern recognition used in electrical engineering, much like the identification of human fingerprints in a criminal investigation. The goal of this study was to assess whether subjective preference for pictures could be manifested physiologically by EEG fingerprinting analyses. The most important advantage of the fingerprinting technique is that it does not require accurate measurement. Instead, it uses probabilistic classification. Participants' preference for pictures can be assessed using fingerprinting analyses of physiological EEG measurements. © The Author(s) 2016.

  16. A review of computer tools for analysing the integration of renewable energy into various energy systems

    DEFF Research Database (Denmark)

    Connolly, D.; Lund, Henrik; Mathiesen, Brian Vad

    2010-01-01

    This paper includes a review of the different computer tools that can be used to analyse the integration of renewable energy. Initially 68 tools were considered, but 37 were included in the final analysis, which was carried out in collaboration with the tool developers or recommended points of contact. The results in this paper provide the information necessary to identify a suitable energy tool for analysing the integration of renewable energy into various energy-systems under different objectives. It is evident from this paper that there is no energy tool that addresses all issues related to integrating renewable energy, but instead the 'ideal' energy tool is highly dependent on the specific objectives that must be fulfilled. The typical applications for the 37 tools reviewed (from analysing single-building systems to national energy-systems), combined with numerous other factors...

  17. Analyses and Comparison of Solar Air Heater with Various Rib Roughness using Computational Fluid Dynamics (CFD)

    Science.gov (United States)

    Kumar, K. Ravi; Cheepu, Muralimohan; Srinivas, B.; Venkateswarlu, D.; Pramod Kumar, G.; Shiva, Apireddi

    2018-03-01

    In a solar air heater, artificial roughness on the absorber plate has become a prominent technique for improving the heat transfer rate in the air flow passage by breaking up the laminar sublayer. The selection of rib geometry plays an important role in the friction characteristics and the heat transfer rate. Over the years, many researchers have studied roughness shapes to investigate the effect of the geometries on the friction factor and heat transfer performance of solar air heaters. The present study models the different rib shapes used for creating artificial roughness and compares them to identify the geometry with the highest performance. Computational fluid dynamics software was used to obtain correlations for the friction factor and the heat transfer rate. The simulations were performed on a 2D computational fluid dynamics model and analysed to identify the most effective parameters of relative roughness height, width and pitch with respect to friction factor and heat transfer. The Reynolds number is varied in the range from 3000 to 20000 in the current study, and the heat transfer and turbulence phenomena are modelled as functions of the Reynolds number. The modelling results showed that the formation of a strong vortex in the main stream flow due to the right-angle triangle roughness, compared with the square, rectangle, improved rectangle and equilateral triangle geometries, enhanced the heat transfer in the solar air heater. The simulated turbulence kinetic energy suggests that the local turbulence kinetic energy is influenced strongly by the alignment of the right-angle triangle ribs.
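
    Since the study sweeps Reynolds numbers from 3000 to 20000 and reports friction factor and heat transfer, a useful reference point is the smooth-duct baseline that rib-roughened results are commonly normalised against. The sketch below computes the Reynolds number and textbook smooth-duct correlations (Blasius friction factor, Dittus-Boelter Nusselt number); the air properties and duct dimensions are assumed values, and these are not the paper's CFD-derived correlations.

```python
# Air properties near 300 K (assumed values for illustration).
rho, mu, k, Pr = 1.16, 1.85e-5, 0.026, 0.71   # kg/m^3, Pa.s, W/m.K, -

def smooth_duct_baseline(velocity, hydraulic_diameter):
    """Reynolds number plus textbook smooth-duct correlations.

    Blasius: f = 0.079 Re^-0.25; Dittus-Boelter: Nu = 0.023 Re^0.8 Pr^0.4.
    Rib-roughened results are usually reported as enhancement ratios over these.
    """
    re = rho * velocity * hydraulic_diameter / mu
    f_smooth = 0.079 * re ** -0.25
    nu_smooth = 0.023 * re ** 0.8 * Pr ** 0.4
    h_smooth = nu_smooth * k / hydraulic_diameter   # convective coefficient, W/m^2.K
    return re, f_smooth, nu_smooth, h_smooth

d_h = 0.03  # m, hypothetical duct hydraulic diameter
for v in (1.6, 5.3, 10.6):   # velocities spanning roughly Re = 3e3 to 2e4 for this duct
    re, f, nu, h = smooth_duct_baseline(v, d_h)
    print(f"V={v:4.1f} m/s  Re={re:7.0f}  f={f:.4f}  Nu={nu:6.1f}  h={h:5.1f} W/m^2K")
```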

  18. A comparative study between xerographic, computer-assisted overlay generation and animated-superimposition methods in bite mark analyses.

    Science.gov (United States)

    Tai, Meng Wei; Chong, Zhen Feng; Asif, Muhammad Khan; Rahmat, Rabiah A; Nambiar, Phrabhakaran

    2016-09-01

    This study was conducted to compare the suitability and precision of xerographic and computer-assisted methods for bite mark investigations. Eleven subjects were asked to bite on their forearm and the bite marks were photographically recorded. Alginate impressions of the subjects' dentition were taken and their casts were made using dental stone. The overlays generated by the xerographic method were obtained by photocopying the subjects' casts and then transferring the incisal edge outlines onto a transparent sheet. The bite mark images were imported into Adobe Photoshop® software and printed to life-size. The bite mark analyses using xerographically generated overlays were done by manually comparing an overlay to the corresponding printed bite mark images. In the computer-assisted method, the subjects' casts were scanned into Adobe Photoshop®. The bite mark analyses using computer-assisted overlay generation were done by matching an overlay and the corresponding bite mark images digitally using Adobe Photoshop®. Another comparison method was superimposing the cast images on the corresponding bite mark images employing Adobe Photoshop® CS6 and GIF-Animator©. A score in the range of 0-3 was given to each precision-determining criterion during analysis, with higher scores given for better matching. The Kruskal-Wallis H test showed a significant difference between the three sets of data (H=18.761, p<0.05). In conclusion, bite mark analysis using the computer-assisted animated-superimposition method was the most accurate, followed by the computer-assisted overlay generation and lastly the xerographic method. The superior precision contributed by the digital methods is discernible despite human skin being a poor recording medium for bite marks. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
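
    The method comparison above rests on a Kruskal-Wallis H test across the three methods' 0-3 precision scores. Below is a minimal SciPy sketch of that test; the score vectors here are hypothetical stand-ins, not the study's data (which gave H = 18.761).

```python
from scipy.stats import kruskal

# Hypothetical 0-3 precision scores for eleven bite marks under each method.
xerographic       = [1, 1, 2, 1, 0, 2, 1, 1, 2, 1, 1]
computer_overlay  = [2, 2, 2, 3, 1, 2, 2, 3, 2, 2, 2]
animated_superimp = [3, 3, 2, 3, 3, 3, 2, 3, 3, 3, 3]

h_stat, p_value = kruskal(xerographic, computer_overlay, animated_superimp)
print(f"H = {h_stat:.3f}, p = {p_value:.4f}")
```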

  19. Processing and reporting of measurement data obtained with a Cobas-Bio centrifugal analyser using the RIVM computer

    NARCIS (Netherlands)

    Steentjes, G.M.; Koedam, J.C.

    1985-01-01

    For the processing of large quantities of measurement data, a system has been developed on the RIVM computer for, among other instruments, the Cobas-Bio centrifugal analyser; it handles both sample (patient) identification and the reporting of all analytical results, so that by automatically performing a ...

  20. Computer-assisted analyses of (14C)2-DG autoradiographs employing a general purpose image processing system

    Energy Technology Data Exchange (ETDEWEB)

    Porro, C; Biral, G P [Modena Univ. (Italy). Ist. di Fisiologia Umana]; Fonda, S; Baraldi, P [Modena Univ. (Italy). Lab. di Bioingegneria della Clinica Oculistica]; Cavazzuti, M [Modena Univ. (Italy). Clinica Neurologica]

    1984-09-01

    A general purpose image processing system is described, including a B/W TV camera, a high-resolution image processor and display system (TESAK VDC 501), a computer (DEC PDP 11/23) and monochrome and color monitors. Images may be acquired from a microscope equipped with a TV camera or using the TV in direct viewing; the A/D converter and the image processor provide fast (40 ms) and precise (512x512 data points) digitization of the TV signal with a maximum resolution of 256 gray levels. Computer programs, written in FORTRAN and MACRO 11 Assembly Language, have been developed to perform qualitative and quantitative analyses of autoradiographs obtained with the 2-DG method. They include: (1) procedures designed to recognize errors in acquisition due to possible image shading and correct them via software; (2) routines suitable for qualitative analyses of the whole image or selected regions of it, providing the opportunity for pseudocolor coding, statistics, graphic overlays; (3) programs permitting the conversion of gray levels into metabolic rates of glucose utilization and the display of gray- or color-coded metabolic maps.

  1. Proceedings of the international conference on mathematics and computations, reactor physics, and environmental analyses. Volume 1 and 2

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

    The International Conference on Mathematics and Computations, Reactor Physics, and Environmental Analyses marks the sixteenth biennial topical meeting of the Mathematics and Computation (M&C) Division of the American Nuclear Society (ANS). This conference combines many traditional features of M&C conferences with several new aspects. The meeting is, for the first time, being held in Portland, Oregon and sponsored by the ANS Eastern Washington Section. Three of the cosponsors - the ANS Reactor Physics Division, the European Nuclear Society, and the Atomic Energy Society of Japan - have participated in a series of such meetings, with very successful results. The fourth cosponsor, the ANS Environmental Science Division, is participating for the first time as a cosponsor of an M&C topical meeting, as a result of the M&C Division's decision to formally include the area of environmental analyses as a major focus of the conference, another 'first.' Separate abstracts have been submitted to the energy database for contributions to this conference.

  2. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Control modules C4, C6

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U. S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume is part of the manual related to the control modules for the newest updated version of this computational package.

  3. Genome-Wide Analyses Suggest Mechanisms Involving Early B-Cell Development in Canine IgA Deficiency.

    Directory of Open Access Journals (Sweden)

    Mia Olsson

    Full Text Available Immunoglobulin A deficiency (IgAD) is the most common primary immune deficiency disorder in both humans and dogs, characterized by recurrent mucosal tract infections and a predisposition for allergic and other immune mediated diseases. In several dog breeds, low IgA levels have been observed at a high frequency and with a clinical resemblance to human IgAD. In this study, we used genome-wide association studies (GWAS) to identify genomic regions associated with low IgA levels in dogs as a comparative model for human IgAD. We used a novel percentile groups-approach to establish breed-specific cut-offs and to perform analyses in a close to continuous manner. GWAS performed in four breeds prone to low IgA levels (German shepherd, Golden retriever, Labrador retriever and Shar-Pei) identified 35 genomic loci suggestively associated (p < 0.0005) with IgA levels. In German shepherd, three genomic regions (candidate genes include KIRREL3 and SERPINA9) were genome-wide significantly associated (p < 0.0002) with IgA levels. A ~20 kb long haplotype on CFA28, significantly associated (p = 0.0005) with IgA levels in Shar-Pei, was positioned within the first intron of the gene SLIT1. Both KIRREL3 and SLIT1 are highly expressed in the central nervous system and in bone marrow and are potentially important during B-cell development. SERPINA9 expression is restricted to B-cells and peaks at the time-point when B-cells proliferate into antibody-producing plasma cells. The suggestively associated regions were enriched for genes in Gene Ontology gene sets involving inflammation and early immune cell development.

  4. Benefits of Exercise Training For Computer-Based Staff: A Meta Analyses

    Directory of Open Access Journals (Sweden)

    Mothna Mohammed

    2017-04-01

    Full Text Available Background: Office workers sit down to work for approximately 8 hours a day and, as a result, many of them do not have enough time for any form of physical exercise. This can lead to musculoskeletal discomforts, especially low back pain, and recently many researchers have focused on home/office-based exercise training for the prevention/treatment of low back pain among this population. Objective: This meta-analysis paper discusses the latest suggested exercises for office workers based on the mechanisms and theories behind low back pain among office workers. Method: The author collected relevant papers previously published on the subject. Google Scholar, Scopus, and PubMed were used as sources to find the articles. Only articles that were published using the same methodology, including office workers, musculoskeletal discomforts, low back pain, and exercise training keywords, were selected. Studies that failed to report sufficient sample statistics, or lacked a substantial review of past academic scholarship and/or clear methodologies, were excluded. Results: Limited evidence regarding the prevention of, and treatment methods for, musculoskeletal discomfort, especially in the low back, among office workers is available. The findings showed that training exercises had a significant effect (p < 0.05) on low back pain discomfort scores, with decreased pain levels in response to office-based exercise training. Conclusion: Office-based exercise training can affect pain/discomfort scores among office workers through positive effects on the flexibility and strength of muscles. As such, it should be suggested to occupational therapists as a practical way for the treatment/prevention of low back pain among office workers.

  5. Subtraction radiography and computer assisted densitometric analyses of standardized radiographs. A comparison study with (125)I absorptiometry

    Energy Technology Data Exchange (ETDEWEB)

    Ortmann, L.F.; Dunford, R.; McHenry, K.; Hausmann, E.

    1985-01-01

    A standardized radiographic series of incrementally increasing alveolar crestal defects in skulls was subjected to analysis by subtraction radiography and computer-assisted quantitative densitometric analysis. Subjects were able to detect change using subtraction radiography in alveolar bone defects with bone loss in the range of 1-5 percent as measured by (125)I absorptiometry. Quantitative densitometric analyses utilizing radiographic pairs adjusted for differences in contrast (gamma corrected) can be used to follow longitudinal changes at a particular alveolar bone site. Such measurements correlate with change observed by (125)I absorptiometry (r=0.82-0.94). (author).
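
    Subtraction radiography amounts to matching the grey-scale (gamma) characteristics of two aligned radiographs and subtracting them so that unchanged anatomy cancels. The NumPy sketch below illustrates the idea on synthetic images; it omits geometric registration and is not the study's software.

```python
import numpy as np

def gamma_match(follow_up, baseline):
    """Crudely match the grey-level scale of follow_up to baseline by equating
    means in log space (a simple stand-in for gamma correction)."""
    gamma = np.log(baseline.mean()) / np.log(follow_up.mean())
    return follow_up ** gamma

rng = np.random.default_rng(4)
baseline = rng.uniform(0.3, 0.9, (64, 64))   # normalised grey levels of the first exposure
follow_up = baseline ** 1.2                  # simulated contrast (gamma) drift at follow-up
follow_up[20:30, 20:30] -= 0.15              # simulated region of bone loss

diff = gamma_match(follow_up, baseline) - baseline
print("mean difference outside lesion: %.3f" % diff[40:, 40:].mean())
print("mean difference inside lesion:  %.3f" % diff[20:30, 20:30].mean())
```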

  6. Analyses of soft tissue from Tyrannosaurus rex suggest the presence of protein.

    Science.gov (United States)

    Schweitzer, Mary Higby; Suo, Zhiyong; Avci, Recep; Asara, John M; Allen, Mark A; Arce, Fernando Teran; Horner, John R

    2007-04-13

    We performed multiple analyses of Tyrannosaurus rex (specimen MOR 1125) fibrous cortical and medullary tissues remaining after demineralization. The results indicate that collagen I, the main organic component of bone, has been preserved in low concentrations in these tissues. The findings were independently confirmed by mass spectrometry. We propose a possible chemical pathway that may contribute to this preservation. The presence of endogenous protein in dinosaur bone may validate hypotheses about evolutionary relationships, rates, and patterns of molecular change and degradation, as well as the chemical stability of molecules over time.

  7. A Novel Quantitative Computed Tomographic Analysis Suggests How Sirolimus Stabilizes Progressive Air Trapping in Lymphangioleiomyomatosis.

    Science.gov (United States)

    Argula, Rahul G; Kokosi, Maria; Lo, Pechin; Kim, Hyun J; Ravenel, James G; Meyer, Cristopher; Goldin, Jonathan; Lee, Hye-Seung; Strange, Charlie; McCormack, Francis X

    2016-03-01

    The Multicenter International Lymphangioleiomyomatosis Efficacy and Safety of Sirolimus (MILES) trial demonstrated that sirolimus stabilized lung function and improved measures of functional performance and quality of life in patients with lymphangioleiomyomatosis. The physiologic mechanisms of these beneficial actions of sirolimus are incompletely understood. To prospectively determine the longitudinal computed tomographic lung imaging correlates of lung function change in MILES patients treated with placebo or sirolimus. We determined the baseline to 12-month change in computed tomographic image-derived lung volumes and the volume of the lung occupied by cysts in the 31 MILES participants (17 in sirolimus group, 14 in placebo group) with baseline and 12-month scans. There was a trend toward an increase in median expiratory cyst volume percentage in the placebo group and a reduction in the sirolimus group (+2.68% vs. +0.97%, respectively; P = 0.10). The computed tomographic image-derived residual volume and the ratio of residual volume to total lung capacity increased more in the placebo group than in the sirolimus group (+214.4 ml vs. +2.9 ml [P = 0.054] and +0.05 ml vs. -0.01 ml [P = 0.0498], respectively). A Markov transition chain analysis of respiratory cycle cyst volume changes revealed greater dynamic variation in the sirolimus group than in the placebo group at the 12-month time point. Collectively, these data suggest that sirolimus attenuates progressive gas trapping in lymphangioleiomyomatosis, consistent with a beneficial effect of the drug on airflow obstruction. We speculate that a reduction in lymphangioleiomyomatosis cell burden around small airways and cyst walls alleviates progressive airflow limitation and facilitates cyst emptying.
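
    A Markov transition chain analysis of respiratory-cycle cyst volume changes reduces to counting how often one volume-change state follows another and row-normalising the counts. The sketch below estimates such a transition matrix; the states and sequences are invented for illustration.

```python
import numpy as np

states = ["shrinks", "stable", "grows"]
state_index = {s: i for i, s in enumerate(states)}

# Hypothetical per-cyst state sequences across successive respiratory cycles.
sequences = [
    ["stable", "shrinks", "shrinks", "stable"],
    ["grows", "stable", "stable", "shrinks"],
    ["stable", "stable", "grows", "stable"],
]

counts = np.zeros((3, 3))
for seq in sequences:
    for a, b in zip(seq[:-1], seq[1:]):
        counts[state_index[a], state_index[b]] += 1

# Row-normalise to obtain transition probabilities P(next state | current state).
transition = counts / counts.sum(axis=1, keepdims=True)
for s, row in zip(states, transition):
    print(s, np.round(row, 2))
```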

  8. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of the small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that OSE adequately predicted the response surface for the peak cladding temperature. Some favourable characteristics of the OSE, such as monotonic behaviour between two neighbouring points and independence from the number of output parameters, suggest that OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
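
    A response surface replaces expensive code runs with a cheap fitted surrogate. As a loose illustration of the general idea (not the OSE algorithm described in this record), the sketch below fits a quadratic surface to sampled outputs from a stand-in "code" and then evaluates the surrogate at a new input point.

```python
import numpy as np

rng = np.random.default_rng(5)

def expensive_code(x1, x2):
    """Stand-in for a thermal-hydraulic code output, e.g. peak cladding temperature."""
    return 900 + 80 * x1 - 50 * x2 + 30 * x1 * x2 + rng.normal(0, 2, np.shape(x1))

# Sample the "code" at a modest number of input combinations.
x1 = rng.uniform(-1, 1, 60)
x2 = rng.uniform(-1, 1, 60)
y = expensive_code(x1, x2)

# Quadratic response surface: least-squares fit of the design matrix.
design = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)

# Use the surrogate instead of the code at a new input point.
x1n, x2n = 0.3, -0.5
surrogate = np.array([1, x1n, x2n, x1n * x2n, x1n**2, x2n**2]) @ coeffs
true_mean = 900 + 80 * x1n - 50 * x2n + 30 * x1n * x2n
print(f"surrogate prediction: {surrogate:.1f} (true mean: {true_mean:.1f})")
```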

  9. Evolutionary Analyses Suggest a Function of MxB Immunity Proteins Beyond Lentivirus Restriction.

    Directory of Open Access Journals (Sweden)

    Patrick S Mitchell

    2015-12-01

    Full Text Available Viruses impose diverse and dynamic challenges on host defenses. Diversifying selection of codons and gene copy number variation are two hallmarks of genetic innovation in antiviral genes engaged in host-virus genetic conflicts. The myxovirus resistance (Mx) genes encode interferon-inducible GTPases that constitute a major arm of the cell-autonomous defense against viral infection. Unlike the broad antiviral activity of MxA, primate MxB was recently shown to specifically inhibit lentiviruses including HIV-1. We carried out detailed evolutionary analyses to investigate whether genetic conflict with lentiviruses has shaped MxB evolution in primates. We found strong evidence for diversifying selection in the MxB N-terminal tail, which contains molecular determinants of MxB anti-lentivirus specificity. However, we found no overlap between previously-mapped residues that dictate lentiviral restriction and those that have evolved under diversifying selection. Instead, our findings are consistent with MxB having a long-standing and important role in the interferon response to viral infection against a broader range of pathogens than is currently appreciated. Despite its critical role in host innate immunity, we also uncovered multiple functional losses of MxB during mammalian evolution, either by pseudogenization or by gene conversion from MxA genes. Thus, although the majority of mammalian genomes encode two Mx genes, this apparent stasis masks the dramatic effects that recombination and diversifying selection have played in shaping the evolutionary history of Mx genes. Discrepancies between our study and previous publications highlight the need to account for recombination in analyses of positive selection, as well as the importance of using sequence datasets with appropriate depth of divergence. Our study also illustrates that evolutionary analyses of antiviral gene families are critical towards understanding molecular principles that govern host ...

  10. Computational analyses of ancient pathogen DNA from herbarium samples: challenges and prospects.

    Science.gov (United States)

    Yoshida, Kentaro; Sasaki, Eriko; Kamoun, Sophien

    2015-01-01

    The application of DNA sequencing technology to the study of ancient DNA has enabled the reconstruction of past epidemics from genomes of historically important plant-associated microbes. Recently, the genome sequences of the potato late blight pathogen Phytophthora infestans were analyzed from 19th century herbarium specimens. These herbarium samples originated from infected potatoes collected during and after the Irish potato famine. Herbaria have therefore great potential to help elucidate past epidemics of crops, date the emergence of pathogens, and inform about past pathogen population dynamics. DNA preservation in herbarium samples was unexpectedly good, raising the possibility of a whole new research area in plant and microbial genomics. However, the recovered DNA can be extremely fragmented resulting in specific challenges in reconstructing genome sequences. Here we review some of the challenges in computational analyses of ancient DNA from herbarium samples. We also applied the recently developed linkage method to haplotype reconstruction of diploid or polyploid genomes from fragmented ancient DNA.

  11. Phylogenetic analyses suggest a hybrid origin of the figs (Moraceae: Ficus) that are endemic to the Ogasawara (Bonin) Islands, Japan.

    Science.gov (United States)

    Kusumi, Junko; Azuma, Hiroshi; Tzeng, Hsy-Yu; Chou, Lien-Siang; Peng, Yan-Qiong; Nakamura, Keiko; Su, Zhi-Hui

    2012-04-01

    The Ogasawara Islands are oceanic islands and harbor a unique endemic flora. There are three fig species (Ficus boninsimae, F. nishimurae and F. iidaiana) endemic to the Ogasawara Islands, and these species have been considered to be closely related to Ficus erecta and to have diverged within the islands. However, this hypothesis remains uncertain. To investigate this issue, we assessed the phylogenetic relationships of the Ogasawara figs and their close relatives occurring in Japan, Taiwan and South China based on six plastid genome regions, the nuclear ITS region and two nuclear genes. The plastid genome-based tree indicated a close relationship between the Ogasawara figs and F. erecta, whereas some of the nuclear gene-based trees suggested this relationship was not so close. In addition, phylogenetic analyses of the pollinating wasps associated with these fig species, based on the nuclear 28S rRNA and mitochondrial cytB genes, suggested that the fig-pollinating wasps of F. erecta are not sister to those of the Ogasawara figs. These results suggest the occurrence of an early hybridization event(s) in the lineage leading to the Ogasawara figs. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. bc-GenExMiner 3.0: new mining module computes breast cancer gene expression correlation analyses.

    Science.gov (United States)

    Jézéquel, Pascal; Frénel, Jean-Sébastien; Campion, Loïc; Guérin-Charbonnel, Catherine; Gouraud, Wilfried; Ricolleau, Gabriel; Campone, Mario

    2013-01-01

    We recently developed a user-friendly web-based application called bc-GenExMiner (http://bcgenex.centregauducheau.fr), which offered the possibility to evaluate prognostic informativity of genes in breast cancer by means of a 'prognostic module'. In this study, we develop a new module called 'correlation module', which includes three kinds of gene expression correlation analyses. The first one computes correlation coefficient between 2 or more (up to 10) chosen genes. The second one produces two lists of genes that are most correlated (positively and negatively) to a 'tested' gene. A gene ontology (GO) mining function is also proposed to explore GO 'biological process', 'molecular function' and 'cellular component' terms enrichment for the output lists of most correlated genes. The third one explores gene expression correlation between the 15 telomeric and 15 centromeric genes surrounding a 'tested' gene. These correlation analyses can be performed in different groups of patients: all patients (without any subtyping), in molecular subtypes (basal-like, HER2+, luminal A and luminal B) and according to oestrogen receptor status. Validation tests based on published data showed that these automatized analyses lead to results consistent with studies' conclusions. In brief, this new module has been developed to help basic researchers explore molecular mechanisms of breast cancer. DATABASE URL: http://bcgenex.centregauducheau.fr
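
    The "correlation module" functionality described here (pairwise gene-gene correlation and ranked lists of the most correlated genes) can be mimicked in a few lines of pandas on an expression matrix; the gene names and values below are synthetic and this is not the bc-GenExMiner implementation.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)

# Hypothetical expression matrix: rows = tumours, columns = genes.
genes = [f"GENE{i:03d}" for i in range(50)]
expr = pd.DataFrame(rng.normal(size=(200, 50)), columns=genes)
expr["GENE001"] = expr["GENE000"] * 0.8 + rng.normal(0, 0.5, 200)  # build in a correlation

# 1) Correlation coefficient between two (or more) chosen genes.
print(expr[["GENE000", "GENE001"]].corr())

# 2) Genes most positively and negatively correlated with a "tested" gene.
corr_with_tested = expr.corrwith(expr["GENE000"]).drop("GENE000").sort_values()
print("most negatively correlated:\n", corr_with_tested.head(3))
print("most positively correlated:\n", corr_with_tested.tail(3))
```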

  13. (18)F-FDG positron-emission tomography/computed tomography findings of radiographic lesions suggesting old healed tuberculosis.

    Science.gov (United States)

    Jeong, Yun-Jeong; Paeng, Jin Chul; Nam, Hyun-Yeol; Lee, Ji Sun; Lee, Sang-Min; Yoo, Chul-Gyu; Kim, Young Whan; Han, Sung Koo; Yim, Jae-Joon

    2014-03-01

    The presence of radiographic lesions suggesting old healed tuberculosis (TB) is one of the strongest risk factors for the subsequent development of active TB. We elucidated the metabolic activity of radiographic lesions suggesting old healed TB using (18)F-fluorodeoxyglucose positron emission tomography/computed tomography ((18)F-FDG PET/CT). This cross-sectional study included 63 participants with radiographic lesions suggesting old healed TB and with available (18)F-FDG PET/CT scans. The maximum standardized uptake value (SUVmax) measured in the lesions, the clinical characteristics, results of the tuberculin skin test (TST) and interferon-γ release assay (IGRA) were analyzed. The SUVmax in old healed TB was 1.5 or higher among nine (14.3%) participants. Age (adjusted odds ratio [aOR], 1.23; 95% CI, 1.03-1.46), history of previous TB (aOR, 60.43; 95% CI, 1.71-2131.65), and extent of the lesions (aOR, 1.34; 95% CI, 1.02-1.75) were associated with higher SUVmax. The positive rates for the TST and IGRA were not different between groups with and without increased FDG uptake. Increased FDG uptake on (18)F-FDG PET/CT was observed in a subset of patients with radiographic lesions suggesting old healed TB. Given that the factors associated with increased FDG uptake are known risk factors for TB development, the possibility exists that participants with old healed TB lesions with higher SUV on (18)F-FDG PET/CT scans might be at higher risk for active TB.

  14. Impact analyses after pipe rupture

    International Nuclear Information System (INIS)

    Chun, R.C.; Chuang, T.Y.

    1983-01-01

    Two of the French pipe whip experiments are reproduced with the computer code WIPS. The WIPS results are in good agreement with the experimental data and the French computer code TEDEL. This justifies the use of its pipe element in conjunction with its U-bar element in a simplified method of impact analyses

  15. Advanced toroidal facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions made it possible to analyze the entire vessel mathematically with only 1/12 of the vessel geometry, allowing the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs

  16. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  17. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes

  18. Cross-Disorder Genome-Wide Analyses Suggest a Complex Genetic Relationship Between Tourette Syndrome and Obsessive-Compulsive Disorder

    Science.gov (United States)

    Yu, Dongmei; Mathews, Carol A.; Scharf, Jeremiah M.; Neale, Benjamin M.; Davis, Lea K.; Gamazon, Eric R.; Derks, Eske M.; Evans, Patrick; Edlund, Christopher K.; Crane, Jacquelyn; Fagerness, Jesen A.; Osiecki, Lisa; Gallagher, Patience; Gerber, Gloria; Haddad, Stephen; Illmann, Cornelia; McGrath, Lauren M.; Mayerfeld, Catherine; Arepalli, Sampath; Barlassina, Cristina; Barr, Cathy L.; Bellodi, Laura; Benarroch, Fortu; Berrió, Gabriel Bedoya; Bienvenu, O. Joseph; Black, Donald; Bloch, Michael H.; Brentani, Helena; Bruun, Ruth D.; Budman, Cathy L.; Camarena, Beatriz; Campbell, Desmond D.; Cappi, Carolina; Cardona Silgado, Julio C.; Cavallini, Maria C.; Chavira, Denise A.; Chouinard, Sylvain; Cook, Edwin H.; Cookson, M. R.; Coric, Vladimir; Cullen, Bernadette; Cusi, Daniele; Delorme, Richard; Denys, Damiaan; Dion, Yves; Eapen, Valsama; Egberts, Karin; Falkai, Peter; Fernandez, Thomas; Fournier, Eduardo; Garrido, Helena; Geller, Daniel; Gilbert, Donald; Girard, Simon L.; Grabe, Hans J.; Grados, Marco A.; Greenberg, Benjamin D.; Gross-Tsur, Varda; Grünblatt, Edna; Hardy, John; Heiman, Gary A.; Hemmings, Sian M.J.; Herrera, Luis D.; Hezel, Dianne M.; Hoekstra, Pieter J.; Jankovic, Joseph; Kennedy, James L.; King, Robert A.; Konkashbaev, Anuar I.; Kremeyer, Barbara; Kurlan, Roger; Lanzagorta, Nuria; Leboyer, Marion; Leckman, James F.; Lennertz, Leonhard; Liu, Chunyu; Lochner, Christine; Lowe, Thomas L.; Lupoli, Sara; Macciardi, Fabio; Maier, Wolfgang; Manunta, Paolo; Marconi, Maurizio; McCracken, James T.; Mesa Restrepo, Sandra C.; Moessner, Rainald; Moorjani, Priya; Morgan, Jubel; Muller, Heike; Murphy, Dennis L.; Naarden, Allan L.; Ochoa, William Cornejo; Ophoff, Roel A.; Pakstis, Andrew J.; Pato, Michele T.; Pato, Carlos N.; Piacentini, John; Pittenger, Christopher; Pollak, Yehuda; Rauch, Scott L.; Renner, Tobias; Reus, Victor I.; Richter, Margaret A.; Riddle, Mark A.; Robertson, Mary M.; Romero, Roxana; Rosário, Maria C.; Rosenberg, David; Ruhrmann, Stephan; Sabatti, Chiara; Salvi, Erika; Sampaio, Aline S.; Samuels, Jack; Sandor, Paul; Service, Susan K.; Sheppard, Brooke; Singer, Harvey S.; Smit, Jan H.; Stein, Dan J.; Strengman, Eric; Tischfield, Jay A.; Turiel, Maurizio; Valencia Duarte, Ana V.; Vallada, Homero; Veenstra-VanderWeele, Jeremy; Walitza, Susanne; Walkup, John; Wang, Ying; Weale, Mike; Weiss, Robert; Wendland, Jens R.; Westenberg, Herman G.M.; Yao, Yin; Hounie, Ana G.; Miguel, Euripedes C.; Nicolini, Humberto; Wagner, Michael; Ruiz-Linares, Andres; Cath, Danielle C.; McMahon, William; Posthuma, Danielle; Oostra, Ben A.; Nestadt, Gerald; Rouleau, Guy A.; Purcell, Shaun; Jenike, Michael A.; Heutink, Peter; Hanna, Gregory L.; Conti, David V.; Arnold, Paul D.; Freimer, Nelson; Stewart, S. Evelyn; Knowles, James A.; Cox, Nancy J.; Pauls, David L.

    2014-01-01

    Obsessive-compulsive disorder (OCD) and Tourette Syndrome (TS) are highly heritable neurodevelopmental disorders that are thought to share genetic risk factors. However, the identification of definitive susceptibility genes for these etiologically complex disorders remains elusive. Here, we report a combined genome-wide association study (GWAS) of TS and OCD in 2723 cases (1310 with OCD, 834 with TS, 579 with OCD plus TS/chronic tics (CT)), 5667 ancestry-matched controls, and 290 OCD parent-child trios. Although no individual single nucleotide polymorphisms (SNPs) achieved genome-wide significance, the GWAS signals were enriched for SNPs strongly associated with variations in brain gene expression levels, i.e. expression quantitative trait loci (eQTLs), suggesting the presence of true functional variants that contribute to risk of these disorders. Polygenic score analyses identified a significant polygenic component for OCD (p=2×10−4), predicting 3.2% of the phenotypic variance in an independent data set. In contrast, TS had a smaller, non-significant polygenic component, predicting only 0.6% of the phenotypic variance (p=0.06). No significant polygenic signal was detected across the two disorders, although the sample is likely underpowered to detect a modest shared signal. Furthermore, the OCD polygenic signal was significantly attenuated when cases with both OCD and TS/CT were included in the analysis (p=0.01). Previous work has shown that TS and OCD have some degree of shared genetic variation. However, the data from this study suggest that there are also distinct components to the genetic architectures of TS and OCD. Furthermore, OCD with co-occurring TS/CT may have different underlying genetic susceptibility compared to OCD alone. PMID:25158072

  19. Recommendations for computer code selection of a flow and transport code to be used in undisturbed vadose zone calculations for TWRS immobilized wastes environmental analyses

    International Nuclear Information System (INIS)

    VOOGD, J.A.

    1999-01-01

    An analysis of three software proposals is performed to recommend a computer code for immobilized low activity waste flow and transport modeling. The document uses the criteria established in HNF-1839, ''Computer Code Selection Criteria for Flow and Transport Codes to be Used in Undisturbed Vadose Zone Calculation for TWRS Environmental Analyses,'' as the basis for this analysis.

  20. Investigation of mixed mode - I/II fracture problems - Part 1: computational and experimental analyses

    Directory of Open Access Journals (Sweden)

    O. Demir

    2016-01-01

    Full Text Available In this study, to investigate and understand the nature of fracture behavior properly under in-plane mixed mode (Mode-I/II) loading, three-dimensional fracture analyses and experiments of compact tension shear (CTS) specimens are performed under different mixed mode loading conditions. Al 7075-T651 aluminum machined from rolled plates in the L-T rolling direction (crack plane perpendicular to the rolling direction) is used in this study. Results from the finite element analyses, together with the fracture loads and crack deflection angles obtained from the experiments, are presented. To simulate the real conditions in the experiments, contacts are defined between the contact surfaces of the loading devices, specimen and loading pins. Modeling, meshing and the solution of the problem involving the whole assembly, i.e., loading devices, pins and the specimen, with contact mechanics are performed using ANSYS. Then, the CTS specimen is analyzed separately using a submodeling approach, in which three-dimensional enriched finite elements are used in the FRAC3D solver to calculate the resulting stress intensity factors along the crack front. Having performed the detailed computational and experimental studies on the CTS specimen, a new specimen type together with its loading device is also proposed that has smaller dimensions compared to the regular CTS specimen. Experimental results for the new specimen are also presented.
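    The record reports crack deflection angles together with mixed-mode stress intensity factors. As a hedged aside (the abstract does not state which kink criterion, if any, was applied), the classical maximum tangential stress (MTS) criterion of Erdogan and Sih is one common way to estimate a deflection angle from K_I and K_II:

    ```python
    # Rough illustration only (an assumption, not necessarily the paper's
    # procedure): MTS crack kink angle from mixed-mode stress intensity factors.
    import math

    def mts_deflection_angle(KI, KII):
        """Kink angle (radians) solving K_I sin(t) + K_II (3 cos(t) - 1) = 0."""
        if KII == 0.0:
            return 0.0  # pure mode I: crack grows straight ahead
        return 2.0 * math.atan((KI - math.sqrt(KI**2 + 8.0 * KII**2)) / (4.0 * KII))

    # pure mode II gives the classical ~ -70.5 degrees for positive K_II
    print(math.degrees(mts_deflection_angle(0.0, 1.0)))
    print(math.degrees(mts_deflection_angle(1.0, 0.5)))
    ```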

  1. Factor structure of suggestibility revisited: new evidence for direct and indirect suggestibility

    Directory of Open Access Journals (Sweden)

    Romuald Polczyk

    2016-05-01

    Full Text Available Background Yielding to suggestions can be viewed as a relatively stable individual trait, called suggestibility. It has long been proposed that there are two kinds of suggestible influence, and two kinds of suggestibility corresponding to them: direct and indirect. Direct suggestion involves overt, unhidden influence, while indirect suggestion concerns influence that is hidden, where the participant does not know that suggestibility is being measured. So far, however, empirical evidence for the existence of the two factors has been scarce. In the present study, more sophisticated and reliable tools for measuring suggestibility were applied than in previous research, in the hope that better measurement would reveal the factor structure of suggestibility. Two tests of direct suggestibility were used: the Harvard Group Scale of Hypnotic Susceptibility, Form A, measuring hypnotic susceptibility, and the Barber Suggestibility Scale, measuring non-hypnotic direct imaginative suggestibility. Three tests served to measure indirect suggestibility: the Sensory Suggestibility Scale, measuring indirect suggestibility relating to perception; the Gudjonsson Suggestibility Scale, measuring the tendency to yield to suggestive questions and to change answers after negative feedback; and the Emotional Dialogs Tests, measuring the tendency to perceive nonexistent aggression. Participants and procedure In sum, 115 participants were tested (69 women, 49 men; mean age 22.20 years, SD = 2.20). Participants were tested in two sessions, lasting for a total of four hours. Results Confirmatory factor analyses confirmed the existence of two uncorrelated factors of suggestibility: direct and indirect. Conclusions Suggestibility may indeed involve two factors, direct and indirect, and the failure to discover them in previous research may be due to methodological problems.

  2. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    To enable a computer to process information in human languages, ... An unsupervised morphological analyser (UMA) would learn how to analyse a language just by looking ... result for English, but they did remarkably worse for Finnish and Turkish.

  3. Computer codes developed in FRG to analyse hypothetical meltdown accidents

    International Nuclear Information System (INIS)

    Hassmann, K.; Hosemann, J.P.; Koerber, H.; Reineke, H.

    1978-01-01

    It is the purpose of this paper to give the status of all significant computer codes developed in the core meltdown project, which is incorporated in the light water reactor safety research program of the Federal Ministry of Research and Technology. For standard pressurized water reactors, results of some computer codes will be presented, describing the course and the duration of the hypothetical core meltdown accident. (author)

  4. Electroencephalographic power and coherence analyses suggest altered brain function in abstinent male heroin-dependent patients

    NARCIS (Netherlands)

    Franken, Ingmar H. A.; Stam, Cornelis J.; Hendriks, Vincent M.; van den Brink, Wim

    2004-01-01

    Previous studies have shown that drug abuse is associated with altered brain function. However, studies of heroin abuse-related brain dysfunctions are scarce. Electroencephalographic (EEG) power and coherence analyses are two important tools for examining the effects of drugs on brain function. In

  5. RELAP5 analyses of overcooling transients in a pressurized water reactor

    International Nuclear Information System (INIS)

    Bolander, M.A.; Fletcher, C.D.; Ogden, D.M.; Stitt, B.D.; Waterman, M.E.

    1983-01-01

    In support of the Pressurized Thermal Shock Integration Study sponsored by the United States Nuclear Regulatory Commission, the Idaho National Engineering Laboratory has performed analyses of overcooling transients using the RELAP5/MOD1.5 computer code. These analyses were performed for Oconee Plants 1 and 3, which are pressurized water reactors of Babcock and Wilcox lowered-loop design. Results of the RELAP5 analyses are presented, including a comparison with plant data. The capabilities and limitations of the RELAP5/MOD1.5 computer code in analyzing integral plant transients are examined. These analyses require detailed thermal-hydraulic and control system computer models

  6. Heritability and demographic analyses in the large isolated population of Val Borbera suggest advantages in mapping complex traits genes.

    Directory of Open Access Journals (Sweden)

    Michela Traglia

    2009-10-01

    Full Text Available Isolated populations are a useful resource for mapping complex traits due to shared stable environment, reduced genetic complexity and extended Linkage Disequilibrium (LD) compared to the general population. Here we describe a large genetic isolate from the North West Apennines, the mountain range that runs through Italy from the North West Alps to the South. The study involved 1,803 people living in 7 villages of the upper Borbera Valley. For this large population cohort, data from genealogy reconstruction, medical questionnaires, blood, anthropometric and bone status QUS parameters were evaluated. Demographic and epidemiological analyses indicated a substantial genetic component contributing to each trait variation as well as overlapping genetic determinants and family clustering for some traits. The data provide evidence for significant heritability of medically relevant traits that will be important in mapping quantitative traits. We suggest that this population isolate is suitable to identify rare variants associated with complex phenotypes that may be difficult to study in larger but more heterogeneous populations.

  7. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE

  8. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE.

  9. Towards Reproducible Research Data Analyses in LHC Particle Physics

    CERN Document Server

    Simko, Tibor

    2017-01-01

    The reproducibility of the research data analysis requires having access not only to the original datasets, but also to the computing environment, the analysis software and the workflow used to produce the original results. We present the nascent CERN Analysis Preservation platform with a set of tools developed to support particle physics researchers in preserving the knowledge around analyses so that capturing, sharing, reusing and reinterpreting data becomes easier. The presentation will focus on three pillars: (i) capturing structured knowledge information about data analysis processes; (ii) capturing the computing environment, the software code, the datasets, the configuration and other information assets used in data analyses; (iii) re-instantiating preserved analyses on a containerised computing cloud for the purposes of re-validation and re-interpretation.

  10. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Miscellaneous -- Volume 3, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Petrie, L.M.; Jordon, W.C. [Oak Ridge National Lab., TN (United States)]; Edwards, A.L. [Oak Ridge National Lab., TN (United States); Lawrence Livermore National Lab., CA (United States)]; and others

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the data libraries and subroutine libraries.

  11. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Miscellaneous -- Volume 3, Revision 4

    International Nuclear Information System (INIS)

    Petrie, L.M.; Jordon, W.C.; Edwards, A.L.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the data libraries and subroutine libraries.

  12. In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation

    Directory of Open Access Journals (Sweden)

    Vincenzo G. Fiore

    2017-08-01

    Full Text Available The central complex in the insect brain is a composite of midline neuropils involved in processing sensory cues and mediating behavioral outputs to orchestrate spatial navigation. Despite recent advances, however, the neural mechanisms underlying sensory integration and motor action selections have remained largely elusive. In particular, it is not yet understood how the central complex exploits sensory inputs to realize motor functions associated with spatial navigation. Here we report an in silico interrogation of central complex-mediated spatial navigation with a special emphasis on the ellipsoid body. Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs. Our simulations show integration of multiple sensory sources can be effectively performed in the ellipsoid body. This processed information is used to trigger continuous sequences of action selections resulting in self-motion, obstacle avoidance and the navigation of simulated environments of varying complexity. The motor responses to perceived sensory stimuli can be stored in the neural structure of the central complex to simulate navigation relying on a collective of guidance cues, akin to sensory-driven innate or habitual behaviors. By comparing behaviors under different conditions of accessible sources of input information, we show the simulated insect computes visual inputs and body posture to estimate its position in space. Finally, we tested whether the local connectome of the central complex might also allow the flexibility required to recall an intentional behavioral sequence, among different courses of actions. Our simulations suggest that the central complex can encode combined representations of motor and spatial information to pursue a goal and thus successfully guide orientation behavior. Together, the observed

  13. In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation.

    Science.gov (United States)

    Fiore, Vincenzo G; Kottler, Benjamin; Gu, Xiaosi; Hirth, Frank

    2017-01-01

    The central complex in the insect brain is a composite of midline neuropils involved in processing sensory cues and mediating behavioral outputs to orchestrate spatial navigation. Despite recent advances, however, the neural mechanisms underlying sensory integration and motor action selections have remained largely elusive. In particular, it is not yet understood how the central complex exploits sensory inputs to realize motor functions associated with spatial navigation. Here we report an in silico interrogation of central complex-mediated spatial navigation with a special emphasis on the ellipsoid body. Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs. Our simulations show integration of multiple sensory sources can be effectively performed in the ellipsoid body. This processed information is used to trigger continuous sequences of action selections resulting in self-motion, obstacle avoidance and the navigation of simulated environments of varying complexity. The motor responses to perceived sensory stimuli can be stored in the neural structure of the central complex to simulate navigation relying on a collective of guidance cues, akin to sensory-driven innate or habitual behaviors. By comparing behaviors under different conditions of accessible sources of input information, we show the simulated insect computes visual inputs and body posture to estimate its position in space. Finally, we tested whether the local connectome of the central complex might also allow the flexibility required to recall an intentional behavioral sequence, among different courses of actions. Our simulations suggest that the central complex can encode combined representations of motor and spatial information to pursue a goal and thus successfully guide orientation behavior. Together, the observed computational features

  14. Analyses and computer code developments for accident-induced thermohydraulic transients in water-cooled nuclear reactor systems

    International Nuclear Information System (INIS)

    Wulff, W.

    1977-01-01

    A review is presented on the development of analyses and computer codes for the prediction of thermohydraulic transients in nuclear reactor systems. Models for the dynamics of two-phase mixtures are summarized. Principles of process, reactor component and reactor system modeling are presented, as well as the verification of these models by comparing predicted results with experimental data. Codes of major importance are described, which have recently been developed or are presently under development. The characteristics of these codes are presented in terms of governing equations, solution techniques and code structure. Current efforts and problems of code verification are discussed. A summary is presented of advances which are necessary for reducing the conservatism currently implied in reactor hydraulics codes for safety assessment

  15. TITAN: a computer program for accident occurrence frequency analyses by component Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasushi [Department of Fuel Cycle Safety Research, Nuclear Safety Research Center, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Tamaki, Hitoshi [Department of Safety Research Technical Support, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Kanai, Shigeru [Fuji Research Institute Corporation, Tokyo (Japan)

    2000-04-01

    In a plant system consisting of complex equipment and components for a reprocessing facility, there might be grace time between an initiating event and a resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such a complex reliability model, including the grace time, without difficulty and to obtain an accident occurrence frequency. Firstly, the basic methods of the component Monte Carlo simulation used to obtain an accident occurrence frequency are introduced; then the basic performance, such as precision, convergence, and parallelization of the calculation, is shown through calculation of a prototype accident sequence model. As an example to illustrate applicability to a real-scale plant model, a red oil explosion in a German reprocessing plant model is simulated to show that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to show further performance, and a proposal is made for introducing a new input-data format adapted to the component Monte Carlo simulation. The present paper describes the calculational method, performance, applicability to a real scale, and the new proposal for the TITAN code. In the Appendixes, a conventional analytical method is shown that avoids the complex and laborious calculation otherwise needed to obtain a strict solution of the accident occurrence frequency for comparison with the Monte Carlo method. The user's manual and the list/structure of the program are also contained in the Appendixes to facilitate use of the TITAN computer program. (author)
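    A minimal sketch of a component-level Monte Carlo estimate of an accident occurrence frequency with a grace time, in the spirit of the approach described above (the failure/recovery model and all parameter values are hypothetical and are not taken from the TITAN documentation):

    ```python
    # Hypothetical component Monte Carlo sketch: an accident is counted when an
    # initiating event occurs during the mission, the safeguard is already
    # failed, and operators do not recover within the grace time.
    import random

    def simulate_accident_frequency(n_trials=100_000, mission_time=8760.0,
                                    lam_initiator=1e-3, lam_safeguard=1e-2,
                                    grace_time=8.0, p_operator_success=0.9):
        """Fraction of missions ending in an accident under this toy model."""
        accidents = 0
        for _ in range(n_trials):
            t_init = random.expovariate(lam_initiator)
            if t_init > mission_time:
                continue  # no initiating event during the mission
            safeguard_failed = random.expovariate(lam_safeguard) < t_init
            recovered = (random.random() < p_operator_success and
                         random.expovariate(1.0 / grace_time) < grace_time)
            if safeguard_failed and not recovered:
                accidents += 1
        return accidents / n_trials

    print(simulate_accident_frequency())
    ```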

  16. TITAN: a computer program for accident occurrence frequency analyses by component Monte Carlo simulation

    International Nuclear Information System (INIS)

    Nomura, Yasushi; Tamaki, Hitoshi; Kanai, Shigeru

    2000-04-01

    In a plant system consisting of complex equipment and components for a reprocessing facility, there might be grace time between an initiating event and a resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such a complex reliability model, including the grace time, without difficulty and to obtain an accident occurrence frequency. Firstly, the basic methods of the component Monte Carlo simulation used to obtain an accident occurrence frequency are introduced; then the basic performance, such as precision, convergence, and parallelization of the calculation, is shown through calculation of a prototype accident sequence model. As an example to illustrate applicability to a real-scale plant model, a red oil explosion in a German reprocessing plant model is simulated to show that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to show further performance, and a proposal is made for introducing a new input-data format adapted to the component Monte Carlo simulation. The present paper describes the calculational method, performance, applicability to a real scale, and the new proposal for the TITAN code. In the Appendixes, a conventional analytical method is shown that avoids the complex and laborious calculation otherwise needed to obtain a strict solution of the accident occurrence frequency for comparison with the Monte Carlo method. The user's manual and the list/structure of the program are also contained in the Appendixes to facilitate use of the TITAN computer program. (author)

  17. Interpretive analysis of 85 systematic reviews suggests that narrative syntheses and meta‐analyses are incommensurate in argumentation

    Science.gov (United States)

    O'Mara‐Eves, A.; Thomas, J.; Brunton, G.; Caird, J.; Petticrew, M.

    2016-01-01

    Using Toulmin's argumentation theory, we analysed the texts of systematic reviews in the area of workplace health promotion to explore differences in the modes of reasoning embedded in reports of narrative synthesis as compared with reports of meta‐analysis. We used framework synthesis, grounded theory and cross‐case analysis methods to analyse 85 systematic reviews addressing intervention effectiveness in workplace health promotion. Two core categories, or ‘modes of reasoning’, emerged to frame the contrast between narrative synthesis and meta‐analysis: practical–configurational reasoning in narrative synthesis (‘what is going on here? What picture emerges?’) and inferential–predictive reasoning in meta‐analysis (‘does it work, and how well? Will it work again?’). Modes of reasoning examined quality and consistency of the included evidence differently. Meta‐analyses clearly distinguished between warrant and claim, whereas narrative syntheses often presented joint warrant–claims. Narrative syntheses and meta‐analyses represent different modes of reasoning. Systematic reviewers are likely to be addressing research questions in different ways with each method. It is important to consider narrative synthesis in its own right as a method and to develop specific quality criteria and understandings of how it is carried out, not merely as a complement to, or second‐best option for, meta‐analysis. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd. PMID:27860329

  18. Bringing computational science to the public.

    Science.gov (United States)

    McDonagh, James L; Barker, Daniel; Alderson, Rosanna G

    2016-01-01

    The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.

  19. Analysis of electronic circuits using digital computers; L'analyse des circuits electroniques par les calculateurs numeriques

    Energy Technology Data Exchange (ETDEWEB)

    Tapu, C. [Commissariat a l'Energie Atomique, Saclay (France). Centre d'Etudes Nucleaires]

    1968-07-01

    Various programmes have been proposed for studying electronic circuits with the help of computers. It is shown here how it is possible to use the programme ECAP, developed by I.B.M., to study the behaviour of an operational amplifier from several points of view: direct current, alternating current and transient state analysis, optimisation of the gain in open loop, and study of the reliability. (author)
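    A minimal sketch of the kind of DC analysis such circuit programs perform: nodal analysis of a linear resistive network, solving G·v = i for the node voltages (the two-resistor example below is invented, not a circuit from the report):

    ```python
    # Nodal analysis sketch for a linear resistive circuit (invented example).
    import numpy as np

    def dc_nodal_analysis(conductance_matrix, current_injections):
        """Solve the nodal equations G v = i for the unknown node voltages."""
        return np.linalg.solve(conductance_matrix, current_injections)

    # divider driven by a 1 mA source: node1 -R1(1k)- node2 -R2(2k)- ground
    G1, G2 = 1.0 / 1e3, 1.0 / 2e3
    G = np.array([[G1, -G1],
                  [-G1, G1 + G2]])
    i = np.array([1e-3, 0.0])          # 1 mA injected into node 1
    print(dc_nodal_analysis(G, i))     # node voltages in volts (3.0, 2.0)
    ```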

  20. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Control modules -- Volume 1, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Landers, N.F.; Petrie, L.M.; Knight, J.R. [Oak Ridge National Lab., TN (United States)]; and others

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3 for the documentation of the data libraries and subroutine libraries.

  1. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Control modules -- Volume 1, Revision 4

    International Nuclear Information System (INIS)

    Landers, N.F.; Petrie, L.M.; Knight, J.R.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3 for the documentation of the data libraries and subroutine libraries

  2. Computer simulations suggest that acute correction of hyperglycaemia with an insulin bolus protocol might be useful in brain FDG PET

    Energy Technology Data Exchange (ETDEWEB)

    Buchert, R.; Brenner, W.; Apostolova, I.; Mester, J.; Clausen, M. [University Medical Center Hamburg-Eppendorf (Germany). Dept. of Nuclear Medicine; Santer, R. [University Medical Center Hamburg-Eppendorf (Germany). Center for Gynaecology, Obstetrics and Paediatrics; Silverman, D.H.S. [David Geffen School of Medicine at UCLA, Los Angeles, CA (United States). Dept. of Molecular and Medical Pharmacology

    2009-07-01

    FDG PET in hyperglycaemic subjects often suffers from limited statistical image quality, which may hamper visual and quantitative evaluation. In our study the following insulin bolus protocol is proposed for acute correction of hyperglycaemia (> 7.0 mmol/l) in brain FDG PET. (i) Intravenous bolus injection of short-acting insulin, one I.E. for each 0.6 mmol/l blood glucose above 7.0. (ii) If 20 min after insulin administration plasma glucose is ≤ 7.0 mmol/l, proceed to (iii). If insulin has not taken sufficient effect, step back to (i). Compute the insulin dose with the updated blood glucose level. (iii) Wait a further 20 min before injection of FDG. (iv) Continuous supervision of the patient during the whole scanning procedure. The potential of this protocol for improvement of image quality in brain FDG PET in hyperglycaemic subjects was evaluated by computer simulations within the Sokoloff model. A plausibility check of the prediction of the computer simulations on the magnitude of the effect that might be achieved by correction of hyperglycaemia was performed by retrospective evaluation of the relation between blood glucose level and brain FDG uptake in 89 subjects in whom FDG PET had been performed for diagnosis of Alzheimer's disease. The computer simulations suggested that acute correction of hyperglycaemia according to the proposed bolus insulin protocol might increase the FDG uptake of the brain by up to 80%. The magnitude of this effect was confirmed by the patient data. The proposed management protocol for acute correction of hyperglycaemia with insulin has the potential to significantly improve the statistical quality of brain FDG PET images. This should be confirmed in a prospective study in patients. (orig.)
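    The dose arithmetic in step (i) of the protocol is simple: one unit of short-acting insulin per 0.6 mmol/l of blood glucose above 7.0 mmol/l, recomputed from the updated glucose value if the target is not reached. A minimal sketch (rounding to whole units is an assumption, not part of the published protocol):

    ```python
    # Illustration of the bolus dose arithmetic described in the record.
    def insulin_bolus_units(glucose_mmol_per_l, target=7.0, step=0.6):
        """Insulin units for acute correction of hyperglycaemia."""
        excess = glucose_mmol_per_l - target
        if excess <= 0:
            return 0
        return round(excess / step)  # rounding is an assumption for illustration

    for g in (7.5, 9.4, 12.0):
        print(g, "mmol/l ->", insulin_bolus_units(g), "units")
    ```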

  3. Computer simulations suggest that acute correction of hyperglycaemia with an insulin bolus protocol might be useful in brain FDG PET

    International Nuclear Information System (INIS)

    Buchert, R.; Brenner, W.; Apostolova, I.; Mester, J.; Clausen, M.; Santer, R.; Silverman, D.H.S.

    2009-01-01

    FDG PET in hyperglycaemic subjects often suffers from limited statistical image quality, which may hamper visual and quantitative evaluation. In our study the following insulin bolus protocol is proposed for acute correction of hyperglycaemia (> 7.0 mmol/l) in brain FDG PET. (i) Intravenous bolus injection of short-acting insulin, one I.E. for each 0.6 mmol/l blood glucose above 7.0. (ii) If 20 min after insulin administration plasma glucose is ≤ 7.0 mmol/l, proceed to (iii). If insulin has not taken sufficient effect, step back to (i). Compute the insulin dose with the updated blood glucose level. (iii) Wait a further 20 min before injection of FDG. (iv) Continuous supervision of the patient during the whole scanning procedure. The potential of this protocol for improvement of image quality in brain FDG PET in hyperglycaemic subjects was evaluated by computer simulations within the Sokoloff model. A plausibility check of the prediction of the computer simulations on the magnitude of the effect that might be achieved by correction of hyperglycaemia was performed by retrospective evaluation of the relation between blood glucose level and brain FDG uptake in 89 subjects in whom FDG PET had been performed for diagnosis of Alzheimer's disease. The computer simulations suggested that acute correction of hyperglycaemia according to the proposed bolus insulin protocol might increase the FDG uptake of the brain by up to 80%. The magnitude of this effect was confirmed by the patient data. The proposed management protocol for acute correction of hyperglycaemia with insulin has the potential to significantly improve the statistical quality of brain FDG PET images. This should be confirmed in a prospective study in patients. (orig.)

  4. A response-modeling alternative to surrogate models for support in computational analyses

    International Nuclear Information System (INIS)

    Rutherford, Brian

    2006-01-01

    Often, the objectives in a computational analysis involve characterization of system performance based on some function of the computed response. In general, this characterization includes (at least) an estimate or prediction for some performance measure and an estimate of the associated uncertainty. Surrogate models can be used to approximate the response in regions where simulations were not performed. For most surrogate modeling approaches, however, (1) estimates are based on smoothing of available data and (2) uncertainty in the response is specified in a point-wise (in the input space) fashion. These aspects of surrogate model construction might limit their capabilities. One alternative is to construct a probability measure, G(r), for the computer response, r, based on available data. This 'response-modeling' approach permits probability estimation for an arbitrary event, E(r), based on the computer response. In this general setting, event probabilities can be computed as prob(E) = ∫ I(E(r)) dG(r), where I is the indicator function and the integral is taken over the response space. Furthermore, one can use G(r) to calculate an induced distribution on a performance measure, pm. For prediction problems where the performance measure is a scalar, its distribution F_pm is determined by F_pm(z) = ∫ I(pm(r) ≤ z) dG(r). We introduce response models for scalar computer output and then generalize the approach to more complicated responses that utilize multiple response models.
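    A minimal sketch of the response-modeling idea above: once the probability measure G for the computed response r is available (approximated here by samples drawn from it, purely for illustration), event probabilities and the induced distribution of a scalar performance measure reduce to indicator averages:

    ```python
    # Sketch: estimate prob(E) and F_pm(z) from samples of a response measure G
    # (the sampling distribution below is a toy stand-in, not the paper's model).
    import numpy as np

    def event_probability(samples, event):
        """prob(E) = integral of I(E(r)) dG(r), estimated as a sample mean."""
        return np.mean([event(r) for r in samples])

    def performance_cdf(samples, pm, z):
        """F_pm(z) = integral of I(pm(r) <= z) dG(r)."""
        return np.mean([pm(r) <= z for r in samples])

    # toy response model: r is a 2-component response drawn from G
    rng = np.random.default_rng(1)
    samples = rng.normal(loc=[1.0, 2.0], scale=[0.2, 0.5], size=(10_000, 2))
    print(event_probability(samples, lambda r: r[0] > 1.3))        # prob(E)
    print(performance_cdf(samples, lambda r: r[0] + r[1], 3.5))    # F_pm(3.5)
    ```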

  5. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  6. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  7. Steady-state and accident analyses of PBMR with the computer code SPECTRA

    International Nuclear Information System (INIS)

    Stempniewicz, Marek M.

    2002-01-01

    The SPECTRA code is an accident analysis code developed at NRG. It is designed for thermal-hydraulic analyses of nuclear or conventional power plants. The code is capable of analysing the whole power plant, including the reactor vessel, primary system, various control and safety systems, containment and reactor building. The aim of the work presented in this paper was to prepare a preliminary thermal-hydraulic model of the PBMR for SPECTRA and to perform steady-state and accident analyses. In order to assess SPECTRA's capability to model PBMR reactors, a model of the INCOGEN system was prepared first. Steady-state and accident scenarios were analyzed for the INCOGEN configuration, and the results were compared to those obtained earlier with INAS and OCTOPUS/PANTHERMIX; a good agreement was obtained. Accident analyses with the PBMR model also gave qualitatively good results. It is concluded that SPECTRA is a suitable tool for analyzing High Temperature Reactors such as INCOGEN or, for example, the PBMR (Pebble Bed Modular Reactor). Analyses of the INCOGEN and PBMR systems showed that in all analyzed cases the fuel temperatures remained within the acceptable limits. Consequently there is no danger of release of radioactivity to the environment, and it may be concluded that these are promising designs for future safe industrial reactors. (author)

  8. Massive Cloud Computing Processing of P-SBAS Time Series for Displacement Analyses at Large Spatial Scale

    Science.gov (United States)

    Casu, F.; de Luca, C.; Lanari, R.; Manunta, M.; Zinno, I.

    2016-12-01

    A methodology for computing surface deformation time series and mean velocity maps of large areas is presented. Our approach relies on the availability of a multi-temporal set of Synthetic Aperture Radar (SAR) data collected from ascending and descending orbits over an area of interest, and also permits estimation of the vertical and horizontal (East-West) displacement components of the Earth's surface. The adopted methodology is based on an advanced Cloud Computing implementation of the Differential SAR Interferometry (DInSAR) Parallel Small Baseline Subset (P-SBAS) processing chain, which allows the unsupervised processing of large SAR data volumes, from the raw data (level-0) imagery up to the generation of DInSAR time series and maps. The presented solution, which is highly scalable, has been tested on the ascending and descending ENVISAT SAR archives acquired over a large area of Southern California (US) that extends for about 90,000 km2. This input dataset has been processed in parallel by exploiting 280 computing nodes of the Amazon Web Services Cloud environment. Moreover, to produce the final mean deformation velocity maps of the vertical and East-West displacement components of the whole investigated area, we also took advantage of the information available from external GPS measurements, which makes it possible to account for regional trends not easily detectable by DInSAR and to refer the P-SBAS measurements to an external geodetic datum. The presented results clearly demonstrate the effectiveness of the proposed approach, which paves the way to the extensive use of the available ERS and ENVISAT SAR data archives. Furthermore, the proposed methodology is particularly suitable for dealing with the very large data flow provided by the Sentinel-1 constellation, thus permitting the DInSAR analyses to be extended to a nearly global scale. This work is partially supported by the DPC-CNR agreement, the EPOS-IP project and the ESA GEP project.

  9. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
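    A minimal sketch of the deterministic approach described above: first-order Taylor propagation of input means and variances through a model, using numerically estimated partial derivatives (the model function and values are placeholders, and independent inputs are assumed):

    ```python
    # First-order Taylor (delta-method) uncertainty propagation sketch.
    import numpy as np

    def propagate(model, means, variances, eps=1e-6):
        """Return (mean, variance) of model output, assuming independent inputs."""
        means = np.asarray(means, dtype=float)
        y0 = model(means)
        grads = np.empty_like(means)
        for i in range(means.size):
            x = means.copy()
            x[i] += eps
            grads[i] = (model(x) - y0) / eps       # partial derivative dy/dx_i
        variance = np.sum(grads**2 * np.asarray(variances))
        return y0, variance

    # placeholder model: y = x0 * exp(-x1)
    mean, var = propagate(lambda x: x[0] * np.exp(-x[1]), [2.0, 0.5], [0.04, 0.01])
    print(mean, var)
    ```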

  10. CFD analyses in regulatory practice

    International Nuclear Information System (INIS)

    Bloemeling, F.; Pandazis, P.; Schaffrath, A.

    2012-01-01

    Numerical software is used in nuclear regulatory procedures for many problems in the fields of neutron physics, structural mechanics, thermal hydraulics, etc. Among other things, the software is employed in dimensioning and designing systems and components and in simulating transients and accidents. In nuclear technology, analyses of this kind must meet strict requirements. Computational Fluid Dynamics (CFD) codes were developed for computing multidimensional flow processes of the type occurring in reactor cooling systems or in containments. Extensive experience has been accumulated by now for selected single-phase flow phenomena. At the present time, there is a need for development and validation with respect to the simulation of multi-phase and multi-component flows. As insufficient input by the user can lead to faulty results, the validity of the results and an assessment of uncertainties are guaranteed only through consistent application of so-called Best Practice Guidelines. The authors present the possibilities now available to CFD analyses in nuclear regulatory practice. This includes a discussion of the fundamental requirements to be met by numerical software, especially the demands made upon computational analysis by nuclear rules and regulations. In conclusion, two examples of the application of CFD analysis to nuclear problems are presented: determining deboration in the condenser reflux mode of operation, and protecting the reactor pressure vessel (RPV) against brittle failure. (orig.)

  11. Mitarbeiteranreizsysteme und Innovationserfolg (Employee suggestion schemes and innovation success)

    OpenAIRE

    Czarnitzki, Dirk; Kraft, Kornelius

    2008-01-01

    "We discuss the determinants of a successful implementation of an employee suggestion scheme and other measures to stimulate innovation success. Subsequently the effects of the employee suggestion schemes are investigated empirically. We analyse the realisation of cost reductions and alternatively sales expansion due to quality improvements. It turns out that employee suggestion schemes have a positive effect on cost efficiency and sales growth. Delegation of decision authority reduces produc...

  12. Improving Climate Communication through Comprehensive Linguistic Analyses Using Computational Tools

    Science.gov (United States)

    Gann, T. M.; Matlock, T.

    2014-12-01

    An important lesson from climate communication research is that there is no single way to reach out and inform the public. Different groups conceptualize climate issues in different ways, and different groups have different values and assumptions. This variability makes it extremely difficult to effectively and objectively communicate climate information. One of the main challenges is the following: how do we acquire a better understanding of how values and assumptions vary across groups, including political groups? A necessary starting point is to pay close attention to the linguistic content of messages used across current popular media sources. Careful analyses of that information, including how it is realized in language for conservative and progressive media, may ultimately help climate scientists, government agency officials, journalists and others develop more effective messages. Past research has looked at partisan media coverage of climate change, but little attention has been given to the fine-grained linguistic content of such media. And when researchers have done detailed linguistic analyses, they have relied primarily on hand-coding, an approach that is costly, labor intensive, and time-consuming. Our project, building on recent work on partisan news media (Gann & Matlock, 2014; under review), uses high-dimensional semantic analyses and other automated classification techniques from the field of natural language processing to quantify how climate issues are characterized in media sources that differ in political orientation. In addition to discussing varied linguistic patterns, we share new methods for improving climate communication for varied stakeholders, and for developing better assessments of their effectiveness.
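    A minimal sketch, under invented data, of the kind of automated text classification the project points to: a bag-of-words model that separates climate coverage from outlets of different political orientation (the tiny corpus and labels below are purely illustrative, not the project's data or pipeline):

    ```python
    # Toy partisan-media text classifier (illustrative corpus, not real data).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    texts = [
        "climate alarmism threatens jobs and energy prices",
        "regulation burdens on coal are driven by flawed models",
        "carbon emissions are warming the planet at an alarming rate",
        "scientists urge immediate action on greenhouse gases",
    ]
    labels = ["conservative", "conservative", "progressive", "progressive"]

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
    clf.fit(texts, labels)
    print(clf.predict(["new rules on emissions hurt the economy"]))
    ```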

  13. A suggestion for quality assessment in systematic reviews of observational studies in nutritional epidemiology

    Directory of Open Access Journals (Sweden)

    Jong-Myon Bae

    2016-04-01

    Full Text Available OBJECTIVES: It is important to control the quality level of the observational studies included when conducting meta-analyses. The Newcastle-Ottawa Scale (NOS) is a representative tool used for this purpose. We investigated the relationship between high quality (HQ), as defined using the NOS, and the results of subgroup analysis according to study design. METHODS: We selected systematic reviews with meta-analysis that performed a quality evaluation of observational studies of diet and cancer using the NOS. HQ determinations and the distribution of study designs were examined. Subgroup analyses according to quality level as defined by the NOS were also extracted. Equivalence was evaluated based on the summary effect size (sES) and 95% confidence intervals computed in the subgroup analysis. RESULTS: The meta-analysis results of the HQ and cohort groups were identical. The overall sES, which was obtained by combining the sES when equivalence was observed between the cohort and case-control groups, also showed equivalence. CONCLUSIONS: The results of this study suggest that it is more reasonable to control for quality level by performing subgroup analysis according to study design rather than by using HQ based on the NOS quality assessment tool.
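    A minimal sketch of the subgroup comparison described above: a fixed-effect, inverse-variance summary effect size (sES) with a 95% confidence interval computed separately for cohort and case-control studies (the effect sizes and standard errors below are invented for illustration):

    ```python
    # Fixed-effect inverse-variance meta-analysis sketch (invented inputs).
    import math

    def fixed_effect_summary(effects, ses):
        """Inverse-variance weighted summary effect and its 95% CI."""
        weights = [1.0 / se**2 for se in ses]
        s_es = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        se_summary = math.sqrt(1.0 / sum(weights))
        return s_es, (s_es - 1.96 * se_summary, s_es + 1.96 * se_summary)

    cohort = fixed_effect_summary([0.10, 0.25, 0.18], [0.08, 0.12, 0.10])
    case_control = fixed_effect_summary([0.30, 0.12], [0.15, 0.11])
    print("cohort:", cohort)
    print("case-control:", case_control)
    # equivalence could then be judged from the overlap of the two subgroup
    # confidence intervals, in the spirit of the review's approach
    ```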

  14. Computational biology for ageing

    Science.gov (United States)

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.

    2011-01-01

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530

  15. Two suggestions to improve the utilization of ISABELLE

    International Nuclear Information System (INIS)

    Thorndike, A.

    1976-01-01

    Two suggestions are outlined which are aimed at improving the efficiency of work in experimental areas by improving the information available to experimenters. The ideas relate to: (1) communications between the experimental hall and the data acquisition room; and (2) beam information via computer

  16. Standardized analyses of nuclear shipping containers

    International Nuclear Information System (INIS)

    Parks, C.V.; Hermann, O.W.; Petrie, L.M.; Hoffman, T.J.; Tang, J.S.; Landers, N.F.; Turner, W.D.

    1983-01-01

    This paper describes improved capabilities for analyses of nuclear fuel shipping containers within SCALE -- a modular code system for Standardized Computer Analyses for Licensing Evaluation. Criticality analysis improvements include the new KENO V, a code which contains an enhanced geometry package, and a new control module which uses KENO V and allows a criticality search on optimum pitch (maximum k-effective) to be performed. The SAS2 sequence is a new shielding analysis module which couples fuel burnup, source term generation, and radial cask shielding. The SAS5 shielding sequence allows a multidimensional Monte Carlo analysis of a shipping cask with code-generated biasing of the particle histories. The thermal analysis sequence (HTAS1) provides an easy-to-use tool for evaluating a shipping cask response to accident conditions. Together these sequences demonstrate the capability of the SCALE system to provide the cask designer or evaluator with a computational system offering automated procedures and easy-to-understand input that leads to standardization.
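    As a hedged illustration of a criticality search on optimum pitch, the sketch below performs a one-dimensional golden-section maximization of k-effective over lattice pitch; the k_eff(pitch) function is a smooth stand-in, since in SCALE the control module would run KENO V for each trial pitch (this wrapper is an assumption, not the module's interface):

    ```python
    # Golden-section search for the pitch that maximizes a stand-in k_eff curve.
    import math

    def golden_max(f, lo, hi, tol=1e-4):
        """Golden-section search for the maximizer of a unimodal function."""
        phi = (math.sqrt(5.0) - 1.0) / 2.0
        a, b = lo, hi
        c, d = b - phi * (b - a), a + phi * (b - a)
        while b - a > tol:
            if f(c) > f(d):
                b, d = d, c
                c = b - phi * (b - a)
            else:
                a, c = c, d
                d = a + phi * (b - a)
        return 0.5 * (a + b)

    # stand-in k-effective curve with an interior maximum near pitch = 1.6 cm
    k_eff = lambda pitch: 1.05 - 0.3 * (pitch - 1.6) ** 2
    print("optimum pitch (cm):", round(golden_max(k_eff, 1.0, 2.5), 3))
    ```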

  17. Effects of stereotypes and suggestion on memory.

    Science.gov (United States)

    Shechory, Mally; Nachson, Israel; Glicksohn, Joseph

    2010-02-01

    In this study, the interactive effect of stereotype and suggestion on accuracy of memory was examined by presenting 645 participants (native Israelis and immigrants from the former Soviet Union and Ethiopia) with three versions of a story about a worker who is waiting in a manager's office for a meeting. All versions were identical except for the worker's name, which implied a Russian or an Ethiopian immigrant or a person of no ethnic origin. Each participant was presented with one version of the story. After an hour delay, the participants' memories were tested via two questionnaires that differed in terms of level of suggestion. Data analyses show that (a) when a suggestion matched the participant's stereotypical perception, the suggestion was incorporated into memory but (b) when the suggestion contradicted the stereotype, it did not influence memory. The conclusion was that recall is influenced by stereotypes but can be enhanced by compatible suggestions.

  18. Spectral Coefficient Analyses of Word-Initial Stop Consonant Productions Suggest Similar Anticipatory Coarticulation for Stuttering and Nonstuttering Adults.

    Science.gov (United States)

    Maruthy, Santosh; Feng, Yongqiang; Max, Ludo

    2018-03-01

    A longstanding hypothesis about the sensorimotor mechanisms underlying stuttering suggests that stuttered speech dysfluencies result from a lack of coarticulation. Formant-based measures of either the stuttered or fluent speech of children and adults who stutter have generally failed to obtain compelling evidence in support of the hypothesis that these individuals differ in the timing or degree of coarticulation. Here, we used a sensitive acoustic technique, spectral coefficient analyses, that allowed us to compare stuttering and nonstuttering speakers with regard to vowel-dependent anticipatory influences as early as the onset burst of a preceding voiceless stop consonant. Eight adults who stutter and eight matched adults who do not stutter produced C1VC2 words, and the first four spectral coefficients were calculated for one analysis window centered on the burst of C1 and two subsequent windows covering the beginning of the aspiration phase. Findings confirmed that the combined use of four spectral coefficients is an effective method for detecting the anticipatory influence of a vowel on the initial burst of a preceding voiceless stop consonant. However, the observed patterns of anticipatory coarticulation showed no statistically significant differences, or trends toward such differences, between the stuttering and nonstuttering groups. Combining the present results for fluent speech in one given phonetic context with prior findings from both stuttered and fluent speech in a variety of other contexts, we conclude that there is currently no support for the hypothesis that the fluent speech of individuals who stutter is characterized by limited coarticulation.
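    One common realization of "the first four spectral coefficients" is the set of four spectral moments (centroid, spread, skewness, kurtosis); whether this matches the study's exact definition, window length, weighting, or sampling rate is an assumption made only for this minimal sketch:

    ```python
    # Spectral-moment sketch for a short analysis window around a stop burst
    # (generic parameters; not the study's exact analysis settings).
    import numpy as np

    def spectral_moments(window, fs):
        """First four moments of the magnitude spectrum of a windowed signal."""
        spectrum = np.abs(np.fft.rfft(window * np.hamming(len(window))))
        freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
        p = spectrum / spectrum.sum()                # treat spectrum as a distribution
        m1 = np.sum(freqs * p)                       # centroid
        m2 = np.sqrt(np.sum((freqs - m1) ** 2 * p))  # spread (SD)
        m3 = np.sum(((freqs - m1) / m2) ** 3 * p)    # skewness
        m4 = np.sum(((freqs - m1) / m2) ** 4 * p)    # kurtosis
        return m1, m2, m3, m4

    # toy burst-like signal: 5 ms of noise at 22.05 kHz
    fs = 22050
    burst = np.random.default_rng(2).normal(size=int(0.005 * fs))
    print(spectral_moments(burst, fs))
    ```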

  19. Microarray and bioinformatic analyses suggest models for carbon metabolism in the autotroph Acidithiobacillus ferrooxidans

    Energy Technology Data Exchange (ETDEWEB)

    C. Appia-ayme; R. Quatrini; Y. Denis; F. Denizot; S. Silver; F. Roberto; F. Veloso; J. Valdes; J. P. Cardenas; M. Esparza; O. Orellana; E. Jedlicki; V. Bonnefoy; D. Holmes

    2006-09-01

    Acidithiobacillus ferrooxidans is a chemolithoautotrophic bacterium that uses iron or sulfur as an energy and electron source. Bioinformatic analysis was used to identify putative genes and potential metabolic pathways involved in CO2 fixation, 2P-glycolate detoxification, carboxysome formation and glycogen utilization in At. ferrooxidans. Microarray transcript profiling was carried out to compare the relative expression of the predicted genes of these pathways when the microorganism was grown in the presence of iron versus sulfur. Several gene expression patterns were confirmed by real-time PCR. Genes for each of the above predicted pathways were found to be organized into discrete clusters. Clusters exhibited differential gene expression depending on the presence of iron or sulfur in the medium. Concordance of gene expression within each cluster suggested that the clusters are operons. Most notably, clusters of genes predicted to be involved in CO2 fixation, carboxysome formation, 2P-glycolate detoxification and glycogen biosynthesis were up-regulated in sulfur medium, whereas genes involved in glycogen utilization were preferentially expressed in iron medium. These results can be explained in terms of models of gene regulation that suggest how At. ferrooxidans can adjust its central carbon management to respond to changing environmental conditions.
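
    The transcript-profiling step above contrasts expression in iron- versus sulfur-grown cultures. The study's array platform, normalization, and statistics are not specified in this record, so the following is only an illustrative sketch: per-gene log2 fold changes and Welch t-tests on invented replicate intensities, with placeholder gene names drawn from the pathways mentioned.

        import numpy as np
        from scipy import stats

        # Hypothetical normalized intensities: rows = genes, columns = replicates.
        rng = np.random.default_rng(0)
        iron = rng.lognormal(mean=8.0, sigma=0.3, size=(4, 3))     # 4 genes x 3 reps
        sulfur = rng.lognormal(mean=8.4, sigma=0.3, size=(4, 3))
        genes = ["cbbL", "csoS1", "glgA", "glgP"]                  # placeholder names

        for g, fe, s in zip(genes, iron, sulfur):
            log2fc = np.log2(s.mean() / fe.mean())                 # sulfur vs iron
            t, p = stats.ttest_ind(np.log2(s), np.log2(fe), equal_var=False)
            direction = "up in sulfur" if log2fc > 0 else "up in iron"
            print(f"{g}: log2FC={log2fc:+.2f} ({direction}), p={p:.3f}")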

  20. Computational Analyses of Complex Flows with Chemical Reactions

    Science.gov (United States)

    Bae, Kang-Sik

    Micro-scale heat and mass transfer phenomena have been studied numerically for three problems: drug mass transfer in a cylindrical matrix system, the simulation of oxygen/drug diffusion in a three-dimensional capillary network, and reduced chemical kinetic modeling of gas turbine combustion for Jet Propellant-10. For the numerical analysis of drug mass transfer in the cylindrical matrix system, the governing equations are derived for the Krogh cylinder model, in which a capillary is surrounded by a cylinder of tissue extending along the capillary from the arterial to the venous end. The ADI (Alternating Direction Implicit) scheme and the Thomas algorithm are applied to solve the nonlinear partial differential equations (PDEs). This study shows that the important factors affecting the drug penetration depth into the tissue are the mass diffusivity and the consumption of the relevant species during the time allowed for diffusion into the brain tissue. Also, a computational fluid dynamics (CFD) model has been developed to simulate the blood flow and oxygen/drug diffusion in a three-dimensional capillary network within the physiological range of a typical capillary. A three-dimensional geometry has been constructed to replicate the one studied by Secomb et al. (2000), and the computational framework features a non-Newtonian viscosity model for blood, an oxygen transport model including oxygen-hemoglobin dissociation and wall flux due to tissue absorption, as well as an ability to study the diffusion of drugs and other materials in the capillary streams. Finally, a chemical kinetic mechanism of JP-10 has been compiled and validated for a wide range of combustion regimes, covering pressures of 1 atm to 40 atm and temperatures of 1,200 K to 1,700 K; JP-10 is being studied as a possible jet propellant for the Pulse Detonation Engine (PDE) and other high-speed flight applications such as hypersonic
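
    The abstract names the ADI scheme together with the Thomas algorithm for the tridiagonal systems that arise in each implicit sweep. Below is a minimal, self-contained sketch of the standard textbook Thomas algorithm (not the author's code); the example coefficients correspond to a hypothetical one-dimensional implicit diffusion step.

        import numpy as np

        def thomas(a, b, c, d):
            """Solve a tridiagonal system with sub-diagonal a, diagonal b,
            super-diagonal c and right-hand side d (all length n; a[0] and
            c[-1] are unused). Returns the solution vector x."""
            n = len(d)
            cp = np.zeros(n)
            dp = np.zeros(n)
            cp[0] = c[0] / b[0]
            dp[0] = d[0] / b[0]
            for i in range(1, n):
                denom = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / denom if i < n - 1 else 0.0
                dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
            x = np.zeros(n)
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x

        # Example: one implicit 1-D diffusion step, as in a single ADI sweep.
        n, r = 5, 0.5                      # r = D*dt/dx**2 (hypothetical)
        a = np.full(n, -r); b = np.full(n, 1 + 2 * r); c = np.full(n, -r)
        d = np.ones(n)                     # previous-time-step concentrations
        print(thomas(a, b, c, d))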

  1. Theoretical and Computational Analyses of Bernoulli Levitation Flows

    International Nuclear Information System (INIS)

    Nam, Jong Soon; Kim, Gyu Wan; Kim, Jin Hyeon; Kim, Heuy Dong

    2013-01-01

    Pneumatic levitation is based upon Bernoulli's principle. However, this method is known to require a large gas flow rate, which can increase the cost of products. When the gas flow rate is increased, the compressible effects of the gas may become of practical importance. In the present study, a computational fluid dynamics method has been used to obtain insights into Bernoulli levitation flows. Three-dimensional compressible Navier-Stokes equations in combination with the SST k-ω turbulence model were solved using a fully implicit finite volume scheme. The gas flow rate, work piece diameter, and clearance gap between the work piece and the circular cylinder were varied to investigate the flow characteristics inside. It is found that there is an optimal clearance gap for the lifting force and that increasing the supply gas flow rate results in a larger lifting force

  2. Theoretical and Computational Analyses of Bernoulli Levitation Flows

    Energy Technology Data Exchange (ETDEWEB)

    Nam, Jong Soon; Kim, Gyu Wan; Kim, Jin Hyeon; Kim, Heuy Dong [Andong Nat' l Univ., Andong (Korea, Republic of)

    2013-07-15

    Pneumatic levitation is based upon Bernoulli's principle. However, this method is known to require a large gas flow rate, which can increase the cost of products. When the gas flow rate is increased, the compressible effects of the gas may become of practical importance. In the present study, a computational fluid dynamics method has been used to obtain insights into Bernoulli levitation flows. Three-dimensional compressible Navier-Stokes equations in combination with the SST k-ω turbulence model were solved using a fully implicit finite volume scheme. The gas flow rate, work piece diameter, and clearance gap between the work piece and the circular cylinder were varied to investigate the flow characteristics inside. It is found that there is an optimal clearance gap for the lifting force and that increasing the supply gas flow rate results in a larger lifting force.

  3. Mitochondrial genome analyses suggest multiple Trichuris species in humans, baboons, and pigs from different geographical regions

    DEFF Research Database (Denmark)

    Hawash, Mohamed B. F.; Andersen, Lee O.; Gasser, Robin B.

    2015-01-01

    BACKGROUND: The whipworms Trichuris trichiura and Trichuris suis are two parasitic nematodes of humans and pigs, respectively. Although whipworms in human and non-human primates historically have been referred to as T. trichiura, recent reports suggest that several Trichuris spp. are found... Trichuris from François' leaf monkey, suggesting multiple whipworm species circulating among non-human primates. The genetic and protein distances between pig Trichuris from Denmark and other regions were roughly 9% and 6%, respectively, while Chinese and Ugandan whipworms were more closely related..., suggesting that they represented different species. Trichuris from the olive baboon in the US was genetically related to human Trichuris in China, while the other from the hamadryas baboon in Denmark was nearly identical to human Trichuris from Uganda. Baboon-derived Trichuris was genetically distinct from...

  4. Development and application of computer codes for multidimensional thermalhydraulic analyses of nuclear reactor components

    International Nuclear Information System (INIS)

    Carver, M.B.

    1983-01-01

    Components of reactor systems and related equipment are identified in which multidimensional computational thermal hydraulics can be used to advantage to assess and improve design. Models of single- and two-phase flow are reviewed, and the governing equations for multidimensional analysis are discussed. Suitable computational algorithms are introduced, and sample results from the application of particular multidimensional computer codes are given

  5. Bayesian analyses of Yemeni mitochondrial genomes suggest multiple migration events with Africa and Western Eurasia.

    Science.gov (United States)

    Vyas, Deven N; Kitchen, Andrew; Miró-Herrans, Aida T; Pearson, Laurel N; Al-Meeri, Ali; Mulligan, Connie J

    2016-03-01

    Anatomically modern humans are thought to have migrated out of Africa ∼60,000 years ago in the first successful global dispersal. This initial migration may have passed through Yemen, a region that has experienced multiple migration events with Africa and Eurasia throughout human history. We use Bayesian phylogenetics to determine how ancient and recent migrations have shaped Yemeni mitogenomic variation. We sequenced 113 mitogenomes from multiple Yemeni regions with a focus on haplogroups M, N, and L3(xM,N) as these groups have the oldest evolutionary history outside of Africa. We performed Bayesian evolutionary analyses to generate time-measured phylogenies calibrated by Neanderthal and Denisovan mitogenomes in order to determine the age of Yemeni-specific clades. As defined by Yemeni monophyly, Yemeni in situ evolution is limited to the Holocene or latest Pleistocene (ages of clades in subhaplogroups L3b1a1a, L3h2, L3x1, M1a1f, M1a5, N1a1a3, and N1a3 range from 2 to 14 kya) and is often situated within broader Horn of Africa/southern Arabia in situ evolution (L3h2, L3x1, M1a1f, M1a5, and N1a1a3 ages range from 7 to 29 kya). Five subhaplogroups show no monophyly and are candidates for Holocene migration into Yemen (L0a2a2a, L3d1a1a, L3i2, M1a1b, and N1b1a). Yemeni mitogenomes are largely the product of Holocene migration, and subsequent in situ evolution, from Africa and western Eurasia. However, we hypothesize that recent population movements may obscure the genetic signature of more ancient migrations. Additional research, e.g., analyses of Yemeni nuclear genetic data, is needed to better reconstruct the complex population and migration histories associated with Out of Africa. © 2015 Wiley Periodicals, Inc.

  6. Computed tomography characteristics suggestive of spontaneous resolution of chronic subdural hematoma

    Energy Technology Data Exchange (ETDEWEB)

    Horikoshi, Toru; Naganuma, Hirofumi; Fukasawa, Isao; Uchida, Mikito; Nukui, Hideaki [Yamanashi Medical Univ., Tamaho (Japan)

    1998-09-01

    The clinical and radiological characteristics of self-resolving hematoma were assessed retrospectively in a series of patients with chronic subdural hematomas (SDHs) treated over a recent 6-year period in a local hospital. Spontaneous resolution was observed in five of 27 hematomas occurring in four of 23 patients. Clinical and radiological findings of the four cases were compared to those of the remaining 19 cases. All spontaneously resolving SDHs were asymptomatic or only caused mild transient headache, and disappeared within 4 to 9 months after head injury. All spontaneously resolving SDHs were located in the frontal region, and maximum thickness and midline displacement were less than those in the other 19 patients who were symptomatic and underwent surgery. Computed tomography demonstrated a low density line between the hematoma and the cerebral cortex, indicative of remaining cerebrospinal fluid space in four of five hematomas. Spontaneously resolving SDH is more frequent than formerly expected. Asymptomatic SDHs localized in the frontal region with small mass signs can be expected to disappear spontaneously without deterioration. (author)

  7. Evaluation of the diagnostic accuracy of four-view radiography and conventional computed tomography analysing sacral and pelvic fractures in dogs.

    Science.gov (United States)

    Stieger-Vanegas, S M; Senthirajah, S K J; Nemanic, S; Baltzer, W; Warnock, J; Bobe, G

    2015-01-01

    The purpose of our study was (1) to determine whether four-view radiography of the pelvis is as reliable and accurate as computed tomography (CT) in diagnosing sacral and pelvic fractures, in addition to coxofemoral and sacroiliac joint subluxation or luxation, and (2) to evaluate the effect of the amount of training in reading diagnostic imaging studies on the accuracy of diagnosing sacral and pelvic fractures in dogs. Sacral and pelvic fractures were created in 11 canine cadavers using a lateral impactor. In all cadavers, frog-legged ventro-dorsal, lateral, right and left ventro-45°-medial to dorsolateral oblique frog leg ("rollover 45-degree view") radiographs and a CT of the pelvis were obtained. Two radiologists, two surgeons and two veterinary students classified fractures using a confidence scale and noted the duration of evaluation for each imaging modality and case. The imaging results were compared to gross dissection. All evaluators required significantly more time to analyse CT images compared to radiographic images. Sacral and pelvic fractures, specifically those of the sacral body, ischiatic table, and the pubic bone, were more accurately diagnosed using CT compared to radiography. Fractures of the acetabulum and iliac body were diagnosed with similar accuracy (at least 86%) using either modality. Computed tomography is a better method for detecting canine sacral and some pelvic fractures compared to radiography. Computed tomography provided an accuracy of close to 100% in persons trained in evaluating CT images.
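
    The accuracy comparisons above can be expressed as standard 2x2-table statistics against gross dissection as the reference. An illustrative sketch with invented counts (not the study's data):

        def diagnostic_performance(tp, fp, fn, tn):
            """Accuracy, sensitivity, and specificity from a 2x2 table
            of modality reading vs. the gross-dissection reference."""
            accuracy = (tp + tn) / (tp + fp + fn + tn)
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            return accuracy, sensitivity, specificity

        # Invented counts for one fracture site, one reader, one modality.
        print(diagnostic_performance(tp=18, fp=1, fn=2, tn=21))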

  8. Mathematical and computational analyses of cracking formation fracture morphology and its evolution in engineering materials and structures

    CERN Document Server

    Sumi, Yoichi

    2014-01-01

    This book is about the pattern formation and the evolution of crack propagation in engineering materials and structures, bridging mathematical analyses of cracks based on singular integral equations and the computational simulation of engineering design. The first two parts of this book focus on elasticity and fracture and provide the basis for discussions on fracture morphology and its numerical simulation, which may lead to simulation-based fracture control in engineering structures. Several design concepts are discussed for the prevention of fatigue and fracture in engineering structures, including safe-life design, fail-safe design, and damage-tolerant design. After starting with basic elasticity and fracture theories in parts one and two, this book focuses on the fracture morphology that develops due to the propagation of brittle cracks or fatigue cracks. In part three, the mathematical analysis of a curved crack is precisely described, based on the perturbation method. The stability theory of interactive ...

  9. Manual vs. computer-assisted sperm analysis: can CASA replace manual assessment of human semen in clinical practice?

    Science.gov (United States)

    Talarczyk-Desole, Joanna; Berger, Anna; Taszarek-Hauke, Grażyna; Hauke, Jan; Pawelczyk, Leszek; Jedrzejczak, Piotr

    2017-01-01

    The aim of the study was to check the quality of the computer-assisted sperm analysis (CASA) system in comparison to the reference manual method, as well as standardization of the computer-assisted semen assessment. The study was conducted between January and June 2015 at the Andrology Laboratory of the Division of Infertility and Reproductive Endocrinology, Poznań University of Medical Sciences, Poland. The study group consisted of 230 men who gave sperm samples for the first time in our center as part of an infertility investigation. The samples underwent manual and computer-assisted assessment of concentration, motility and morphology. A total of 184 samples were examined twice: manually, according to the 2010 WHO recommendations, and with CASA, using the program settings provided by the manufacturer. Additionally, 46 samples underwent two manual analyses and two computer-assisted analyses. The p-value of p ... CASA and manually. In the group of patients where all analyses with each method were performed twice on the same sample we found no significant differences between both assessments of the same sample, neither in the samples analyzed manually nor with CASA, although the standard deviation was higher in the CASA group. Our results suggest that computer-assisted sperm analysis requires further improvement for a wider application in clinical practice.
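
    The agreement checks described above (each sample assessed twice within a method, and the two methods compared on the same samples) reduce to paired statistics. A minimal sketch with invented concentration values, not the study's measurements:

        import numpy as np
        from scipy import stats

        # Hypothetical sperm concentrations (million/mL): each sample assessed
        # twice manually and twice with CASA (values are invented).
        manual = np.array([[52, 54], [31, 30], [78, 76], [45, 47], [60, 59]])
        casa   = np.array([[55, 49], [28, 33], [81, 74], [43, 48], [66, 58]])

        # Between-method comparison on the first assessment of each sample.
        t, p = stats.ttest_rel(manual[:, 0], casa[:, 0])
        print(f"manual vs. CASA: paired t={t:.2f}, p={p:.3f}")

        # Repeatability: spread of the duplicate assessments within each method.
        print("SD of duplicate differences, manual:",
              np.diff(manual, axis=1).std(ddof=1).round(2))
        print("SD of duplicate differences, CASA:",
              np.diff(casa, axis=1).std(ddof=1).round(2))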

  10. SCALE Graphical Developments for Improved Criticality Safety Analyses

    International Nuclear Information System (INIS)

    Barnett, D.L.; Bowman, S.M.; Horwedel, J.E.; Petrie, L.M.

    1999-01-01

    New computer graphic developments at Oak Ridge National Laboratory (ORNL) are being used to provide visualization of criticality safety models and calculational results as well as tools for criticality safety analysis input preparation. The purpose of this paper is to present the status of current development efforts to continue to enhance the SCALE (Standardized Computer Analyses for Licensing Evaluations) computer software system. Applications for criticality safety analysis in the areas of three-dimensional (3-D) model visualization, input preparation and execution via a graphical user interface (GUI), and two-dimensional (2-D) plotting of results are discussed

  11. The smallest cells pose the biggest problems: high-performance computing and the analysis of metagenome sequence data

    International Nuclear Information System (INIS)

    Edwards, R A

    2008-01-01

    New high-throughput DNA sequencing technologies have revolutionized how scientists study the organisms around us. In particular, microbiology - the study of the smallest, unseen organisms that pervade our lives - has embraced these new techniques to characterize and analyze the cellular constituents and use this information to develop novel tools, techniques, and therapeutics. So-called next-generation DNA sequencing platforms have resulted in huge increases in the amount of raw data that can be rapidly generated. Argonne National Laboratory developed the premier platform for the analysis of this new data (mg-rast) that is used by microbiologists worldwide. This paper uses the accounting from the computational analysis of more than 10,000,000,000 bp of DNA sequence data, describes an analysis of the advanced computational requirements, and suggests the level of analysis that will be essential as microbiologists move to understand how these tiny organisms affect our every day lives. The results from this analysis indicate that data analysis is a linear problem, but that most analyses are held up in queues. With sufficient resources, computations could be completed in a few hours for a typical dataset. These data also suggest execution times that delimit timely completion of computational analyses, and provide bounds for problematic processes

  12. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) examined the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  13. DNA-energetics-based analyses suggest additional genes in ...

    Indian Academy of Sciences (India)

    2012-06-25


  14. Metagenome-based diversity analyses suggest a strong locality signal for bacterial communities associated with oyster aquaculture farms in Ofunato Bay

    KAUST Repository

    Kobiyama, Atsushi

    2018-04-30

    Ofunato Bay, in Japan, is the home of buoy-and-rope-type oyster aquaculture activities. Since the oysters filter suspended materials and excrete organic matter into the seawater, bacterial communities residing in its vicinity may show dynamic changes depending on the oyster culture activities. We employed a shotgun metagenomic technique to study bacterial communities near oyster aquaculture facilities at the center of the bay (KSt. 2) and compared the results with those of two other localities far from the station, one to the northeast (innermost bay, KSt. 1) and the other to the southwest (bay entrance, KSt. 3). Seawater samples were collected every month from January to December 2015 from the surface (1 m) and deeper (8 or 10 m) layers of the three locations, and the sequentially filtered fraction on 0.2-μm membranes was sequenced on an Illumina MiSeq system. The acquired reads were uploaded to MG-RAST for KEGG functional abundance analysis, while taxonomic analyses at the phylum and genus levels were performed using MEGAN after parsing the BLAST output. Discrimination analyses were then performed using the ROC-AUC value of the cross validation, targeting the depth (shallow or deep), locality [(KSt. 1 + KSt. 2) vs. KSt. 3; (KSt. 1 + KSt. 3) vs. KSt. 2 or (KSt. 2 + KSt. 3) vs. KSt. 1] and seasonality (12 months). The matrix discrimination analysis of two adjacent continuous seasons by ROC-AUC, which was based on the datasets that originated from different depths, localities and months, showed the strongest discrimination signal on the taxonomy matrix at the phylum level for the datasets from July to August compared with those from September to June, while the KEGG matrix showed the strongest signal for the datasets from March to June compared with those from July to February. Then, the locality combination was subjected to the same ROC-AUC discrimination analysis, resulting in significant differences between KSt. 2 and KSt. 1 + KSt. 3
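
    The discrimination analyses above score how well a feature matrix (taxonomy or KEGG abundances) separates two groups of samples using cross-validated ROC-AUC. The classifier and exact setup are not given in this record, so the sketch below is a generic stand-in using scikit-learn with invented abundance data and group labels.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        # Hypothetical abundance matrix: rows = monthly samples, columns = taxa
        # (or KEGG categories); labels mark the locality contrast being tested,
        # e.g. KSt. 2 vs. KSt. 1 + KSt. 3.
        rng = np.random.default_rng(1)
        X = rng.poisson(lam=20, size=(24, 50)).astype(float)
        y = np.array([0, 1] * 12)                      # invented group labels

        X = X / X.sum(axis=1, keepdims=True)           # relative abundances
        auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                              cv=4, scoring="roc_auc")
        print("cross-validated ROC-AUC:", auc.round(2), "mean:", auc.mean().round(2))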

  15. Metagenome-based diversity analyses suggest a strong locality signal for bacterial communities associated with oyster aquaculture farms in Ofunato Bay

    KAUST Repository

    Kobiyama, Atsushi; Ikeo, Kazuho; Reza, Md. Shaheed; Rashid, Jonaira; Yamada, Yuichiro; Ikeda, Yuri; Ikeda, Daisuke; Mizusawa, Nanami; Sato, Shigeru; Ogata, Takehiko; Jimbo, Mitsuru; Kudo, Toshiaki; Kaga, Shinnosuke; Watanabe, Shiho; Naiki, Kimiaki; Kaga, Yoshimasa; Mineta, Katsuhiko; Bajic, Vladimir B.; Gojobori, Takashi; Watabe, Shugo

    2018-01-01

    Ofunato Bay, in Japan, is the home of buoy-and-rope-type oyster aquaculture activities. Since the oysters filter suspended materials and excrete organic matter into the seawater, bacterial communities residing in its vicinity may show dynamic changes depending on the oyster culture activities. We employed a shotgun metagenomic technique to study bacterial communities near oyster aquaculture facilities at the center of the bay (KSt. 2) and compared the results with those of two other localities far from the station, one to the northeast (innermost bay, KSt. 1) and the other to the southwest (bay entrance, KSt. 3). Seawater samples were collected every month from January to December 2015 from the surface (1 m) and deeper (8 or 10 m) layers of the three locations, and the sequentially filtered fraction on 0.2-μm membranes was sequenced on an Illumina MiSeq system. The acquired reads were uploaded to MG-RAST for KEGG functional abundance analysis, while taxonomic analyses at the phylum and genus levels were performed using MEGAN after parsing the BLAST output. Discrimination analyses were then performed using the ROC-AUC value of the cross validation, targeting the depth (shallow or deep), locality [(KSt. 1 + KSt. 2) vs. KSt. 3; (KSt. 1 + KSt. 3) vs. KSt. 2 or (KSt. 2 + KSt. 3) vs. KSt. 1] and seasonality (12 months). The matrix discrimination analysis of two adjacent continuous seasons by ROC-AUC, which was based on the datasets that originated from different depths, localities and months, showed the strongest discrimination signal on the taxonomy matrix at the phylum level for the datasets from July to August compared with those from September to June, while the KEGG matrix showed the strongest signal for the datasets from March to June compared with those from July to February. Then, the locality combination was subjected to the same ROC-AUC discrimination analysis, resulting in significant differences between KSt. 2 and KSt. 1 + KSt. 3

  16. The plant design analyser and its applications

    International Nuclear Information System (INIS)

    Whitmarsh-Everiss, M.J.

    1992-01-01

    Consideration is given to the history of computational methods for the non-linear dynamic analysis of plant behaviour. This is traced from analogue to hybrid computers. When these were phased out, simulation languages were used in batch mode and the interactive computational capabilities were lost. These have subsequently been recovered, using mainframe computing architecture in the context of small models, with the Prototype Plant Design Analyser. Given the development of parallel processing architectures, the restriction on model size can be lifted. This capability and the use of advanced workstations and graphics software has enabled an advanced interactive design environment to be developed. This system is generic and can be used, with suitable graphics development, to study the dynamics and control behaviour of any plant or system for minimum cost. Examples of past and possible future uses are identified. (author)

  17. Neurogenesis suggests independent evolution of opercula in serpulid polychaetes

    DEFF Research Database (Denmark)

    Brinkmann, Nora; Wanninger, Andreas

    2009-01-01

    BACKGROUND: The internal phylogenetic relationships of Annelida, one of the key lophotrochozoan lineages, are still heavily debated. Recent molecular analyses suggest that morphologically distinct groups, such as the polychaetes, are paraphyletic assemblages, thus questioning the homology...

  18. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach... Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  19. Adolescent computer use and alcohol use: what are the role of quantity and content of computer use?

    Science.gov (United States)

    Epstein, Jennifer A

    2011-05-01

    The purpose of this study was to examine the relationship between computer use and alcohol use among adolescents. In particular, the goal of the research was to determine how lifetime drinking and past-month drinking relate to quantity, as measured by amount of time on the computer (for school work and excluding school work), and to content, as measured by the frequency of a variety of activities on the internet (e.g., e-mail, searching for information, social networking, listening to/downloading music). Participants (aged 13-17 years and residing in the United States) were recruited via the internet to complete an anonymous survey online using a popular survey tool (N=270). Their average age was 16 and the sample was predominantly female (63% girls). A series of analyses was conducted with the computer use measures as dependent variables (hours on the computer per week for school work and excluding school work; various internet activities including e-mail, searching for information, social networking, listening to/downloading music), controlling for gender, age, academic performance and age of first computer use. Based on the results, past-month drinkers used the computer more hours per week excluding school work than those who did not. As expected, there were no differences in hours based on alcohol use for computer use for school work. Drinking also had relationships with more frequent social networking and listening to/downloading music. These findings suggest that both quantity and content of computer use were related to adolescent drinking. Copyright © 2010 Elsevier Ltd. All rights reserved.
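
    The analyses described above regress each computer-use measure on drinking status while controlling for gender, age, academic performance, and age of first computer use. A generic sketch of that kind of model with statsmodels follows; the variable names and data are invented, not the survey's.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Invented survey data mirroring the variables described in the abstract.
        rng = np.random.default_rng(3)
        n = 270
        df = pd.DataFrame({
            "hours_nonschool": rng.gamma(shape=2.0, scale=5.0, size=n),
            "past_month_drinker": rng.integers(0, 2, n),
            "female": rng.integers(0, 2, n),
            "age": rng.integers(13, 18, n),
            "gpa": rng.normal(3.0, 0.5, n),
            "age_first_computer": rng.integers(5, 13, n),
        })

        model = smf.ols("hours_nonschool ~ past_month_drinker + female + age"
                        " + gpa + age_first_computer", data=df).fit()
        print(model.params.round(2))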

  20. Quality assurance requirements for the computer software and safety analyses

    International Nuclear Information System (INIS)

    Husarecek, J.

    1992-01-01

    The requirements are given as placed on the development, procurement, maintenance, and application of software for the creation or processing of data during the design, construction, operation, repair, maintenance and safety-related upgrading of nuclear power plants. The verification and validation processes are highlighted, and the requirements put on the software documentation are outlined. The general quality assurance principles applied to safety analyses are characterized. (J.B.). 1 ref

  1. Interface between computational fluid dynamics (CFD) and plant analysis computer codes

    International Nuclear Information System (INIS)

    Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.

    1993-01-01

    Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will directly contribute to modeling and accuracy improvements throughout the plant system and should result in significant reduction of design conservatisms that have been applied to such analyses in the past

  2. RELAP5 thermal-hydraulic analyses of overcooling sequences in a pressurized water reactor

    International Nuclear Information System (INIS)

    Bolander, M.A.; Fletcher, C.D.; Davis, C.B.; Kullberg, C.M.; Stitt, B.D.; Waterman, M.E.; Burtt, J.D.

    1984-01-01

    In support of the Pressurized Thermal Shock Integration Study, sponsored by the United States Nuclear Regulatory Commission, the Idaho National Engineering Laboratory has performed analyses of overcooling transients using the RELAP5/MOD1.6 and MOD2.0 computer codes. These analyses were performed for the H.B. Robinson Unit 2 pressurized water reactor, which is a Westinghouse 3-loop design plant. Results of the RELAP5 analyses are presented. The capabilities of the RELAP5 computer code as a tool for analyzing integral plant transients requiring a detailed plant model, including complex trip logic and major control systems, are examined

  3. OECD-LOFT large break LOCA experiments: phenomenology and computer code analyses

    International Nuclear Information System (INIS)

    Brittain, I.; Aksan, S.N.

    1990-08-01

    Large break LOCA data from LOFT are a very important part of the world database. This paper describes the two double-ended cold leg break tests LP-02-6 and LP-LB-1 carried out within the OECD-LOFT Programme. Tests in LOFT were the first to show the importance of both bottom-up and top-down quenching during blowdown in removing stored energy from the fuel. These phenomena are discussed in detail, together with the related topics of the thermal performance of nuclear fuel and its simulation by electric fuel rod simulators, and the accuracy of cladding external thermocouples. The LOFT data are particularly important in the validation of integral thermal-hydraulics codes such as TRAC and RELAP5. Several OECD partner countries contributed analyses of the large break tests. Results of these analyses are summarised and some conclusions drawn. 32 figs., 3 tabs., 45 refs

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  5. Cloud Computing Law

    CERN Document Server

    Millard, Christopher

    2013-01-01

    This book is about the legal implications of cloud computing. In essence, ‘the cloud’ is a way of delivering computing resources as a utility service via the internet. It is evolving very rapidly with substantial investments being made in infrastructure, platforms and applications, all delivered ‘as a service’. The demand for cloud resources is enormous, driven by such developments as the deployment on a vast scale of mobile apps and the rapid emergence of ‘Big Data’. Part I of this book explains what cloud computing is and how it works. Part II analyses contractual relationships between cloud service providers and their customers, as well as the complex roles of intermediaries. Drawing on primary research conducted by the Cloud Legal Project at Queen Mary University of London, cloud contracts are analysed in detail, including the appropriateness and enforceability of ‘take it or leave it’ terms of service, as well as the scope for negotiating cloud deals. Specific arrangements for public sect...

  6. A Privacy-by-Design Contextual Suggestion System for Tourism

    NARCIS (Netherlands)

    Efraimidis, Pavlos; Drosatos, George; Arampatzis, Avi; Stamatelatos, Giorgos; Athanasiadis, Ioannis

    2016-01-01

    We focus on personal data generated by the sensors and through the everyday usage of smart devices and take advantage of these data to build a non-invasive contextual suggestion system for tourism. The system, which we call Pythia, exploits the computational capabilities of modern smart devices to

  7. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 1, Part 2: Control modules S1--H1; Revision 5

    International Nuclear Information System (INIS)

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system

  8. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 2, Part 3: Functional modules F16--F17; Revision 5

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  9. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 2, Part 3: Functional modules F16--F17; Revision 5

    International Nuclear Information System (INIS)

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system

  10. SCALE-4 [Standardized Computer Analyses for Licensing Evaluation]: An improved computational system for spent-fuel cask analysis

    International Nuclear Information System (INIS)

    Parks, C.V.

    1989-01-01

    The purpose of this paper is to provide specific information regarding improvements available with Version 4.0 of the SCALE system and discuss the future of SCALE within the current computing and regulatory environment. The emphasis focuses on the improvements in SCALE-4 over that available in SCALE-3. 10 refs., 1 fig., 1 tab

  11. SOCR Analyses: Implementation and Demonstration of a New Graphical Statistics Educational Toolkit

    Directory of Open Access Journals (Sweden)

    Annie Chu

    2009-04-01

    Full Text Available The web-based, Java-written SOCR (Statistical Online Computational Resource) tools have been utilized in many undergraduate and graduate level statistics courses for seven years now (Dinov 2006; Dinov et al. 2008b). It has been proven that these resources can successfully improve students' learning (Dinov et al. 2008b). Being first published online in 2005, SOCR Analyses is a somewhat new component and it concentrates on data modeling for both parametric and non-parametric data analyses with graphical model diagnostics. One of the main purposes of SOCR Analyses is to facilitate statistical learning for high school and undergraduate students. As we have already implemented SOCR Distributions and Experiments, SOCR Analyses and Charts fulfill the rest of a standard statistics curriculum. Currently, there are four core components of SOCR Analyses. Linear models included in SOCR Analyses are simple linear regression, multiple linear regression, one-way and two-way ANOVA. Tests for sample comparisons include the t-test in the parametric category. Some examples of SOCR Analyses in the non-parametric category are the Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, Kolmogorov-Smirnov test and Fligner-Killeen test. Hypothesis testing models include the contingency table, Friedman's test and Fisher's exact test. The last component of Analyses is a utility for computing sample sizes for the normal distribution. In this article, we present the design framework, computational implementation and the utilization of SOCR Analyses.
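
    SOCR Analyses itself is a Java web toolkit; the parametric and non-parametric procedures it bundles are standard, however, and two of the listed tests are shown below with SciPy on invented data purely for orientation. This sketch does not reproduce SOCR's implementation or interface.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        group_a = rng.normal(10.0, 2.0, 30)        # invented samples
        group_b = rng.normal(11.0, 2.0, 30)
        group_c = rng.normal(12.0, 2.0, 30)

        u, p_u = stats.ranksums(group_a, group_b)            # Wilcoxon rank-sum
        h, p_h = stats.kruskal(group_a, group_b, group_c)    # Kruskal-Wallis
        print(f"Wilcoxon rank-sum: stat={u:.2f}, p={p_u:.3f}")
        print(f"Kruskal-Wallis:    H={h:.2f}, p={p_h:.3f}")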

  12. Passive safety injection experiments and analyses (PAHKO)

    International Nuclear Information System (INIS)

    Tuunanen, J.

    1998-01-01

    PAHKO project involved experiments on the PACTEL facility and computer simulations of selected experiments. The experiments focused on the performance of Passive Safety Injection Systems (PSIS) of Advanced Light Water Reactors (ALWRs) in Small Break Loss-Of-Coolant Accident (SBLOCA) conditions. The PSIS consisted of a Core Make-up Tank (CMT) and two pipelines (Pressure Balancing Line, PBL, and Injection Line, IL). The examined PSIS worked efficiently in SBLOCAs although the flow through the PSIS stopped temporarily if the break was very small and the hot water filled the CMT. The experiments demonstrated the importance of the flow distributor in the CMT to limit rapid condensation. The project included validation of three thermal-hydraulic computer codes (APROS, CATHARE and RELAP5). The analyses showed the codes are capable to simulate the overall behaviour of the transients. The detailed analyses of the results showed some models in the codes still need improvements. Especially, further development of models for thermal stratification, condensation and natural circulation flow with small driving forces would be necessary for accurate simulation of the PSIS phenomena. (orig.)

  13. Computer code selection criteria for flow and transport code(s) to be used in undisturbed vadose zone calculations for TWRS environmental analyses

    International Nuclear Information System (INIS)

    Mann, F.M.

    1998-01-01

    The Tank Waste Remediation System (TWRS) is responsible for the safe storage, retrieval, and disposal of waste currently being held in 177 underground tanks at the Hanford Site. In order to successfully carry out its mission, TWRS must perform environmental analyses describing the consequences of tank contents leaking from tanks and associated facilities during the storage, retrieval, or closure periods and immobilized low-activity tank waste contaminants leaving disposal facilities. Because of the large size of the facilities and the great depth of the dry zone (known as the vadose zone) underneath the facilities, sophisticated computer codes are needed to model the transport of the tank contents or contaminants. This document presents the code selection criteria for those vadose zone analyses (a subset of the above analyses) where the hydraulic properties of the vadose zone are constant in time, the geochemical behavior of the contaminant-soil interaction can be described by simple models, and the geologic or engineered structures are complicated enough to require a two- or three-dimensional model. Thus, simple analyses would not need to use the fairly sophisticated codes which would meet the selection criteria in this document. Similarly, those analyses which involve complex chemical modeling (such as those analyses involving large tank leaks or those analyses involving the modeling of contaminant release from glass waste forms) are excluded. The analyses covered here are those where the movement of contaminants can be relatively simply calculated from the moisture flow. These code selection criteria are based on the information from the low-level waste programs of the US Department of Energy (DOE) and of the US Nuclear Regulatory Commission as well as experience gained in the DOE Complex in applying these criteria. Appendix table A-1 provides a comparison between the criteria in these documents and those used here. This document does not define the models (that

  14. Computer Access and Computer Use for Science Performance of Racial and Linguistic Minority Students

    Science.gov (United States)

    Chang, Mido; Kim, Sunha

    2009-01-01

    This study examined the effects of computer access and computer use on the science achievement of elementary school students, with focused attention on the effects for racial and linguistic minority students. The study used the Early Childhood Longitudinal Study (ECLS-K) database and conducted statistical analyses with proper weights and…

  15. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    Science.gov (United States)

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  16. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section a further focusing will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...

  17. Development of SAGE, A computer code for safety assessment analyses for Korean Low-Level Radioactive Waste Disposal

    International Nuclear Information System (INIS)

    Zhou, W.; Kozak, Matthew W.; Park, Joowan; Kim, Changlak; Kang, Chulhyung

    2002-01-01

    This paper describes a computer code, called SAGE (Safety Assessment Groundwater Evaluation), to be used for evaluation of the concept for low-level waste disposal in the Republic of Korea (ROK). The conceptual model in the code is focused on releases from a gradually degrading engineered barrier system to an underlying unsaturated zone, thence to a saturated groundwater zone. Doses can be calculated for several biosphere systems including drinking contaminated groundwater, and subsequent contamination of foods, rivers, lakes, or the ocean by that groundwater. The flexibility of the code will permit both generic analyses in support of design and site development activities, and straightforward modification to permit site-specific and design-specific safety assessments of a real facility as progress is made toward implementation of a disposal site. In addition, the code has been written to easily interface with more detailed codes for specific parts of the safety assessment. In this way, the code's capabilities can be significantly expanded as needed. The code has the capability to treat input parameters either deterministically or probabilistically. Parameter input is achieved through a user-friendly Graphical User Interface.

  18. Novel Algorithms for Astronomical Plate Analyses, Rene Hudec ...

    Indian Academy of Sciences (India)

    Czech Technical University in Prague, Faculty of Electrical Engineering, Technicka 2, Prague 6 ... Abstract: Powerful computers and dedicated software allow effective data mining and scientific analyses in astronomical plate archives. We...

  19. Suggestibility and suggestive modulation of the Stroop effect.

    Science.gov (United States)

    Kirsch, Irving

    2011-06-01

    Although the induction of a hypnotic state does not seem necessary for suggestive modulation of the Stroop effect, this important phenomenon has seemed to be dependent on the subject's level of hypnotic suggestibility. Raz and Campbell's (2011) study indicates that suggestion can modulate the Stroop effect substantially in very low suggestible subjects, as well as in those who are highly suggestible. This finding casts doubt on the presumed mechanism by which suggestive modulation is brought about. Research aimed at uncovering the means by which low suggestible individuals are able to modulate the Stroop effect would be welcome, as would assessment of this effect in moderately suggestible people. Copyright © 2010 Elsevier Inc. All rights reserved.

  20. Echinococcus cysticus of the liver - sonographic pattern suggestive of solid tumor

    International Nuclear Information System (INIS)

    Grosser, G.; Hauenstein, K.H.; Henke, W.

    1985-01-01

    In a patient with Hodgkin's disease, an intrahepatic echodense mass was diagnosed incidentally by ultrasonography. The sonographic pattern suggested a solid tumor. Despite negative or borderline serology, computed tomography established the diagnosis of echinococcus cysticus by documentation of one ''daughter'' cyst; this diagnosis was confirmed by surgery. The criteria of echinococcus cysticus in modern imaging methods like sonography and computed tomography are summarized, and the diagnostic value of various procedures, including the diagnostic approach in seronegative cases, is discussed. (orig.) [de

  1. Rapid Geometry Creation for Computer-Aided Engineering Parametric Analyses: A Case Study Using ComGeom2 for Launch Abort System Design

    Science.gov (United States)

    Hawke, Veronica; Gage, Peter; Manning, Ted

    2007-01-01

    ComGeom2, a tool developed to generate Common Geometry representation for multidisciplinary analysis, has been used to create a large set of geometries for use in a design study requiring analysis by two computational codes. This paper describes the process used to generate the large number of configurations and suggests ways to further automate the process and make it more efficient for future studies. The design geometry for this study is the launch abort system of the NASA Crew Launch Vehicle.
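
    The batch-generation process described above amounts to sweeping a small set of geometric parameters and emitting one configuration per combination. A schematic illustration of that pattern follows; the parameter names and values are invented, and ComGeom2's actual inputs are not shown here.

        from itertools import product

        # Invented design parameters for a launch-abort-system sweep.
        nose_lengths = [2.0, 2.5, 3.0]            # m
        tower_diameters = [0.8, 1.0]              # m
        flare_angles = [15, 20, 25]               # degrees

        configurations = [
            {"nose_length": n, "tower_diameter": d, "flare_angle": a}
            for n, d, a in product(nose_lengths, tower_diameters, flare_angles)
        ]
        print(len(configurations), "configurations, e.g.", configurations[0])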

  2. Chapter No.4. Safety analyses

    International Nuclear Information System (INIS)

    2002-01-01

    for NPP V-1 Bohunice and on review of the impact of the modelling of selected components on the results of safety analysis calculations (a sensitivity study for NPP Mochovce). In 2001 UJD joined a new European project, Alternative Approaches to the Safety Performance Indicators. The project is aimed at collecting information and determining approaches and recommendations for implementation of risk-oriented indicators, identification of the impact of safety culture and organisational culture on safety, and application of indicators to the needs of regulators and operators. Within the framework of the PHARE project, UJD participated in the task focused on severe accident mitigation for nuclear power plants with VVER-440/V213 units. The main results of the analyses of nuclear power plant responses to severe accidents were summarised, and the state of the analytical base built up in the past was evaluated within the project. Possible severe accident mitigation and preventive measures were proposed and their applicability to nuclear power plants with VVER-440/V213 units was investigated. The obtained results will be used in the assessment activities and accident management of UJD. UJD has also been involved in the EVITA project, which is part of the 5th EC Framework Programme. The project aims at validation of the European computer code ASTEC, dedicated to severe accident modelling. In 2001 the ASTEC computer code was tested on different platforms. The results of the testing are summarised in the technical report of the EC issued in September 2001. Further activities within this project focused on performing analyses of selected accident scenarios and comparing the obtained results with analyses performed with other computer codes. The work on the project will continue in 2002. In 2001, groundwork on establishing the Centre for Nuclear Safety in Central and Eastern Europe (CENS), which is to be seated in Bratislava, continued. The

  3. Suggestibility and signal detection performance in hallucination-prone students.

    Science.gov (United States)

    Alganami, Fatimah; Varese, Filippo; Wagstaff, Graham F; Bentall, Richard P

    2017-03-01

    Auditory hallucinations are associated with signal detection biases. We examine the extent to which suggestions influence performance on a signal detection task (SDT) in highly hallucination-prone and low hallucination-prone students. We also explore the relationship between trait suggestibility, dissociation and hallucination proneness. In two experiments, students completed on-line measures of hallucination proneness (the revised Launay-Slade Hallucination Scale; LSHS-R), trait suggestibility (Inventory of Suggestibility) and dissociation (Dissociative Experiences Scale-II). Students in the upper and lower tertiles of the LSHS-R performed an auditory SDT. Prior to the task, suggestions were made pertaining to the number of expected targets (Experiment 1, N = 60: high vs. low suggestions; Experiment 2, N = 62, no suggestion vs. high suggestion vs. no voice suggestion). Correlational and regression analyses indicated that trait suggestibility and dissociation predicted hallucination proneness. Highly hallucination-prone students showed a higher SDT bias in both studies. In Experiment 1, both bias scores were significantly affected by suggestions to the same degree. In Experiment 2, highly hallucination-prone students were more reactive to the high suggestion condition than the controls. Suggestions may affect source-monitoring judgments, and this effect may be greater in those who have a predisposition towards hallucinatory experiences.
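
    The signal detection measures referred to above are conventionally derived from hit and false-alarm rates. A minimal sketch of the standard d' and criterion (response bias) calculation, with invented counts rather than the study's data:

        from scipy.stats import norm

        def sdt_measures(hits, misses, false_alarms, correct_rejections):
            """Classical signal detection indices from trial counts.

            Uses a small correction so rates of 0 or 1 stay finite."""
            n_signal = hits + misses
            n_noise = false_alarms + correct_rejections
            hit_rate = (hits + 0.5) / (n_signal + 1)
            fa_rate = (false_alarms + 0.5) / (n_noise + 1)
            d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)              # sensitivity
            criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))   # bias
            return d_prime, criterion

        # Invented counts for one participant on the auditory SDT.
        print(sdt_measures(hits=32, misses=8, false_alarms=14, correct_rejections=26))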

  4. Evaluation of fracture mechanics analyses used in RPV integrity assessment regarding brittle fracture

    International Nuclear Information System (INIS)

    Moinereau, D.; Faidy, C.; Valeta, M.P.; Bhandari, S.; Guichard, D.

    1997-01-01

    In recent years, Electricite de France has conducted experimental and numerical research programmes in order to evaluate the fracture mechanics analyses used in nuclear reactor pressure vessel structural integrity assessment with regard to the risk of brittle fracture. These programmes included cleavage fracture tests on large scale cladded specimens containing subclad flaws, together with their interpretation by 2D and 3D numerical computations, and validation of finite element codes for pressurized thermal shock analyses. Four cladded specimens made of ferritic steel A508 C13 with stainless steel cladding, and containing shallow subclad flaws, have been tested in four point bending at very low temperature in order to obtain cleavage failure. The specimen failure was obtained in each case in base metal by cleavage fracture. These tests have been interpreted by two-dimensional and three-dimensional finite element computations using different fracture mechanics approaches (elastic analysis with specific plasticity corrections, elastic-plastic analysis, local approach to cleavage fracture). The failure of the specimens is conservatively predicted by the different analyses. The comparison between the elastic analyses and elastic-plastic analyses shows the conservatism of the specific plasticity corrections used in French RPV elastic analyses. Numerous finite element calculations have also been performed by EDF, CEA and Framatome in order to compare and validate several fracture mechanics post-processors implemented in the finite element programmes used in pressurized thermal shock analyses. This work includes two-dimensional numerical computations on specimens with different geometries and loadings. The comparisons show rather good agreement on the main results, allowing validation of the finite element codes and their post-processors. (author). 11 refs, 24 figs, 3 tabs

  5. Evaluation of fracture mechanics analyses used in RPV integrity assessment regarding brittle fracture

    Energy Technology Data Exchange (ETDEWEB)

    Moinereau, D [Electricite de France, Dept. MTC, Moret-sur-Loing (France); Faidy, C [Electricite de France, SEPTEN, Villeurbanne (France); Valeta, M P [Commisariat a l` Energie Atomique, Dept. DMT, Gif-sur-Yvette (France); Bhandari, S; Guichard, D [Societe Franco-Americaine de Constructions Atomiques (FRAMATOME), 92 - Paris-La-Defense (France)

    1997-09-01

    In recent years, Electricite de France has conducted experimental and numerical research programmes in order to evaluate the fracture mechanics analyses used in nuclear reactor pressure vessel structural integrity assessment with regard to the risk of brittle fracture. These programmes included cleavage fracture tests on large scale cladded specimens containing subclad flaws, together with their interpretation by 2D and 3D numerical computations, and validation of finite element codes for pressurized thermal shock analyses. Four cladded specimens made of ferritic steel A508 C13 with stainless steel cladding, and containing shallow subclad flaws, have been tested in four point bending at very low temperature in order to obtain cleavage failure. The specimen failure was obtained in each case in base metal by cleavage fracture. These tests have been interpreted by two-dimensional and three-dimensional finite element computations using different fracture mechanics approaches (elastic analysis with specific plasticity corrections, elastic-plastic analysis, local approach to cleavage fracture). The failure of the specimens is conservatively predicted by the different analyses. The comparison between the elastic analyses and elastic-plastic analyses shows the conservatism of the specific plasticity corrections used in French RPV elastic analyses. Numerous finite element calculations have also been performed by EDF, CEA and Framatome in order to compare and validate several fracture mechanics post-processors implemented in the finite element programmes used in pressurized thermal shock analyses. This work includes two-dimensional numerical computations on specimens with different geometries and loadings. The comparisons show rather good agreement on the main results, allowing validation of the finite element codes and their post-processors. (author). 11 refs, 24 figs, 3 tabs.

  6. DMINDA: an integrated web server for DNA motif identification and analyses.

    Science.gov (United States)

    Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying

    2014-07-01

    DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, which is accessible at http://csbl.bmb.uga.edu/DMINDA/. This web site is freely available to all users and there is no login requirement. This server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important for elucidating the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences, along with statistical scores for the predicted motifs derived from information extracted from a control set, (ii) scanning for instances of a query motif in provided genomic sequences, (iii) motif comparison and clustering of identified motifs, and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes, and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
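
    As a hedged illustration of the motif-scanning function (ii) described above, the sketch below scores a position weight matrix (PWM) against a DNA sequence and reports high-scoring instances; the motif, threshold and sequence are invented examples, and this is not DMINDA's own implementation.

        # Minimal sketch of PWM-based motif scanning (illustration only,
        # not DMINDA's code); the motif and threshold are made-up examples.
        import numpy as np

        BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

        def scan_motif(pwm, sequence, threshold):
            """Return (position, score) pairs where the log-odds score of the
            motif (pwm, shape 4 x width) meets or exceeds the threshold."""
            width = pwm.shape[1]
            hits = []
            for i in range(len(sequence) - width + 1):
                window = sequence[i:i + width]
                score = sum(pwm[BASES[b], j] for j, b in enumerate(window))
                if score >= threshold:
                    hits.append((i, round(score, 2)))
            return hits

        # Toy 4-bp motif (consensus ACGA) as log-odds against a uniform
        # background; rows are A, C, G, T.
        pwm = np.log2(np.array([
            [0.7, 0.1, 0.1, 0.7],
            [0.1, 0.7, 0.1, 0.1],
            [0.1, 0.1, 0.7, 0.1],
            [0.1, 0.1, 0.1, 0.1],
        ]) / 0.25)

        print(scan_motif(pwm, "TTACGAAACGATT", threshold=4.0))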

  7. Quantitative, steady-state properties of Catania's computational model of the operant reserve.

    Science.gov (United States)

    Berg, John P; McDowell, J J

    2011-05-01

    Catania (2005) found that a computational model of the operant reserve (Skinner, 1938) produced realistic behavior in initial, exploratory analyses. Although Catania's operant reserve computational model demonstrated potential to simulate varied behavioral phenomena, the model was not systematically tested. The current project replicated and extended the Catania model, clarified its capabilities through systematic testing, and determined the extent to which it produces behavior corresponding to matching theory. Significant departures from both classic and modern matching theory were found in behavior generated by the model across all conditions. The results suggest that a simple, dynamic operant model of the reflex reserve does not simulate realistic steady state behavior. Copyright © 2011 Elsevier B.V. All rights reserved.
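
    For orientation, the classic and generalized ("modern") forms of the matching law that the model's behavior was compared against can be written as follows; these are the standard textbook formulations, not equations reproduced from the paper.

        \frac{B_1}{B_1 + B_2} = \frac{r_1}{r_1 + r_2} \qquad \text{(classic matching)}

        \log\!\left(\frac{B_1}{B_2}\right) = a\,\log\!\left(\frac{r_1}{r_2}\right) + \log b \qquad \text{(generalized matching)}

    Here B_1 and B_2 are response rates on two alternatives, r_1 and r_2 the obtained reinforcement rates, a the sensitivity parameter and b the bias parameter.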

  8. How to analyse a Big Bang of data: the mammoth project at the Cern physics laboratory in Geneva to recreate the conditions immediately after the universe began requires computing power on an unprecedented scale

    CERN Multimedia

    Thomas, Kim

    2005-01-01

    How to analyse a Big Bang of data: the mammoth project at the Cern physics laboratory in Geneva to recreate the conditions immediately after the universe began requires computing power on an unprecedented scale

  9. An Overview of Computer security

    OpenAIRE

    Annam, Shireesh Reddy

    2001-01-01

    As more business activities are being automated and an increasing number of computers are being used to store vital and sensitive information, the need for secure computer systems becomes more apparent. These systems can be achieved only through systematic design; they cannot be achieved through haphazard seat-of-the-pants methods. This paper introduces some known threats to computer security, categorizes the threats, and analyses protection mechanisms and techniques for countering the thre...

  10. Echinococcus cysticus of the liver - sonographic pattern suggestive of solid tumor

    Energy Technology Data Exchange (ETDEWEB)

    Grosser, G.; Hauenstein, K.H.; Henke, W.

    1985-09-01

    In a patient with Hodgkin's disease, an intrahepatic echodense mass was diagnosed incidentally by ultrasonography. The sonographic pattern suggested a solid tumor. Despite negative or borderline serology, computed tomography established the diagnosis of echinococcus cysticus by documentation of one 'daughter' cyst; this diagnosis was confirmed by surgery. The criteria of echinococcus cysticus in modern imaging methods such as sonography and computed tomography are summarized, and the diagnostic value of various procedures, including the diagnostic approach in seronegative cases, is discussed.

  11. Analyses of MHD instabilities

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki

    1985-01-01

    In this article, analyses of the MHD instabilities which govern the global behavior of a fusion plasma are described from the viewpoint of numerical computation. First, we describe the high-accuracy calculation of the MHD equilibrium and then the analysis of linear MHD instability. The former is the basis of the stability analysis, and the latter is closely related to the limiting beta value, which is a very important theoretical issue in tokamak research. To attain a stable tokamak plasma with good confinement properties it is necessary to control or suppress disruptive instabilities. We next describe the nonlinear MHD instabilities related to the disruption phenomena. Lastly, we describe vectorization of the MHD codes. The above MHD codes for fusion plasma analyses are relatively simple though very time-consuming, and the parts that need a lot of CPU time are concentrated in a small portion of the codes; moreover, the codes are usually used by their developers themselves. This makes it comparatively easy to attain a high performance ratio on the vector processor. (author)

  12. Prodeto, a computer code for probabilistic fatigue design

    Energy Technology Data Exchange (ETDEWEB)

    Braam, H [ECN-Solar and Wind Energy, Petten (Netherlands); Christensen, C J; Thoegersen, M L [Risoe National Lab., Roskilde (Denmark); Ronold, K O [Det Norske Veritas, Hoevik (Norway)

    1999-03-01

    A computer code for structural reliability analyses of wind turbine rotor blades subjected to fatigue loading is presented. With pre-processors that can transform measured and theoretically predicted load series into load range distributions by rain-flow counting, and with a family of generic distribution models for parametric representation of these distributions, this computer program can be used for carrying out probabilistic fatigue analyses of rotor blades. (au)
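
    As a rough illustration of the pre-processing idea described above (rain-flow counted load ranges represented by a parametric distribution), the sketch below fits a two-parameter Weibull model to a set of load ranges; the data and threshold are invented, and this is not Prodeto's own fitting routine.

        # Sketch only: fit a two-parameter Weibull distribution to rain-flow
        # counted load ranges, one of the generic distribution models a
        # probabilistic fatigue code might use. Not Prodeto's actual routine.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical rain-flow counted stress ranges (MPa) from a load series.
        load_ranges = rng.weibull(1.8, size=2000) * 45.0

        # Fix the location parameter at zero so only shape and scale are fitted.
        shape, loc, scale = stats.weibull_min.fit(load_ranges, floc=0)
        print(f"Weibull shape k = {shape:.2f}, scale A = {scale:.1f} MPa")

        # Probability that a load range exceeds a hypothetical design threshold.
        p_exceed = stats.weibull_min.sf(100.0, shape, loc, scale)
        print(f"P(range > 100 MPa) = {p_exceed:.2e}")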

  13. Graphic-based musculoskeletal model for biomechanical analyses and animation.

    Science.gov (United States)

    Chao, Edmund Y S

    2003-04-01

    The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system will impact on medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.

  14. LHC Computing Grid Project Launches into Action with International Support. A thousand times more computing power by 2006

    CERN Multimedia

    2001-01-01

    The first phase of the LHC Computing Grid project was approved at an extraordinary meeting of the Council on 20 September 2001. CERN is preparing for the unprecedented avalanche of data that will be produced by the Large Hadron Collider experiments. A thousand times more computer power will be needed by 2006! CERN's need for a dramatic advance in computing capacity is urgent. As from 2006, the four giant detectors observing trillions of elementary particle collisions at the LHC will accumulate over ten million Gigabytes of data, equivalent to the contents of about 20 million CD-ROMs, each year of its operation. A thousand times more computing power will be needed than is available to CERN today. The strategy the collaborations have adopted to analyse and store this unprecedented amount of data is the coordinated deployment of Grid technologies at hundreds of institutes which will be able to search out and analyse information from an interconnected worldwide grid of tens of thousands of computers and storag...

  15. Computer architecture technology trends

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. This year's edition of Computer Architecture Technology Trends analyses the trends which are taking place in the architecture of computing systems today. Due to the sheer number of different applications to which computers are being applied, there seems no end to the different adoptions which proliferate. There are, however, some underlying trends which appear. Decision makers should be aware of these trends when specifying architectures, particularly for future applications. This report is fully revised and updated and provides insight in

  16. On teaching computer ethics within a computer science department.

    Science.gov (United States)

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  17. A Versatile Software Package for Inter-subject Correlation Based Analyses of fMRI

    Directory of Open Access Journals (Sweden)

    Jukka-Pekka eKauppi

    2014-01-01

    Full Text Available In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modelling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting a widespread use of ISC based analysis techniques among the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate the possible use of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.

  18. A versatile software package for inter-subject correlation based analyses of fMRI.

    Science.gov (United States)

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting a widespread use of ISC based analysis techniques among the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate the possible use of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
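
    The core ISC computation described in these two records reduces to averaging the pairwise correlations of subjects' time series at each voxel; a minimal numpy sketch under that assumption is given below (the ISC Toolbox itself is implemented in Matlab, and this is not its code).

        # Minimal sketch of inter-subject correlation (ISC): for each voxel,
        # average the pairwise Pearson correlations between subjects' fMRI
        # time series. Illustration only; the ISC Toolbox itself is in Matlab.
        import numpy as np
        from itertools import combinations

        def isc_map(data):
            """data: array of shape (n_subjects, n_timepoints, n_voxels).
            Returns an array of length n_voxels with the mean pairwise ISC."""
            n_subjects, _, n_voxels = data.shape
            pairs = list(combinations(range(n_subjects), 2))
            isc = np.zeros(n_voxels)
            for v in range(n_voxels):
                r_sum = 0.0
                for i, j in pairs:
                    r_sum += np.corrcoef(data[i, :, v], data[j, :, v])[0, 1]
                isc[v] = r_sum / len(pairs)
            return isc

        # Toy example: 5 subjects, 200 time points, 3 voxels sharing a signal.
        rng = np.random.default_rng(1)
        shared = rng.standard_normal(200)
        data = 0.6 * shared[None, :, None] + rng.standard_normal((5, 200, 3))
        print(isc_map(data))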

  19. MAAP - modular program for analyses of severe accidents

    International Nuclear Information System (INIS)

    Henry, R.E.; Lutz, R.J.

    1990-01-01

    The MAAP computer code was developed by Westinghouse as a fast, user-friendly, integrated analytical tool for evaluations of the sequences and consequences of severe accidents. The code allows a fully integrated treatment of thermohydraulic behavior and of the fission products in the primary system, the containment, and the ancillary buildings. This ensures interactive inclusion of all thermohydraulic events and of fission product behavior. All important phenomena which may occur in a major accident are contained in the modular code. In addition, many of the important parameters affecting the multitude of different phenomena can be defined by the user. In this way, it is possible to study the accuracy of the predicted course and of the consequences of a series of major accident phenomena. The MAAP code was subjected to extensive benchmarking against the results of experimental and theoretical programs, the findings of other computer-based safety analyses, and data from accidents and transients in operating plants. With the expected completion of the validation and test programs, the computer code will attain a quality standard meeting the most stringent requirements in safety analyses. The code will be enlarged further in order to expand the number of benchmarks and the resolution of individual comparisons, and to ensure that future MAAP models will be in better agreement with the experiments and experiences of industry. (orig.) [de

  20. Sensitivity of surface meteorological analyses to observation networks

    Science.gov (United States)

    Tyndall, Daniel Paul

    A computationally efficient variational analysis system for two-dimensional meteorological fields is developed and described. This analysis approach is most efficient when the number of analysis grid points is much larger than the number of available observations, such as for large domain mesoscale analyses. The analysis system is developed using MATLAB software and can take advantage of multiple processors or processor cores. A version of the analysis system has been exported as a platform independent application (i.e., can be run on Windows, Linux, or Macintosh OS X desktop computers without a MATLAB license) with input/output operations handled by commonly available internet software combined with data archives at the University of Utah. The impact of observation networks on the meteorological analyses is assessed by utilizing a percentile ranking of individual observation sensitivity and impact, which is computed by using the adjoint of the variational surface assimilation system. This methodology is demonstrated using a case study of the analysis from 1400 UTC 27 October 2010 over the entire contiguous United States domain. The sensitivity of this approach to the dependence of the background error covariance on observation density is examined. Observation sensitivity and impact provide insight on the influence of observations from heterogeneous observing networks as well as serve as objective metrics for quality control procedures that may help to identify stations with significant siting, reporting, or representativeness issues.
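
    For context, a two-dimensional variational analysis of this kind typically minimizes a cost function of the standard form below, and the adjoint-based observation sensitivity mentioned in the abstract follows from the gain matrix; this is generic data-assimilation notation, not equations taken from the dissertation.

        J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathrm T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b) + \tfrac{1}{2}(H\mathbf{x}-\mathbf{y})^{\mathrm T}\mathbf{R}^{-1}(H\mathbf{x}-\mathbf{y})

        \mathbf{x}_a = \mathbf{x}_b + \mathbf{K}(\mathbf{y}-H\mathbf{x}_b), \qquad \mathbf{K} = \mathbf{B}H^{\mathrm T}\left(H\mathbf{B}H^{\mathrm T}+\mathbf{R}\right)^{-1}

    Here x_b is the background field, y the observations, B and R the background and observation error covariances, and H the observation operator; the sensitivity of any scalar aspect J_a of the analysis to the observations is obtained from the adjoint of the gain, dJ_a/dy = K^T dJ_a/dx_a, which is the kind of quantity used to rank individual observations.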

  1. The issue of gender within computing

    DEFF Research Database (Denmark)

    Robertson, Maxine; Newell, Sue; Swan, Jacky

    2001-01-01

    This paper explores some of the reasons that may underlie the gender segregation and declining levels of female participation within the field of computing in Europe during the 1990s in both the professional (industrial) and academic spheres. The interrelationships between three areas - communicative processes, social networks and legitimizing claims to knowledge overlaid by gendered-power relations - are used to analyse and explain the existing situation. The paper draws upon statistical data to explore the extent of gender segregation and then focuses on the authors' own experiences within the UK and Scandinavia in order to explore some of the underlying causes. While direct discrimination does still occur, the paper suggests that indirect, deep-rooted discrimination is the major reason for the situation that currently exists. Drawing upon our own experiences in academia and business...

  2. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  3. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.; Bucholz, J.A.; Hermann, O.W.; Fraley, S.K. [Oak Ridge National Lab., TN (United States)

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    International Nuclear Information System (INIS)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.; Bucholz, J.A.; Hermann, O.W.; Fraley, S.K.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries

  5. Computational analysis of EBNA1 ``druggability'' suggests novel insights for Epstein-Barr virus inhibitor design

    Science.gov (United States)

    Gianti, Eleonora; Messick, Troy E.; Lieberman, Paul M.; Zauhar, Randy J.

    2016-04-01

    The Epstein-Barr Nuclear Antigen 1 (EBNA1) is a critical protein encoded by the Epstein-Barr Virus (EBV). During latent infection, EBNA1 is essential for DNA replication and transcription initiation of viral and cellular genes and is necessary to immortalize primary B-lymphocytes. Nonetheless, the concept of EBNA1 as drug target is novel. Two EBNA1 crystal structures are publicly available and the first small-molecule EBNA1 inhibitors were recently discovered. However, no systematic studies have been reported on the structural details of EBNA1 "druggable" binding sites. We conducted computational identification and structural characterization of EBNA1 binding pockets, likely to accommodate ligand molecules (i.e. "druggable" binding sites). Then, we validated our predictions by docking against a set of compounds previously tested in vitro for EBNA1 inhibition (PubChem AID-2381). Finally, we supported assessments of pocket druggability by performing induced fit docking and molecular dynamics simulations paired with binding affinity predictions by Molecular Mechanics Generalized Born Surface Area calculations for a number of hits belonging to druggable binding sites. Our results establish EBNA1 as a target for drug discovery, and provide the computational evidence that active AID-2381 hits disrupt EBNA1:DNA binding upon interacting at individual sites. Lastly, structural properties of top scoring hits are proposed to support the rational design of the next generation of EBNA1 inhibitors.
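
    The binding-affinity estimates mentioned above (MM-GBSA) follow the standard decomposition written below; this is the generic form, not the authors' specific parameterization.

        \Delta G_{\mathrm{bind}} \approx \langle \Delta E_{\mathrm{MM}} \rangle + \langle \Delta G_{\mathrm{GB}} \rangle + \langle \Delta G_{\mathrm{SA}} \rangle - T\Delta S

    Here ΔE_MM is the gas-phase molecular-mechanics energy, ΔG_GB and ΔG_SA are the polar (generalized Born) and non-polar (surface-area) solvation terms, and the entropic term TΔS is often omitted when the aim is only to rank candidate hits.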

  6. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  7. Systematic review and meta-analyses

    DEFF Research Database (Denmark)

    Dreier, Julie Werenberg; Andersen, Anne-Marie Nybo; Berg-Beckhoff, Gabriele

    2014-01-01

    1990 were excluded. RESULTS: The available literature supported an increased risk of adverse offspring health in association with fever during pregnancy. The strongest evidence was available for neural tube defects, congenital heart defects, and oral clefts, in which meta-analyses suggested between a 1...

  8. Partial-wave analyses of hadron scattering below 2 GeV

    International Nuclear Information System (INIS)

    Arndt, R.A.; Roper, L.D.

    1990-01-01

    The Center for Analysis of Particle Scattering (CAPS) in the Department of Physics at Virginia Polytechnic Institute and State University has analyzed basic two-body hadron reactions below 2 GeV for the last two decades. Reactions studied were nucleon-nucleon, pion-nucleon, K+-nucleon and pion photoproduction systems. In addition to analyses of these systems, a computer graphics system (SAID) has been developed and disseminated to over 200 research institutions using VAX computers. 8 refs

  9. Computer anxiety among university and college students majoring ...

    African Journals Online (AJOL)

    This study examined computer anxiety among university and college of education Physical and Health Education (PHE) majors. The influence of personal characteristics of gender, age and experience of PHE majors on computer anxiety level were analysed. The Computer Anxiety Scale (CAS) developed by Marcoulides ...

  10. Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

    ...or dead ends when he/she visits the site. Studies in the design and analysis of the visual and aesthetic aspects of planning and using websites have, however, only to a limited extent been the subject of reflective treatment. That is the background for this chapter, which opens with a review of aesthetics... The website is increasingly the preferred medium for information searching, company presentation, e-commerce, entertainment, teaching and social contact. In step with this growing diversity of communication activities on the web, more focus has been placed on optimizing the design and planning of the functional and content-related aspects of websites. There is a large body of theory and method books specializing in the technical issues of interaction and navigation, as well as the linguistic content of websites. The Danish HCI (Human Computer Interaction...

  11. Structural and functional cerebral correlates of hypnotic suggestibility.

    Directory of Open Access Journals (Sweden)

    Alexa Huber

    Full Text Available Little is known about the neural bases of hypnotic suggestibility, a cognitive trait referring to the tendency to respond to hypnotic suggestions. In the present magnetic resonance imaging study, we performed regression analyses to assess hypnotic suggestibility-related differences in local gray matter volume, using voxel-based morphometry, and in waking resting state functional connectivity of 10 resting state networks, in 37 healthy women. Hypnotic suggestibility was positively correlated with gray matter volume in portions of the left superior and medial frontal gyri, roughly overlapping with the supplementary and pre-supplementary motor area, and negatively correlated with gray matter volume in the left superior temporal gyrus and insula. In the functional connectivity analysis, hypnotic suggestibility was positively correlated with functional connectivity between medial posterior areas, including bilateral posterior cingulate cortex and precuneus, and both the lateral visual network and the left fronto-parietal network; a positive correlation was also found with functional connectivity between the executive-control network and a right postcentral/parietal area. In contrast, hypnotic suggestibility was negatively correlated with functional connectivity between the right fronto-parietal network and the right lateral thalamus. These findings demonstrate for the first time a correlation between hypnotic suggestibility, the structural features of specific cortical regions, and the functional connectivity during the normal resting state of brain structures involved in imagery and self-monitoring activity.

  12. Structural and functional cerebral correlates of hypnotic suggestibility.

    Science.gov (United States)

    Huber, Alexa; Lui, Fausta; Duzzi, Davide; Pagnoni, Giuseppe; Porro, Carlo Adolfo

    2014-01-01

    Little is known about the neural bases of hypnotic suggestibility, a cognitive trait referring to the tendency to respond to hypnotic suggestions. In the present magnetic resonance imaging study, we performed regression analyses to assess hypnotic suggestibility-related differences in local gray matter volume, using voxel-based morphometry, and in waking resting state functional connectivity of 10 resting state networks, in 37 healthy women. Hypnotic suggestibility was positively correlated with gray matter volume in portions of the left superior and medial frontal gyri, roughly overlapping with the supplementary and pre-supplementary motor area, and negatively correlated with gray matter volume in the left superior temporal gyrus and insula. In the functional connectivity analysis, hypnotic suggestibility was positively correlated with functional connectivity between medial posterior areas, including bilateral posterior cingulate cortex and precuneus, and both the lateral visual network and the left fronto-parietal network; a positive correlation was also found with functional connectivity between the executive-control network and a right postcentral/parietal area. In contrast, hypnotic suggestibility was negatively correlated with functional connectivity between the right fronto-parietal network and the right lateral thalamus. These findings demonstrate for the first time a correlation between hypnotic suggestibility, the structural features of specific cortical regions, and the functional connectivity during the normal resting state of brain structures involved in imagery and self-monitoring activity.

  13. Methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Mazumdar, M.; Marshall, J.A.; Chay, S.C.; Gay, R.

    1976-07-01

    In February 1975, Westinghouse Electric Corporation, under contract to Electric Power Research Institute, started a one-year program to develop methodology for statistical evaluation of nuclear-safety-related engineering analyses. The objectives of the program were to develop an understanding of the relative efficiencies of various computational methods which can be used to compute probability distributions of output variables due to input parameter uncertainties in analyses of design basis events for nuclear reactors and to develop methods for obtaining reasonably accurate estimates of these probability distributions at an economically feasible level. A series of tasks was set up to accomplish these objectives. Two of the tasks were to investigate the relative efficiencies and accuracies of various Monte Carlo and analytical techniques for obtaining such estimates for a simple thermal-hydraulic problem whose output variable of interest is given in a closed-form relationship of the input variables and to repeat the above study on a thermal-hydraulic problem in which the relationship between the predicted variable and the inputs is described by a short-running computer program. The purpose of the report presented is to document the results of the investigations completed under these tasks, giving the rationale for choices of techniques and problems, and to present interim conclusions
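
    The kind of computation being benchmarked in these tasks, propagating input-parameter uncertainties through a closed-form response, can be sketched as below; the response function and the input distributions are invented for illustration and are not taken from the report.

        # Sketch of simple Monte Carlo propagation of input uncertainties
        # through a closed-form response, in the spirit of the tasks above.
        # The response function and input distributions are purely illustrative.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Hypothetical uncertain inputs (e.g., a power peaking factor and a
        # normalized heat-transfer coefficient), with assumed distributions.
        f_q = rng.normal(loc=2.3, scale=0.05, size=n)
        h = rng.lognormal(mean=0.0, sigma=0.10, size=n)

        # Invented closed-form output: a peak temperature surrogate.
        t_peak = 600.0 + 120.0 * f_q / h

        print(f"mean = {t_peak.mean():.1f}, std = {t_peak.std():.1f}")
        print(f"95th percentile = {np.percentile(t_peak, 95):.1f}")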

  14. Partial-wave analyses of hadron scattering below 2 GeV

    International Nuclear Information System (INIS)

    Arndt, R.A.; Roper, L.D.

    1991-01-01

    The Center for Analysis of Particle Scattering (CAPS) in the Department of Physics at Virginia Polytechnic Institute and State University has analyzed basic two-body hadron reactions below 2 GeV for the last two decades. Reactions studied were nucleon-nucleon, pion-nucleon, K+-nucleon and pion photoproduction systems. In addition to analyses of these systems, a computer graphics system (SAID) has been developed and disseminated to over 250 research institutions using VAX computers. The computer-interactive system for disseminating information on basic scattering reactions is also accessible to the physics community through TELNET on the VPI&SU physics department VAX. 6 refs

  15. Two suggestions to "improve the utilization of ISABELLE"

    International Nuclear Information System (INIS)

    Thorndike, A.

    1976-01-01

    Two suggestions are outlined which are aimed at improving the efficiency of work in experimental areas by improving the information available to experimenters. A very good communication system would make work as efficient as possible during times when the beam is off. Here are some ideas: (1) it should be a dedicated system that is always on or can be turned on from either position, and it should be impossible for the two ends to be on different channels; (2) tv is desirable so each individual can watch what the other is doing to avoid confusion; (3) slave CRT units would be desirable so both can watch a given waveform or other test signal; and (4) it should also be possible to monitor key voltages from either location. It seems reasonable that beam information would be stored in one or more files on a disc in the on-line computer system. Each experimenter's computer could then get whatever information was desired. Handling the information by computer is straightforward, and more or less standard systems for data-base management should be applicable. Having satisfactory sensors to monitor the information that is needed seems like more of a problem, but they are required in any case

  16. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most
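
    The parametric and non-parametric comparisons listed above have direct analogues in common statistical libraries; the brief Python sketch below shows equivalent calls (SOCR Analyses itself is a Java toolkit, so this only illustrates what the analyses compute).

        # Python analogues of some tests implemented in SOCR Analyses (which is
        # itself written in Java); shown only to illustrate the computations.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        group_a = rng.normal(10.0, 2.0, size=30)
        group_b = rng.normal(11.0, 2.0, size=30)
        group_c = rng.normal(12.0, 2.0, size=30)

        print(stats.ttest_ind(group_a, group_b))          # two-sample t-test
        print(stats.ranksums(group_a, group_b))           # Wilcoxon rank-sum test
        print(stats.kruskal(group_a, group_b, group_c))   # Kruskal-Wallis test
        print(stats.f_oneway(group_a, group_b, group_c))  # one-way ANOVA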

  17. Age and interviewer behavior as predictors of interrogative suggestibility.

    Science.gov (United States)

    Dukala, Karolina; Polczyk, Romuald

    2014-05-01

    The main objective was to explore the influence of interviewer behavior (abrupt versus friendly) and the age of participants on interrogative suggestibility. The study involved 42 young adults and 50 elderly participants. The Gudjonsson Suggestibility Scale 2 was used. Data analysis involved a 2-factor between-subjects design (interviewer behavior × age) and mediation analysis. The scores of elderly participants were significantly lower than those of younger adults on memory indices and significantly higher on some suggestibility indices. Some suggestibility indices in the abrupt experimental condition were higher than those in the friendly experimental condition. Elderly participants who were interviewed under the abrupt condition were more likely to change their answers after receiving negative feedback than younger adults. Memory quality was a mediator of the relationship between age and the tendency to yield to suggestive questions. Self-appraisal of memory was a mediator between both age and interviewer behavior and the tendency to change answers after negative feedback. Mechanisms of the relationship between age, interviewer behavior, and suggestibility are discussed on the basis of the mediational analyses. The findings suggest that a friendly manner should be adopted when interrogating witnesses.

  18. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
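
    The central object T described above is, in essence, an average of the transition matrices of all networks in the class; the toy sketch below builds such a superposition for a handful of deterministic state-transition maps and computes the row-wise Shannon entropy. It is purely illustrative, uses made-up maps, and is not the authors' yeast cell cycle model.

        # Toy sketch of a network-class superposition matrix T: average the
        # deterministic transition matrices of the class members and compute
        # the Shannon entropy of each row. Illustration only.
        import numpy as np

        N_STATES = 4
        # Each class member is a map state -> next state (hypothetical examples).
        members = [
            {0: 1, 1: 2, 2: 3, 3: 0},
            {0: 1, 1: 2, 2: 0, 3: 0},
            {0: 2, 1: 2, 2: 3, 3: 3},
        ]

        T = np.zeros((N_STATES, N_STATES))
        for mapping in members:
            for state, next_state in mapping.items():
                T[state, next_state] += 1.0 / len(members)

        def row_entropy(p):
            nonzero = p[p > 0]
            return float(-np.sum(nonzero * np.log2(nonzero)))

        print(T)  # rows sum to 1: T is a stochastic superposition of the class
        print([round(row_entropy(T[s]), 3) for s in range(N_STATES)])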

  19. A Calculus for Modelling, Simulating and Analysing Compartmentalized Biological Systems

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Ihekwaba, Adoha

    2007-01-01

    A. Ihekwaba, R. Mardare. A Calculus for Modelling, Simulating and Analysing Compartmentalized Biological Systems. Case study: NFkB system. In Proc. of International Conference of Computational Methods in Sciences and Engineering (ICCMSE), American Institute of Physics, AIP Proceedings, N 2...

  20. How do trees grow? Response from the graphical and quantitative analyses of computed tomography scanning data collected on stem sections.

    Science.gov (United States)

    Dutilleul, Pierre; Han, Li Wen; Beaulieu, Jean

    2014-06-01

    Tree growth, as measured via the width of annual rings, is used for environmental impact assessment and climate back-forecasting. This fascinating natural process has been studied at various scales in the stem (from cell and fiber within a growth ring, to ring and entire stem) in one, two, and three dimensions. A new approach is presented to study tree growth in 3D from stem sections, at a scale sufficiently small to allow the delineation of reliable limits for annual rings and large enough to capture directional variation in growth rates. The technology applied is computed tomography scanning, which provides - for one stem section - millions of data points (indirect measures of wood density) that can be mapped, together with a companion measure of dispersion and growth ring limits in filigree. Graphical and quantitative analyses are reported for white spruce trees with circular vs non-circular growth. Implications for dendroclimatological research are discussed. Copyright © 2014 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  1. Safety analyses of the nuclear-powered ship Mutsu with RETRAN

    International Nuclear Information System (INIS)

    Naruko, Y.; Ishida, T.; Tanaka, Y.; Futamura, Y.

    1982-01-01

    To provide a quantitative basis for the safety evaluation of the N.S. Mutsu, a number of safety analyses were performed in the course of reexamination. With respect to operational transient analyses, the RETRAN computer code was used to predict plant performance on the basis of postulated transient scenarios. The COBRA-IV computer code was also used to obtain a value of the minimum DNBR for each transient, which is necessary to predict detailed thermal-hydraulic performance in the core region of the reactor. In the present paper, the following three operational transients, which were calculated as part of the safety analyses, are dealt with: a complete loss of load without reactor scram; an excessive load increase incident, which is followed by a 30 percent stepwise load increase in the steam dump flow; and an accidental depressurization of the primary system, which is followed by a sudden full opening of the pressurizer spray valve. A Mutsu two-loop RETRAN model and simulation results are described. By comparing the results with those of land-based PWRs, the characteristic features of the Mutsu reactor are presented and the safety of the plant under operational transient conditions is confirmed

  2. Preserving and reusing high-energy-physics data analyses

    CERN Document Server

    Simko, Tibor; Dasler, Robin; Fokianos, Pamfilos; Kuncar, Jiri; Lavasa, Artemis; Mattmann, Annemarie; Rodriguez, Diego; Trzcinska, Anna; Tsanaktsidis, Ioannis

    2017-01-01

    The revalidation, reuse and reinterpretation of data analyses require having access to the original virtual environments, datasets and software that were used to produce the original scientific result. The CERN Analysis Preservation pilot project is developing a set of tools that support particle physics researchers in preserving the knowledge around analyses so that capturing, sharing, reusing and reinterpreting data becomes easier. In this talk, we shall notably focus on the aspects of reusing a preserved analysis. We describe a system that makes it possible to instantiate the preserved analysis workflow on the computing cloud, paving the way for researchers to revalidate and reinterpret research data even many years after the original publication.

  3. Exergetic and thermoeconomic analyses of power plants

    International Nuclear Information System (INIS)

    Kwak, H.-Y.; Kim, D.-J.; Jeon, J.-S.

    2003-01-01

    Exergetic and thermoeconomic analyses were performed for a 500-MW combined cycle plant. In these analyses, mass and energy conservation laws were applied to each component of the system. Quantitative balances of the exergy and exergetic cost for each component, and for the whole system, were carefully considered. The exergoeconomic model, which represented the productive structure of the system considered, was used to visualize the cost formation process and the productive interaction between components. The computer program developed in this study can determine the production costs of power plants, such as gas- and steam-turbine plants and gas-turbine cogeneration plants. The program can also be used to study plant characteristics, namely thermodynamic performance and sensitivity to changes in process and/or component design variables
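
    The component-level balances referred to above follow the standard exergy-rate form written below; this is general textbook notation, not the authors' specific model equations.

        \dot{E}_{D,k} = \sum_j \left(1-\frac{T_0}{T_j}\right)\dot{Q}_j - \dot{W}_k + \sum_{\mathrm{in}} \dot{m}\,e - \sum_{\mathrm{out}} \dot{m}\,e, \qquad e = (h-h_0) - T_0\,(s-s_0)

    Here Ė_D,k is the exergy destruction rate in component k, T_0 the dead-state temperature, and e the specific flow exergy (kinetic and potential contributions neglected); the accompanying exergetic cost balance then assigns a unit cost to each exergy stream so that costs are conserved across every component.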

  4. Finite element analyses of a linear-accelerator electron gun

    Science.gov (United States)

    Iqbal, M.; Wasy, A.; Islam, G. U.; Zhou, Z.

    2014-02-01

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator, electron gun, were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in computer aided three-dimensional interactive application for finite element analyses through ANSYS workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun is operating continuously since commissioning without any thermal induced failures for the BEPCII linear accelerator.

  5. Finite element analyses of a linear-accelerator electron gun

    Energy Technology Data Exchange (ETDEWEB)

    Iqbal, M., E-mail: muniqbal.chep@pu.edu.pk, E-mail: muniqbal@ihep.ac.cn [Centre for High Energy Physics, University of the Punjab, Lahore 45590 (Pakistan); Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); Wasy, A. [Department of Mechanical Engineering, Changwon National University, Changwon 641773 (Korea, Republic of); Islam, G. U. [Centre for High Energy Physics, University of the Punjab, Lahore 45590 (Pakistan); Zhou, Z. [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China)

    2014-02-15

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator, electron gun, were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in computer aided three-dimensional interactive application for finite element analyses through ANSYS workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun is operating continuously since commissioning without any thermal induced failures for the BEPCII linear accelerator.

  6. Finite element analyses of a linear-accelerator electron gun

    International Nuclear Information System (INIS)

    Iqbal, M.; Wasy, A.; Islam, G. U.; Zhou, Z.

    2014-01-01

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator, electron gun, were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in computer aided three-dimensional interactive application for finite element analyses through ANSYS workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun is operating continuously since commissioning without any thermal induced failures for the BEPCII linear accelerator

  7. Seven Years after the Manifesto: Literature Review and Research Directions for Technologies in Animal Computer Interaction

    Directory of Open Access Journals (Sweden)

    Ilyena Hirskyj-Douglas

    2018-06-01

    Full Text Available As technologies diversify and become embedded in everyday lives, the technologies we expose to animals, and the new technologies being developed for animals within the field of Animal Computer Interaction (ACI), are increasing. As we approach seven years since the ACI manifesto, which grounded the field within Human Computer Interaction and Computer Science, this thematic literature review looks at the technologies developed for (non-human) animals. Technologies that are analysed include tangible and physical, haptic and wearable, olfactory, screen technology and tracking systems. The conversation explores what exactly ACI is whilst questioning what it means to be animal by considering the impact and loop between machine and animal interactivity. The findings of this review are expected to form the first grounding foundation of ACI technologies informing future research in animal computing as well as suggesting future areas for exploration.

  8. Examination of concept of next generation computer. Progress report 1999

    Energy Technology Data Exchange (ETDEWEB)

    Higuchi, Kenji; Hasegawa, Yukihiro; Hirayama, Toshio

    2000-12-01

    The Center for Promotion of Computational Science and Engineering has conducted R and D work on parallel processing technology and started the examination of the next generation computer in 1999. This report describes behavior analyses of quantum calculation codes. It also describes the considerations drawn from these analyses and the results of examining methods to reduce cache misses. Furthermore, it describes a performance simulator that is being developed to quantitatively examine the concept of the next generation computer. (author)

  9. Use of the modal superposition technique for piping system blowdown analyses

    International Nuclear Information System (INIS)

    Ware, A.G.; Macek, R.W.

    1983-01-01

    A standard method of solving for the seismic response of piping systems is the modal superposition technique. Only a limited number of structural modes are considered (typically those up to 33 Hz in the U.S.), since the effect on the calculated response due to higher modes is generally small, and the method can result in considerable computer cost savings over the direct integration method. The modal superposition technique has also been applied to piping response problems in which the forcing functions are due to fluid excitation. Application of the technique to this case is somewhat more difficult, because a well defined cutoff frequency for determining structural modes to be included has not been established. This paper outlines a method for higher mode corrections, and suggests methods to determine suitable cutoff frequencies for piping system blowdown analyses. A numerical example illustrates how uncorrected modal superposition results can produce erroneous stress results
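
    For reference, the modal superposition referred to above expands the piping response in the mode shapes and solves one uncoupled equation per retained mode; the equations below are the standard structural-dynamics form, not ones reproduced from the paper.

        \mathbf{u}(t) = \sum_{i=1}^{n} \boldsymbol{\phi}_i\, q_i(t), \qquad \ddot{q}_i + 2\zeta_i\omega_i\dot{q}_i + \omega_i^2 q_i = \frac{\boldsymbol{\phi}_i^{\mathrm T}\mathbf{F}(t)}{\boldsymbol{\phi}_i^{\mathrm T}\mathbf{M}\boldsymbol{\phi}_i}

    Truncating the sum at a cutoff frequency is what creates the need for the higher-mode (missing-mass) correction discussed in the paper, since fluid blowdown forcing functions can contain significant energy above the frequencies of the retained modes.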

  10. Structural integrity analyses: can we manage the advances?

    International Nuclear Information System (INIS)

    Sauve, R.

    2006-01-01

    Engineering has been one of a number of disciplines in which significant advances in analysis procedures has taken place in the last two decades. In particular, advances in computer technology and engineering software have revolutionized the assessment of component structural integrity for a wide range of applications. A significant development in computational mechanics directly related to computer technology that has had a profound impact on the field of structural integrity is the finite element method. The finite element method has re-defined and expanded the role of structural integrity assessments by providing comprehensive modelling capabilities to engineers involved in design and failure analyses. As computer processing speeds and capacity have increased, so has the role of computer modelling in assessments of component structural integrity. With new product development cycles shrinking, the role of initial testing is being reduced in favour of computer modelling and simulation to assess component life and durability. For ageing structures, the evaluation of remaining life and the impact of degraded structural integrity becomes tractable with the modern advances in computational methods. The areas of structural integrity that have derived great benefit from the advances in numerical techniques include stress analysis, fracture mechanics, dynamics, heat transfer, structural reliability, probabilistic methods and continuum mechanics in general. One of the salient features of the current methods is the ability to handle large complex steady state or transient dynamic problems that exhibit highly non-linear behaviour. With the ever-increasing usage of these advanced methods, the question is posed: Can we manage the advances? Better still are we managing the advances? As with all technological advances that enter mainstream use, comes the need for education, training and certification in the application of these methods, improved quality assurance procedures and

  11. Computer control of shielded cell operations

    International Nuclear Information System (INIS)

    Jeffords, W.R. III.

    1987-01-01

    This paper describes in detail a computer system to remotely control shielded cell operations. System hardware, software, and design criteria are discussed. We have designed a computer-controlled buret that provides a tenfold improvement over the buret currently in service. A computer also automatically controls cell analyses, calibrations, and maintenance. This system improves conditions for the operators by providing a safer, more efficient working environment and is expandable for future growth and development

  12. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn

    2009-01-01

    An increased interest in the notion of place has evolved in interaction design based on the proliferation of wireless infrastructures, developments in digital media, and a ‘spatial turn’ in computing. In this article, place-specific computing is suggested as a genre of interaction design that addresses the shaping of interactions among people, place-specific resources and global socio-technical networks, mediated by digital technology, and influenced by the structuring conditions of place. The theoretical grounding for place-specific computing is located in the meeting between conceptions of place in human geography and recent research in interaction design focusing on embodied interaction. Central themes in this grounding revolve around place and its relation to embodiment and practice, as well as the social, cultural and material aspects conditioning the enactment of place. Selected...

  13. Break spectrum analyses for small break loss of coolant accidents in a RESAR-3S Plant

    International Nuclear Information System (INIS)

    Fletcher, C.D.; Kullberg, C.M.

    1986-03-01

    A series of thermal-hydraulic analyses was performed to investigate phenomena occurring during small break loss-of-coolant-accident (LOCA) sequences in a RESAR-3S pressurized water reactor. The analysis included simulations of plant behavior using the TRAC-PF1 and RELAP5/MOD2 computer codes. A series of calculations was performed using both codes for different break sizes. The analyses presented here also served an audit function in that the results shown here were used by the US Nuclear Regulatory Commission (NRC) as an independent confirmation of similar analyses performed by Westinghouse Electric Company using another computer code. 10 refs., 62 figs., 14 tabs

  14. Computer Skills Training and Readiness to Work with Computers

    Directory of Open Access Journals (Sweden)

    Arnon Hershkovitz

    2016-05-01

    Full Text Available In today’s job market, computer skills are part of the prerequisites for many jobs. In this paper, we report on a study of readiness to work with computers (the dependent variable) among unemployed women (N=54) after participating in a unique, web-supported training focused on computer skills and empowerment. Overall, the level of participants’ readiness to work with computers was much higher at the end of the course than it was at its beginning. During the analysis, we explored associations between this variable and variables from four categories: log-based (describing the online activity); computer literacy and experience; job-seeking motivation and practice; and training satisfaction. Only two variables were associated with the dependent variable: knowledge post-test duration and satisfaction with content. After building a prediction model for the dependent variable, another log-based variable was highlighted: the total number of actions in the course website throughout the course. Overall, our analyses shed light on the predominance of log-based variables over variables from other categories. These findings might hint at the need to develop new assessment tools for learners and trainees that take into consideration human-computer interaction when measuring self-efficacy variables.

  15. Omeups: an interactive graphics program for analysing collision data

    International Nuclear Information System (INIS)

    Burgess, A.; Mason, H.E.; Tully, J.A.

    1991-01-01

    The aim of the micro-computer program OMEUPS is to provide a simple means of critically assessing and compacting collision strength data for electron impact excitation of positive ions. The program is interactive and allows data to be analysed graphically: it should be of particular interest to astrophysicists as well as to those specialising in atomic physics. The method on which the program is based allows one to interpolate or extrapolate existing data in energy and temperature; store data in compact form without losing significant information; perform Maxwell averaging; detect printing and computational errors in tabulated data
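
    The Maxwell averaging step mentioned above is, in the usual convention for electron-impact excitation work, the computation of the effective collision strength; the expression below is the standard textbook form rather than anything taken from the OMEUPS documentation:

    ```latex
    % Effective (Maxwell-averaged) collision strength for a transition i -> j,
    % where Omega_ij is the collision strength, E_j the final energy of the
    % scattered electron, and T the electron temperature.
    \Upsilon_{ij}(T) = \int_{0}^{\infty} \Omega_{ij}(E_j)\,
      \exp\!\left(-\frac{E_j}{kT}\right)\, \mathrm{d}\!\left(\frac{E_j}{kT}\right)
    ```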

  16. Beyond the Computer Literacy.

    Science.gov (United States)

    Streibel, Michael J.; Garhart, Casey

    1985-01-01

    Describes the approach taken in an education computing course for pre- and in-service teachers. Outlines the basic operational, analytical, and evaluation skills that are emphasized in the course, suggesting that these skills go beyond the attainment of computer literacy and can assist in the effective use of computers. (ML)

  17. Hypnosis, suggestion, and suggestibility: an integrative model.

    Science.gov (United States)

    Lynn, Steven Jay; Laurence, Jean-Roch; Kirsch, Irving

    2015-01-01

    This article elucidates an integrative model of hypnosis that integrates social, cultural, cognitive, and neurophysiological variables at play both in and out of hypnosis and considers their dynamic interaction as determinants of the multifaceted experience of hypnosis. The roles of these variables are examined in the induction and suggestion stages of hypnosis, including how they are related to the experience of involuntariness, one of the hallmarks of hypnosis. It is suggested that studies of the modification of hypnotic suggestibility; cognitive flexibility; response sets and expectancies; the default-mode network; and the search for the neurophysiological correlates of hypnosis, more broadly, in conjunction with research on social psychological variables, hold much promise to further understanding of hypnosis.

  18. The Dynamic Geometrisation of Computer Programming

    Science.gov (United States)

    Sinclair, Nathalie; Patterson, Margaret

    2018-01-01

    The goal of this paper is to explore dynamic geometry environments (DGE) as a type of computer programming language. Using projects created by secondary students in one particular DGE, we analyse the extent to which the various aspects of computational thinking--including both ways of doing things and particular concepts--were evident in their…

  19. Analyses of Receptive and Productive Korean EFL Vocabulary: Computer-Based Vocabulary Learning Program

    Science.gov (United States)

    Kim, Scott Sungki

    2013-01-01

    The present research study investigated the effects of 8 versions of a computer-based vocabulary learning program on receptive and productive knowledge levels of college students. The participants were 106 male and 103 female Korean EFL students from Kyungsung University and Kwandong University in Korea. Students who participated in versions of…

  20. Progress Report on Computational Analyses of Water-Based NSTF

    Energy Technology Data Exchange (ETDEWEB)

    Lv, Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Kraus, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Lisowski, D. [Argonne National Lab. (ANL), Argonne, IL (United States); Nunez, D. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-01

    CFD analysis has been focused on important component-level phenomena using STARCCM+ to supplement the system analysis of integral system behavior. A notable area of interest was the cavity region. This area is of particular interest for CFD analysis due to the multi-dimensional flow and complex heat transfer (thermal radiation heat transfer and natural convection), which are not simulated directly by RELAP5. CFD simulations allow for the estimation of the boundary heat flux distribution along the riser tubes, which is needed in the RELAP5 simulations. The CFD results can also provide additional data to help establish what level of modeling detail is necessary in RELAP5. It was found that the flow profiles in the cavity region are simpler for the water-based concept than for the air-cooled concept. The local heat flux noticeably increases axially, and is higher in the fins than in the riser tubes. These results were utilized in RELAP5 simulations as boundary conditions, to provide better temperature predictions in the system-level analyses. It was also determined that temperatures were higher in the fins than in the riser tubes, but within design limits for thermal stresses. Higher temperature predictions were identified in the edge fins, in part due to additional thermal radiation from the side cavity walls.

  1. Comparative analyses suggest that information transfer promoted sociality in male bats in the temperate zone.

    Science.gov (United States)

    Safi, Kamran; Kerth, Gerald

    2007-09-01

    The evolution of sociality is a central theme in evolutionary biology. The vast majority of bats are social, which has been explained in terms of the benefits of communal breeding. However, the causes for segregated male groups remain unknown. In a comparative study, we tested whether diet and morphological adaptations to specific foraging styles, two factors known to influence the occurrence of information transfer, can predict male sociality. Our results suggest that the species most likely to benefit from information transfer--namely, those preying on ephemeral insects and with morphological adaptations to feeding in open habitat--are more likely to form male groups. Our findings also indicate that solitary life was the ancestral state of males and sociality evolved in several lineages. Beyond their significance for explaining the existence of male groups in bats, our findings highlight the importance of information transfer in the evolution of animal sociality.

  2. Seismic response analyses for reactor facilities at Savannah River

    International Nuclear Information System (INIS)

    Miller, C.A.; Costantino, C.J.; Xu, J.

    1991-01-01

    The reactor facilities at the Savannah River Plant (SRP) were designed during the 1950's. The original seismic criterion defining the input ground motion was 0.1 G, with UBC [uniform building code] provisions used to evaluate structural seismic loads. Later ground motion criteria have defined the free field seismic motion with a 0.2 G ZPA [zero period acceleration] and various spectral shapes. The spectral shapes have included the Housner spectra, a site-specific spectrum, and the US NRC [Nuclear Regulatory Commission] Reg. Guide 1.60 shape. The development of these free field seismic criteria is discussed in the paper. The more recent seismic analyses have been of the following types: fixed base response spectra, frequency independent lumped parameter soil/structure interaction (SSI), frequency dependent lumped parameter SSI, and current state of the art analyses using computer codes such as SASSI. The results from these computations consist of structural loads and floor response spectra (used for piping and equipment qualification). These results are compared in the paper and the methods used to validate the results are discussed. 14 refs., 11 figs

  3. Computer aided system engineering for space construction

    Science.gov (United States)

    Racheli, Ugo

    1989-01-01

    This viewgraph presentation covers the following topics. Construction activities envisioned for the assembly of large platforms in space (as well as interplanetary spacecraft and bases on extraterrestrial surfaces) require computational tools that exceed the capability of conventional construction management programs. The Center for Space Construction is investigating the requirements for new computational tools and, at the same time, suggesting the expansion of graduate and undergraduate curricula to include proficiency in Computer Aided Engineering (CAE) through design courses and individual or team projects in advanced space systems design. In the center's research, special emphasis is placed on problems of constructability and of the interruptability of planned activity sequences to be carried out by crews operating under hostile environmental conditions. The departure point for the planned work is the acquisition of the MCAE I-DEAS software, developed by the Structural Dynamics Research Corporation (SDRC), and its expansion to the level of capability denoted by the acronym IDEAS**2 currently used for configuration maintenance on Space Station Freedom. In addition to improving proficiency in the use of I-DEAS and IDEAS**2, it is contemplated that new software modules will be developed to expand the architecture of IDEAS**2. Such modules will deal with those analyses that require the integration of a space platform's configuration with a breakdown of planned construction activities and with a failure modes analysis to support computer aided system engineering (CASE) applied to space construction.

  4. Association of achondroplasia with Down syndrome: difficulty in prenatal diagnosis by sonographic and 3-D helical computed tomographic analyses.

    Science.gov (United States)

    Kaga, Akimune; Murotsuki, Jun; Kamimura, Miki; Kimura, Masato; Saito-Hakoda, Akiko; Kanno, Junko; Hoshi, Kazuhiko; Kure, Shigeo; Fujiwara, Ikuma

    2015-05-01

    Achondroplasia and Down syndrome are relatively common conditions individually, but co-occurrence of both conditions in the same patient is rare, and there have been no reports of fetal analysis of this combination by prenatal sonography and three-dimensional (3-D) helical computed tomography (CT). Prenatal sonographic findings seen in persons with Down syndrome, such as a thickened nuchal fold, cardiac defects, and echogenic bowel, were not found in the patient. A prenatal 3-D helical CT revealed a large head with frontal bossing, metaphyseal flaring of the long bones, and small iliac wings, which suggested achondroplasia. In a case with a combination of achondroplasia and Down syndrome, it may be difficult to diagnose the co-occurrence prenatally without typical markers of Down syndrome. © 2014 Japanese Teratology Society.

  5. Computer Games in Pre-School Settings: Didactical Challenges when Commercial Educational Computer Games Are Implemented in Kindergartens

    Science.gov (United States)

    Vangsnes, Vigdis; Gram Okland, Nils Tore; Krumsvik, Rune

    2012-01-01

    This article focuses on the didactical implications when commercial educational computer games are used in Norwegian kindergartens by analysing the dramaturgy and the didactics of one particular game and the game in use in a pedagogical context. Our justification for analysing the game by using dramaturgic theory is that we consider the game to be…

  6. Description of mathematical models and computer programs

    International Nuclear Information System (INIS)

    1977-01-01

    The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules, etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives

  7. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  8. Analysis of existing risk assessments, and list of suggestions

    CERN Document Server

    Heimsch, Laura

    2016-01-01

    The scope of this project was to analyse risk assessments made at CERN and extract some crucial information about the different methodologies used and the profiles of the people who make the risk assessments, and to gather information on whether the risk matrix was used and whether the acceptable level of risk was defined. The second step of the project was to trigger discussion inside HSE about risk assessment by suggesting a risk matrix and a risk assessment template.

  9. [Results of the marketing research study "Acceptance of physician's office computer systems"].

    Science.gov (United States)

    Steinhausen, D; Brinkmann, F; Engelhard, A

    1998-01-01

    We report on a market research study on the acceptance of computer systems in surgeries. 11,000 returned questionnaires from surgeons--users and nonusers--were analysed. We found that most of the surgeons used their computers in a limited way, i.e. as a device for accounting. Concerning the level of utilisation, there are differences between men and women, west and east, and young and old. In this study we also analysed the computer-use behaviour of gynaecologic surgeons. As a result, two thirds of all nonusers do not intend to utilise a computer in the future.

  10. Accident and safety analyses for the HTR-modul. Partial project 1: Computer codes for system behaviour calculation. Final report. Pt. 2

    International Nuclear Information System (INIS)

    Lohnert, G.; Becker, D.; Dilcher, L.; Doerner, G.; Feltes, W.; Gysler, G.; Haque, H.; Kindt, T.; Kohtz, N.; Lange, L.; Ragoss, H.

    1993-08-01

    The project encompasses the following project tasks and problems: (1) Studies relating to complete failure of the main heat transfer system; (2) Pebble flow; (3) Development of computer codes for detailed calculation of hypothetical accidents: (a) the THERMIX/RZKRIT temperature buildup code (covering, among other things, a variation to include exothermal heat sources); (b) the REACT/THERMIX corrosion code (variation taking into account extremely severe air ingress into the primary loop); (c) the GRECO corrosion code (variation for treating extremely severe water ingress into the primary loop); (d) the KIND transients code (for treating extremely fast transients during reactivity incidents). (4) Limiting devices for safety-relevant quantities. (5) Analyses relating to hypothetical accidents: (a) hypothetical air ingress; (b) effects on the fuel particles induced by fast transients. The problems of the various tasks are defined in detail and the main results obtained are explained. The contributions reporting the various project tasks and activities have been prepared for separate retrieval from the database. (orig./HP) [de

  11. Accident and safety analyses for the HTR-modul. Partial project 1: Computer codes for system behaviour calculation. Final report. Pt. 1

    International Nuclear Information System (INIS)

    Lohnert, G.; Becker, D.; Dilcher, L.; Doerner, G.; Feltes, W.; Gysler, G.; Haque, H.; Kindt, T.; Kohtz, N.; Lange, L.; Ragoss, H.

    1993-08-01

    The project encompasses the following project tasks and problems: (1) Studies relating to complete failure of the main heat transfer system; (2) Pebble flow; (3) Development of computer codes for detailed calculation of hypothetical accidents: (a) the THERMIX/RZKRIT temperature buildup code (covering, among other things, a variation to include exothermal heat sources); (b) the REACT/THERMIX corrosion code (variation taking into account extremely severe air ingress into the primary loop); (c) the GRECO corrosion code (variation for treating extremely severe water ingress into the primary loop); (d) the KIND transients code (for treating extremely fast transients during reactivity incidents). (4) Limiting devices for safety-relevant quantities. (5) Analyses relating to hypothetical accidents: (a) hypothetical air ingress; (b) effects on the fuel particles induced by fast transients. The problems of the various tasks are defined in detail and the main results obtained are explained. The contributions reporting the various project tasks and activities have been prepared for separate retrieval from the database. (orig./HP) [de

  12. Application of computers in a Radiological Survey Program

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.; Doane, R.W.; Little, C.A.; Perdue, P.T.

    1984-01-01

    A brief description of some of the applications of computers in a radiological survey program is presented. It has been our experience that computers and computer software have allowed our staff to use their time more productively by letting computers perform the mechanical acquisition, analyses, and storage of data. It is hoped that other organizations may similarly profit from this experience. This effort will ultimately minimize errors and reduce program costs

  13. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  14. Elastic meson-nucleon partial wave scattering analyses

    International Nuclear Information System (INIS)

    Arndt, R.A.

    1986-01-01

    Comprehensive analyses of π-n elastic scattering data below 1100 MeV(Tlab) and K+p scattering below 3 GeV/c(Plab) are discussed. Also discussed is a package of computer programs and data bases (scattering data and solution files) through which users can "explore" these interactions in great detail; this package is known by the acronym SAID (for Scattering Analysis Interactive Dialin) and is accessible on VAX backup tapes, or by dialin to the VPI computers. The π-n and K+p interactions will be described as seen through the SAID programs. A procedure will be described for generating an interpolating array from any of the solutions encoded in SAID; this array can then be used through a Fortran-callable subroutine (supplied as part of SAID) to give excellent amplitude reconstructions over a broad kinematic range

  15. Sentinel-1 data massive processing for large scale DInSAR analyses within Cloud Computing environments through the P-SBAS approach

    Science.gov (United States)

    Lanari, Riccardo; Bonano, Manuela; Buonanno, Sabatino; Casu, Francesco; De Luca, Claudio; Fusco, Adele; Manunta, Michele; Manzo, Mariarosaria; Pepe, Antonio; Zinno, Ivana

    2017-04-01

    -core programming techniques. Currently, Cloud Computing environments make available large collections of computing resources and storage that can be effectively exploited through the presented S1 P-SBAS processing chain to carry out interferometric analyses at a very large scale, in reduced time. This also allows us to deal with the problems connected to the use of the S1 P-SBAS chain in operational contexts, related to hazard monitoring and risk prevention and mitigation, where handling large amounts of data represents a challenging task. As a significant experimental result we performed a large spatial scale SBAS analysis relevant to Central and Southern Italy by exploiting the Amazon Web Services Cloud Computing platform. In particular, we processed in parallel 300 S1 acquisitions covering the Italian peninsula from Lazio to Sicily through the presented S1 P-SBAS processing chain, generating 710 interferograms, thus finally obtaining the displacement time series of the whole processed area. This work has been partially supported by the CNR-DPC agreement, the H2020 EPOS-IP project (GA 676564) and the ESA GEP project.

  16. Analysis of accidents in uranium mines and suggestions on safety in production

    International Nuclear Information System (INIS)

    Xue Shiqian.

    1989-01-01

    The serious and fatal accidents that have happened in uranium mines in China are described and analysed based on their classification, causes, the age of the deceased, and the economic losses brought by the accidents. Suggestions on safety in production are also presented

  17. Effect size calculation in meta-analyses of psychotherapy outcome research.

    Science.gov (United States)

    Hoyt, William T; Del Re, A C

    2018-05-01

    Meta-analysis of psychotherapy intervention research normally examines differences between treatment groups and some form of comparison group (e.g., wait list control; alternative treatment group). The effect of treatment is normally quantified as a standardized mean difference (SMD). We describe procedures for computing unbiased estimates of the population SMD from sample data (e.g., group Ms and SDs), and provide guidance about a number of complications that may arise related to effect size computation. These complications include (a) incomplete data in research reports; (b) use of baseline data in computing SMDs and estimating the population standard deviation (σ); (c) combining effect size data from studies using different research designs; and (d) appropriate techniques for analysis of data from studies providing multiple estimates of the effect of interest (i.e., dependent effect sizes). Clinical or Methodological Significance of this article: Meta-analysis is a set of techniques for producing valid summaries of existing research. The initial computational step for meta-analyses of research on intervention outcomes involves computing an effect size quantifying the change attributable to the intervention. We discuss common issues in the computation of effect sizes and provide recommended procedures to address them.
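
    As a concrete illustration of the initial computational step described above, the sketch below computes a bias-corrected standardized mean difference (Hedges' g) and its approximate sampling variance from group means, SDs and sample sizes. These are the standard small-sample-corrected SMD formulas, not code from the article, and the function name is illustrative.

    ```python
    import math

    def hedges_g(m_t, sd_t, n_t, m_c, sd_c, n_c):
        """Bias-corrected standardized mean difference (Hedges' g) and its variance.

        m_*, sd_*, n_* are the mean, standard deviation and sample size of the
        treatment (t) and comparison (c) groups reported in a primary study.
        """
        # Pooled within-group standard deviation
        sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
        d = (m_t - m_c) / sp                            # Cohen's d (biased upward in small samples)
        j = 1.0 - 3.0 / (4.0 * (n_t + n_c - 2) - 1.0)   # small-sample correction factor
        g = j * d
        # Approximate sampling variance of g, used later for inverse-variance weighting
        var_g = j**2 * ((n_t + n_c) / (n_t * n_c) + d**2 / (2.0 * (n_t + n_c)))
        return g, var_g

    # Hypothetical example: treatment M=12.3, SD=4.1, n=25; wait-list M=10.0, SD=4.4, n=27
    print(hedges_g(12.3, 4.1, 25, 10.0, 4.4, 27))
    ```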

  18. Prognostic significance of tumor size of small lung adenocarcinomas evaluated with mediastinal window settings on computed tomography.

    Directory of Open Access Journals (Sweden)

    Yukinori Sakao

    .60, 0.81, 0.81 and 0.65 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. CONCLUSIONS: According to the univariate analyses including a logistic regression and ROCs performed for variables with p-values of <0.05 on univariate analyses, our results suggest that measuring tumour size using mediastinal window on high-resolution computed tomography is a simple and useful preoperative prognosis modality in small adenocarcinoma.

  19. Prognostic Significance of Tumor Size of Small Lung Adenocarcinomas Evaluated with Mediastinal Window Settings on Computed Tomography

    Science.gov (United States)

    Sakao, Yukinori; Kuroda, Hiroaki; Mun, Mingyon; Uehara, Hirofumi; Motoi, Noriko; Ishikawa, Yuichi; Nakagawa, Ken; Okumura, Sakae

    2014-01-01

    .81 and 0.65 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Conclusions According to the univariate analyses including a logistic regression and ROCs performed for variables with p-values of <0.05 on univariate analyses, our results suggest that measuring tumour size using mediastinal window on high-resolution computed tomography is a simple and useful preoperative prognosis modality in small adenocarcinoma. PMID:25365326

  20. Coalescent Modelling Suggests Recent Secondary-Contact of Cryptic Penguin Species.

    Science.gov (United States)

    Grosser, Stefanie; Burridge, Christopher P; Peucker, Amanda J; Waters, Jonathan M

    2015-01-01

    Molecular genetic analyses present powerful tools for elucidating demographic and biogeographic histories of taxa. Here we present genetic evidence showing a dynamic history for two cryptic lineages within Eudyptula, the world's smallest penguin. Specifically, we use a suite of genetic markers to reveal that two congeneric taxa ('Australia' and 'New Zealand') co-occur in southern New Zealand, with only low levels of hybridization. Coalescent modelling suggests that the Australian little penguin only recently expanded into southern New Zealand. Analyses conducted under time-dependent molecular evolutionary rates lend support to the hypothesis of recent anthropogenic turnover, consistent with shifts detected in several other New Zealand coastal vertebrate taxa. This apparent turnover event highlights the dynamic nature of the region's coastal ecosystem.

  1. Computer assisted holographic moire contouring

    Science.gov (United States)

    Sciammarella, Cesar A.

    2000-01-01

    Theoretical analyses and experimental results on holographic moire contouring on diffusely reflecting objects are presented. The sensitivity and limitations of the method are discussed. Particular emphasis is put on computer-assisted data retrieval, processing, and recording.

  2. STEADY-SHIP: a computer code for three-dimensional nuclear and thermal-hydraulic analyses of marine reactors

    International Nuclear Information System (INIS)

    Itagaki, Masafumi; Naito, Yoshitaka; Tokuno, Yukio; Matsui, Yasushi.

    1988-01-01

    A code STEADY-SHIP has been developed to calculate three-dimensional distributions of neutron flux, power and coolant temperature in the reactor core of the nuclear ship MUTSU. The code consists of two parts, that is, a few-group three-dimensional neutron diffusion module DIFFUSION-SHIP and a thermal-hydraulic module HYDRO-SHIP: In the DIFFUSION-SHIP the leakage iteration method is used for solving the three-dimensional neutron diffusion equation with small computer core memory and short computing time; The HYDRO-SHIP performs the general thermal-hydraulic calculation for evaluating feedbacks required in the neutronic calculation by the DIFFUSION-SHIP. The macroscopic nuclear constants are generated by a module CROSS-SHIP as functions of xenon poison, fuel temperature, moderator temperature and moderator density. A module LOCAL-FINE has the capability of computing a detailed rod power distribution for each local node in the core, using the boundary conditions on the surface of the node which were supplied by the STEADY-SHIP whole-core calculation. The applicability of this code to marine reactors has been demonstrated by comparing the computed results with the data measured during the MUTSU land-loaded core critical experiments and with the data obtained during the hot-zero-power tests performed for the actual MUTSU plant. (author)
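
    For context, the few-group problem solved by a diffusion module of this kind has the generic steady-state eigenvalue form below; this is standard reactor-physics notation, not an excerpt from the STEADY-SHIP documentation:

    ```latex
    % Multigroup neutron diffusion equation for energy group g
    % (D_g: diffusion coefficient, Sigma_{r,g}: removal cross section,
    %  chi_g: fission spectrum, k_eff: effective multiplication factor)
    -\nabla\!\cdot\!\left(D_g \nabla \phi_g\right) + \Sigma_{r,g}\,\phi_g
      = \sum_{g' \neq g} \Sigma_{s,\,g' \to g}\,\phi_{g'}
      + \frac{\chi_g}{k_{\mathrm{eff}}} \sum_{g'} \nu\Sigma_{f,g'}\,\phi_{g'}
    ```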

  3. Neural Spike-Train Analyses of the Speech-Based Envelope Power Spectrum Model

    Science.gov (United States)

    Rallapalli, Varsha H.

    2016-01-01

    Diagnosing and treating hearing impairment is challenging because people with similar degrees of sensorineural hearing loss (SNHL) often have different speech-recognition abilities. The speech-based envelope power spectrum model (sEPSM) has demonstrated that the signal-to-noise ratio (SNRENV) from a modulation filter bank provides a robust speech-intelligibility measure across a wider range of degraded conditions than many long-standing models. In the sEPSM, noise (N) is assumed to: (a) reduce S + N envelope power by filling in dips within clean speech (S) and (b) introduce an envelope noise floor from intrinsic fluctuations in the noise itself. While the promise of SNRENV has been demonstrated for normal-hearing listeners, it has not been thoroughly extended to hearing-impaired listeners because of limited physiological knowledge of how SNHL affects speech-in-noise envelope coding relative to noise alone. Here, envelope coding to speech-in-noise stimuli was quantified from auditory-nerve model spike trains using shuffled correlograms, which were analyzed in the modulation-frequency domain to compute modulation-band estimates of neural SNRENV. Preliminary spike-train analyses show strong similarities to the sEPSM, demonstrating feasibility of neural SNRENV computations. Results suggest that individual differences can occur based on differential degrees of outer- and inner-hair-cell dysfunction in listeners currently diagnosed into the single audiological SNHL category. The predicted acoustic-SNR dependence in individual differences suggests that the SNR-dependent rate of susceptibility could be an important metric in diagnosing individual differences. Future measurements of the neural SNRENV in animal studies with various forms of SNHL will provide valuable insight for understanding individual differences in speech-in-noise intelligibility.
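
    The modulation-band envelope signal-to-noise ratio referred to above is commonly defined from the envelope power of the noisy speech and of the noise alone; one common formulation (a generic statement of the metric, not the neural estimator used in this study) is:

    ```latex
    % Envelope SNR in modulation band k, combined across modulation bands
    \mathrm{SNR}_{\mathrm{env},k}
      = \frac{P_{\mathrm{env},\,S+N,\,k} - P_{\mathrm{env},\,N,\,k}}{P_{\mathrm{env},\,N,\,k}},
    \qquad
    \mathrm{SNR}_{\mathrm{env}} = \left(\sum_{k} \mathrm{SNR}_{\mathrm{env},k}^{\,2}\right)^{1/2}
    ```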

  4. Computed Tomography and Computed Radiography of late Bronze Age Cremation Urns from Denmark

    DEFF Research Database (Denmark)

    Harvig, Lise Lock; Lynnerup, Niels; Amsgaard Ebsen, Jannie

    2012-01-01

    To improve methods used to study prehistoric cremation rituals, cremation urns from the Danish late Bronze Age were examined using Computed Tomography and Computed Radiography (Digital X-ray). During microexcavation, the digital images were used as a registration tool. Our results suggest...

  5. Computational and experimental investigation of dynamic shock reflection phenomena

    CSIR Research Space (South Africa)

    Naidoo, K

    2007-07-01

    Full Text Available wedge are used to analyse dynamic flow field phenomena and response of the triple point below and within the dual solution domain. Computed, unsteady pressure traces on the reflection plane are also analysed...

  6. Bayesian computation with R

    CERN Document Server

    Albert, Jim

    2009-01-01

    There has been a dramatic growth in the development and application of Bayesian inferential methods. Some of this growth is due to the availability of powerful simulation-based algorithms to summarize posterior distributions. There has been also a growing interest in the use of the system R for statistical analyses. R's open source nature, free availability, and large number of contributor packages have made R the software of choice for many statisticians in education and industry. Bayesian Computation with R introduces Bayesian modeling by the use of computation using the R language. The earl

  7. Reuse, Recycle, Reweigh: Combating Influenza through Efficient Sequential Bayesian Computation for Massive Data

    OpenAIRE

    Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.

    2010-01-01

    Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, of...

  8. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F9--F16 -- Volume 2, Part 2, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    West, J.T.; Hoffman, T.J.; Emmett, M.B.; Childs, K.W.; Petrie, L.M.; Landers, N.F.; Bryan, C.B.; Giles, G.E. [Oak Ridge National Lab., TN (United States)

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries. This volume discusses the following functional modules: MORSE-SGC; HEATING 7.2; KENO V.a; JUNEBUG-II; HEATPLOT-S; REGPLOT 6; PLORIGEN; and OCULAR.

  9. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F9--F16 -- Volume 2, Part 2, Revision 4

    International Nuclear Information System (INIS)

    West, J.T.; Hoffman, T.J.; Emmett, M.B.; Childs, K.W.; Petrie, L.M.; Landers, N.F.; Bryan, C.B.; Giles, G.E.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries. This volume discusses the following functional modules: MORSE-SGC; HEATING 7.2; KENO V.a; JUNEBUG-II; HEATPLOT-S; REGPLOT 6; PLORIGEN; and OCULAR

  10. Restructuring the CS 1 classroom: Examining the effect of open laboratory-based classes vs. closed laboratory-based classes on Computer Science 1 students' achievement and attitudes toward computers and computer courses

    Science.gov (United States)

    Henderson, Jean Foster

    The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006--2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. The results suggest that classroom structures that

  11. Computer code qualification program for the Advanced CANDU Reactor

    International Nuclear Information System (INIS)

    Popov, N.K.; Wren, D.J.; Snell, V.G.; White, A.J.; Boczar, P.G.

    2003-01-01

    Atomic Energy of Canada Ltd (AECL) has developed and implemented a Software Quality Assurance program (SQA) to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. This paper provides an overview of the computer programs used in Advanced CANDU Reactor (ACR) safety analysis, and assessment of their applicability in the safety analyses of the ACR design. An outline of the incremental validation program, and an overview of the experimental program in support of the code validation are also presented. An outline of the SQA program used to qualify these computer codes is also briefly presented. To provide context to the differences in the SQA with respect to current CANDUs, the paper also provides an overview of the ACR design features that have an impact on the computer code qualification. (author)

  12. Computer modeling of flow induced in-reactor vibrations

    International Nuclear Information System (INIS)

    Turula, P.; Mulcahy, T.M.

    1977-01-01

    An assessment of the reliability of finite element method computer models, as applied to the computation of flow induced vibration response of components used in nuclear reactors, is presented. The prototype under consideration was the Fast Flux Test Facility reactor being constructed for US-ERDA. Data were available from an extensive test program which used a scale model simulating the hydraulic and structural characteristics of the prototype components, subjected to scaled prototypic flow conditions as well as to laboratory shaker excitations. Corresponding analytical solutions of the component vibration problems were obtained using the NASTRAN computer code. Modal analyses and response analyses were performed. The effect of the surrounding fluid was accounted for. Several possible forcing function definitions were considered. Results indicate that modal computations agree well with experimental data. Response amplitude comparisons are good only under conditions favorable to a clear definition of the structural and hydraulic properties affecting the component motion. 20 refs

  13. Examining Computer Gaming Addiction in Terms of Different Variables

    Science.gov (United States)

    Kurt, Adile Askim; Dogan, Ezgi; Erdogmus, Yasemin Kahyaoglu; Emiroglu, Bulent Gursel

    2018-01-01

    Computer gaming addiction is one of the newer concepts that young generations face and can be defined as the excessive and problematic use of computer games leading to social and/or emotional problems. The purpose of this study is to analyse the computer gaming addiction levels of secondary school students in terms of different variables. The research was…

  14. FEM effective suggestion of guitar construction

    Directory of Open Access Journals (Sweden)

    Vladimír Dániel

    2006-01-01

    Full Text Available Modal analysis of the whole guitar construction was performed and the eigenfrequencies were obtained. Stress in the strings affects not only the static loading of the material but also shifts the eigenfrequencies. From the natural frequencies obtained for the solved spectrum, the frequencies that coincide with the assumed ribs were used and new positions of the ribs were suggested. Other ribs, which do not carry out a mechanical function, were removed. The static reaction was also evaluated and the new position of the ribs was adjusted. For the final model, new eigenfrequencies were computed and compared with the previous ones. Significant changes were revealed at low frequencies (below 400 Hz), where fewer natural shapes were obtained. Approximately 50% were lost by the addition of ribs. For chosen frequencies of equal temperament, a harmonic analysis was performed. The analysis proved the ability to oscillate at frequencies far from the natural frequencies. The final model satisfies the requirement of minimizing the static stress in the material due to the strings and allows very effective oscillation of the top resonance board of the guitar. In comparison with the literature, good agreement was achieved in the amplitude of the front board and the number of modes at the appropriate frequencies. The suggested model even offers a higher number of natural shapes than reported in the literature, namely at high frequencies. From an additional comparison of eigenfrequencies and natural shapes, the influence of rib position on the natural shapes was confirmed.
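
    The analysis described above amounts to a prestressed modal analysis: the string tension enters through a geometric (stress) stiffness term that shifts the natural frequencies. A generic statement of the eigenproblem, in standard finite element notation rather than the authors' formulation, is:

    ```latex
    % Free-vibration eigenproblem of the prestressed guitar model:
    % K - elastic stiffness matrix, K_sigma(T) - geometric stiffness due to
    % string tension T, M - mass matrix, omega_i / phi_i - i-th natural
    % frequency and mode shape
    \left[\, \mathbf{K} + \mathbf{K}_{\sigma}(T) - \omega_i^{2}\,\mathbf{M} \,\right]\boldsymbol{\phi}_i = \mathbf{0}
    ```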

  15. Computational force, mass, and energy

    International Nuclear Information System (INIS)

    Numrich, R.W.

    1997-01-01

    This paper describes a correspondence between computational quantities commonly used to report computer performance measurements and mechanical quantities from classical Newtonian mechanics. It defines a set of three fundamental computational quantities that are sufficient to establish a system of computational measurement. From these quantities, it defines derived computational quantities that have analogous physical counterparts. These computational quantities obey three laws of motion in computational space. The solutions to the equations of motion, with appropriate boundary conditions, determine the computational mass of the computer. Computational forces, with magnitudes specific to each instruction and to each computer, overcome the inertia represented by this mass. The paper suggests normalizing the computational mass scale by picking the mass of a register on the CRAY-1 as the standard unit of mass

  16. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  17. X-ray optical analyses with X-Ray Absorption Package (XRAP)

    International Nuclear Information System (INIS)

    Wang, Zhibi; Kuzay, T.M.; Dejus, R.; Grace, T.

    1994-01-01

    This paper presents an X-Ray Absorption Package (XRAP) and the theoretical background for this program. XRAP is a computer code developed for analysis of optical elements in synchrotron radiation facilities. Two main issues are to be addressed: (1) generating BM (bending magnet) and ID (insertion device) spectrum and calculating their absorption in media, especially in such structural forms as variable thickness windows/filters and crystals; and (2) providing a finite difference engine for fast but sophisticated thermal and stress analyses for optical elements, such as windows and filters. Radiation cooling, temperature-dependent material properties (such as thermal conductivity and thermal expansion coefficient) etc. are taken into account in the analyses. For very complex geometry, an interface is provided directly to finite element codes such as ANSYS. Some of the present features built into XRAP include: (1) generation of BM and ID spectra; (2) photon absorption analysis of optical elements including filters, windows and mirrors, etc.; (3) heat transfer and thermal stress analyses of windows and filters and their buckling check; (4) user-friendly graphical-interface that is based on the state-of-the-art technology of GUI and X-window systems, which can be easily ported to other computer platforms; (5) postscript file output of either black/white or colored graphics for total/absorbed power, temperature, stress, spectra, etc
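
    At its core, the filter/window absorption calculation in item (2) is an exponential-attenuation integral over the incident spectrum. The sketch below shows that step for a single uniform slab using generic Beer-Lambert attenuation; it illustrates the physics only, is not XRAP's actual routines, and all names are assumptions.

    ```python
    import numpy as np

    def slab_absorption(energy_eV, incident_power, mu_linear, thickness_cm):
        """Absorbed and transmitted power for a single slab filter or window.

        energy_eV      : photon energy grid (eV)
        incident_power : incident spectral power on the same grid (W/eV)
        mu_linear      : linear attenuation coefficient of the slab material (1/cm)
        thickness_cm   : slab thickness (cm)
        """
        transmitted = incident_power * np.exp(-mu_linear * thickness_cm)  # Beer-Lambert law
        absorbed = incident_power - transmitted

        def trapz(y, x):  # simple trapezoidal integration over the energy grid
            return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

        # Total absorbed and transmitted power in watts
        return trapz(absorbed, energy_eV), trapz(transmitted, energy_eV)
    ```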

  18. Application of the S_n-method for reactor computations on the BESM-6 computer by using 26-group constants in the sub-group presentation

    International Nuclear Information System (INIS)

    Rogov, A.D.

    1975-01-01

    Description of the computer programs for reactor computation by application of the S_n-method in the two-dimensional XY and RZ geometries is given. These programs are used with the computer library of the 26-group constants system, taking into account the resonance structure of the cross sections in the subgroup presentation. Results of some systems computations are given and the results obtained are analysed. (author)

  19. Computational analyses of synergism in small molecular network motifs.

    Directory of Open Access Journals (Sweden)

    Yili Zhang

    2014-03-01

    Full Text Available Cellular functions and responses to stimuli are controlled by complex regulatory networks that comprise a large diversity of molecular components and their interactions. However, achieving an intuitive understanding of the dynamical properties and responses to stimuli of these networks is hampered by their large scale and complexity. To address this issue, analyses of regulatory networks often focus on reduced models that depict distinct, reoccurring connectivity patterns referred to as motifs. Previous modeling studies have begun to characterize the dynamics of small motifs, and to describe ways in which variations in parameters affect their responses to stimuli. The present study investigates how variations in pairs of parameters affect responses in a series of ten common network motifs, identifying concurrent variations that act synergistically (or antagonistically) to alter the responses of the motifs to stimuli. Synergism (or antagonism) was quantified using degrees of nonlinear blending and additive synergism. Simulations identified concurrent variations that maximized synergism, and examined the ways in which it was affected by stimulus protocols and the architecture of a motif. Only a subset of architectures exhibited synergism following paired changes in parameters. The approach was then applied to a model describing interlocked feedback loops governing the synthesis of the CREB1 and CREB2 transcription factors. The effects of motifs on synergism for this biologically realistic model were consistent with those for the abstract models of single motifs. These results have implications for the rational design of combination drug therapies with the potential for synergistic interactions.

  20. Analysis of Characteristics of Lateral Stability of Sailplane Lak 17a by Computational Method

    Directory of Open Access Journals (Sweden)

    Paulius Gildutis

    2011-04-01

    Full Text Available A computer-based geometrical model of the Lak-17a sailplane was generated with the program AVL (Athena Vortex Lattice), which is designed to analyse the characteristics of flight and to provide a rapid analysis of the configuration of an aircraft. Various characteristics of stability and control were calculated by simulating a real flight with the program. According to the results, a conclusion was formulated and suggestions about how to improve the stability and control of the aircraft were offered. Article in Lithuanian

  1. Analyses of musculoskeletal interactions in humans by quantitative computed tomography (QCT)

    International Nuclear Information System (INIS)

    Capiglioni, Ricardo; Cointry, Gustavo; Capozza, Ricardo; Gimenez, Carlos; Ferretti, Jose L.

    2001-01-01

    Bone and muscle cross-sectional properties were assessed by QCT at the L3 spinal level in normal women and men (n=93/5) aged 32-74 years and compared with the kyphosis angle (Ka) determined between T4 and T12 in lateral Rx's. The volumetric mineral density (vBMD) of trabecular bone, the bone mineral content (BMC) of the vertebral bodies and the fat-free areas of the perispinal muscle (FFMA) varied in line and correlated negatively with the Ka. Multiple regression analyses showed that the trabecular vBMD and total BMC were the most significant independent determinants of the Ka, and that the FFMA and time since menopause were the only independent determinants of the bone properties, with no influence of gender, age or anthropometric factors. (author)

  2. Clinical diagnosis and computer analysis of headache symptoms.

    OpenAIRE

    Drummond, P D; Lance, J W

    1984-01-01

    The headache histories obtained from clinical interviews of 600 patients were analysed by computer to see whether patients could be separated systematically into clinical categories and to see whether sets of symptoms commonly reported together differed in distribution among the categories. The computer classification procedure assigned 537 patients to the same category as their clinical diagnosis, the majority of discrepancies between clinical and computer classifications involving common mi...

  3. Hypercard: Another Computer Tool.

    Science.gov (United States)

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  4. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and if a LOD score is < or = - 2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.
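
    A heavily simplified sketch of the exclusion logic described above: compare the log-likelihood of the sample under the candidate-gene model with a specified effect size against the no-effect model, and exclude effects at least that large when the LOD falls at or below -2. The likelihood functions themselves depend on the genetic model and data in the article, so everything below is illustrative.

    ```python
    import math

    def exclusion_lod(loglik_specified_effect, loglik_no_effect):
        """LOD score comparing a specified genetic effect against the no-effect model.

        Both arguments are natural-log likelihoods of the random population sample.
        """
        return (loglik_specified_effect - loglik_no_effect) / math.log(10.0)

    def can_exclude(lod, threshold=-2.0):
        """True if effects of at least the specified size can be excluded."""
        return lod <= threshold
    ```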

  5. Pedunculated Pulmonary Artery Sarcoma Suggested by Transthoracic Echocardiography.

    Science.gov (United States)

    Wang, Xiaobing; Ren, Weidong; Yang, Jun

    2016-04-01

    Pulmonary artery sarcoma (PAS) is an extremely rare malignancy. It is usually found after it grows large enough to occupy almost the entire lumen of the pulmonary artery and causes serious clinical symptoms. Thus, it is usually difficult to distinguish PAS from pulmonary thromboembolism based on imaging examinations. Few case reports have shown the attachment of PAS to the pulmonary artery, a key characteristic for the diagnosis and differential diagnosis of PAS. In this case, we found a PAS that did not cause local obstruction, together with tumor emboli that obstructed the branches of the pulmonary arteries and caused pulmonary hypertension and clinical symptoms. Transthoracic echocardiography (TTE) revealed a part of the tumor attached to the intima of the main pulmonary artery by a peduncle and showing obvious mobility, which was suggestive of PAS and differentiated it from pulmonary thromboembolism. To our knowledge, this is the first case report of a pedunculated PAS suggested by TTE. Combined with pulmonary artery computed tomography angiography, the diagnosis of PAS was strongly suggested before the operation. This case indicates that TTE can reveal the attachment and mobility of PAS in the main pulmonary artery and may provide useful information for the diagnosis and differential diagnosis of PAS, especially a pedunculated PAS. © 2015, Wiley Periodicals, Inc.

  6. Development of the evaluation methods in reactor safety analyses and core characteristics

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    In order to support the safety reviews by the NRA of reactor safety design, including phenomena with multiple failures, computer codes are being developed and safety evaluations with analyses are performed in the areas of thermal hydraulics and core characteristics evaluation. In the code preparation for safety analyses, the TRACE and RELAP5 codes were prepared to conduct safety analyses of LOCA and beyond-design-basis accidents with multiple failures. In the core physics code preparation, functions for sensitivity and uncertainty analysis were incorporated in the lattice physics code CASMO-4. The verification of the improved CASMO-4/SIMULATE-3 was continued by using core physics data. (author)

  7. Investigating ASME allowable loads with finite element analyses

    International Nuclear Information System (INIS)

    Mattar Neto, Miguel; Bezerra, Luciano M.; Miranda, Carlos A. de J.; Cruz, Julio R.B.

    1995-01-01

    The evaluation of nuclear components using finite element analysis (FEA) does not generally fall into the shell-type verification adopted by the ASME Code. Consequently, demonstrating that the modes of failure are avoided is sometimes not straightforward. Allowable limits, developed from limit load theory, require the computation of shell membrane and bending stresses. How to calculate these stresses from FEA is not necessarily self-evident. One approach to be considered is to develop recommendations on a case-by-case basis for the most common pressure vessel geometries and loads, based on comparisons between the results of elastic and plastic FEA. In this paper, FE analyses of common 2D and complex 3D geometries are examined and discussed. It will be clear that, in the cases studied, stress separation and categorization are not self-evident or simple tasks to undertake. Certain unclear recommendations of the ASME Code can lead the stress analyst to non-conservative designs, as will be demonstrated in this paper. At the end of this paper, taking into account comparisons between elastic and elastic-plastic FE results from ANSYS, some observations, suggestions and conclusions about the degree of conservatism of the ASME recommendations are presented. (author)

  8. Analysing Music with Point-Set Compression Algorithms

    DEFF Research Database (Denmark)

    Meredith, David

    2016-01-01

    Several point-set pattern-discovery and compression algorithms designed for analysing music are reviewed and evaluated. Each algorithm takes as input a point-set representation of a score in which each note is represented as a point in pitch-time space. Each algorithm computes the maximal...... and sections in pieces of classical music. On the first task, the best-performing algorithms achieved success rates of around 84%. In the second task, the best algorithms achieved mean F1 scores of around 0.49, with scores for individual pieces rising as high as 0.71....
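
    A minimal sketch of the point-set representation these algorithms share: each note becomes an (onset, pitch) point, and SIA-style analysis collects, for every translation vector, the maximal translatable pattern (MTP) of points that can be shifted by that vector onto other points in the set. The note data below are made up, and the further compression step into translational equivalence classes is not shown.

```python
from collections import defaultdict

# Toy pitch-time point set: each note as (onset_time, chromatic_pitch).
# Assumed data, not an actual score encoding.
notes = {(0, 60), (1, 64), (2, 67), (4, 62), (5, 66), (6, 69)}

def maximal_translatable_patterns(points):
    """Group points by the translation vectors that map them onto other
    points in the set; each group is a maximal translatable pattern (MTP)."""
    mtps = defaultdict(set)
    for p in points:
        for q in points:
            if p != q:
                v = (q[0] - p[0], q[1] - p[1])   # vector from p to q
                mtps[v].add(p)                   # p is translatable by v
    return mtps

for vector, pattern in sorted(maximal_translatable_patterns(notes).items()):
    if len(pattern) > 1:                         # keep repeated patterns only
        print(vector, sorted(pattern))
```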

  9. An application of the 'Bayesian cohort model' to nuclear power plant cost analyses

    International Nuclear Information System (INIS)

    Ono, Kenji; Nakamura, Takashi

    2002-01-01

    We have developed a new method for identifying the effects of calendar year, plant age and commercial operation starting year on the costs and performances of nuclear power plants and also developed an analysis system running on personal computers. The method extends the Bayesian cohort model for time series social survey data proposed by one of the authors. The proposed method was shown to be able to separate the above three effects more properly than traditional methods such as taking simple means by time domain. The analyses of US nuclear plant cost and performance data by using the proposed method suggest that many of the US plants spent relatively long time and much capital cost for modification at their age of about 10 to 20 years, but that, after those ages, they performed fairly well with lower and stabilized O and M and additional capital costs. (author)

  10. Heads in the Cloud: A Primer on Neuroimaging Applications of High Performance Computing.

    Science.gov (United States)

    Shatil, Anwar S; Younas, Sohail; Pourreza, Hossein; Figley, Chase R

    2015-01-01

    With larger data sets and more sophisticated analyses, it is becoming increasingly common for neuroimaging researchers to push (or exceed) the limitations of standalone computer workstations. Nonetheless, although high-performance computing platforms such as clusters, grids and clouds are already in routine use by a small handful of neuroimaging researchers to increase their storage and/or computational power, the adoption of such resources by the broader neuroimaging community remains relatively uncommon. Therefore, the goal of the current manuscript is to: 1) inform prospective users about the similarities and differences between computing clusters, grids and clouds; 2) highlight their main advantages; 3) discuss when it may (and may not) be advisable to use them; 4) review some of their potential problems and barriers to access; and finally 5) give a few practical suggestions for how interested new users can start analyzing their neuroimaging data using cloud resources. Although the aim of cloud computing is to hide most of the complexity of the infrastructure management from end-users, we recognize that this can still be an intimidating area for cognitive neuroscientists, psychologists, neurologists, radiologists, and other neuroimaging researchers lacking a strong computational background. Therefore, with this in mind, we have aimed to provide a basic introduction to cloud computing in general (including some of the basic terminology, computer architectures, infrastructure and service models, etc.), a practical overview of the benefits and drawbacks, and a specific focus on how cloud resources can be used for various neuroimaging applications.

  11. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn’ in computing provides the base for place-specific computing as a suggested new genre of interaction design. In the REcult project place-specific computing is explored through design oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments… for place-specific computing as a genre of interaction design are described. A total number of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented indicating potentials, possibilities and problems as directions for future…

  12. Scalable optical quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Manykin, E A; Mel' nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre ' Kurchatov Institute' , Moscow (Russian Federation)

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr{sup 3+}, regularly located in the lattice of the orthosilicate (Y{sub 2}SiO{sub 5}) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  13. Analyses and tests for the baking system of the RFX vacuum vessel by eddy currents

    Energy Technology Data Exchange (ETDEWEB)

    Collarin, P. [Gruppo di Padova per Ricerche sulla Fusione, Univ. di Padova (Italy); Sonato, P. [Gruppo di Padova per Ricerche sulla Fusione, Univ. di Padova (Italy); Zaccaria, P. [Gruppo di Padova per Ricerche sulla Fusione, Univ. di Padova (Italy); Zollino, G. [Gruppo di Padova per Ricerche sulla Fusione, Univ. di Padova (Italy)

    1995-12-31

    The electrical, thermal and mechanical analyses carried out for the design of a new baking system for RFX by eddy currents are presented. The results of an experimental test on RFX with low heating power are reported as well. They gave confidence in the numerical analyses, so the working conditions with the nominal heating power were then computed. (orig.).

  14. Analyses and tests for the baking system of the RFX vacuum vessel by eddy currents

    International Nuclear Information System (INIS)

    Collarin, P.; Sonato, P.; Zaccaria, P.; Zollino, G.

    1995-01-01

    The electrical, thermal and mechanical analyses carried out for the design of a new baking system for RFX by eddy currents are presented. The results of an experimental test on RFX with low heating power are reported as well. They gave confidence in the numerical analyses, so the working conditions with the nominal heating power were then computed. (orig.)

  15. Maximum likelihood as a common computational framework in tomotherapy

    International Nuclear Information System (INIS)

    Olivera, G.H.; Shepard, D.M.; Reckwerdt, P.J.; Ruchala, K.; Zachman, J.; Fitchard, E.E.; Mackie, T.R.

    1998-01-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. (author)
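
    As a minimal illustration of the kind of computation the abstract refers to, the sketch below implements the standard maximum-likelihood expectation-maximization (MLEM) multiplicative update used in emission tomography; the tiny system matrix and noise-free data are assumptions for demonstration, not the tomotherapy implementation.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Maximum-likelihood EM (MLEM) reconstruction: estimate x >= 0 from
    measurements y ~ Poisson(A @ x) using the classic multiplicative update."""
    x = np.ones(A.shape[1])                 # flat initial estimate
    sensitivity = A.sum(axis=0)             # A^T 1, per-voxel normalisation
    for _ in range(n_iter):
        forward = A @ x                     # predicted measurements
        ratio = y / np.maximum(forward, 1e-12)
        x = x / np.maximum(sensitivity, 1e-12) * (A.T @ ratio)
    return x

# Tiny assumed example: 3 "detectors", 2 "voxels".
A = np.array([[1.0, 0.0],
              [0.5, 0.5],
              [0.0, 1.0]])
x_true = np.array([4.0, 2.0])
y = A @ x_true                              # noise-free data for illustration
print(mlem(A, y).round(3))                  # converges towards x_true
```

    Each iteration forward-projects the current estimate, compares it with the measurements, and rescales every voxel by the back-projected ratio, which preserves non-negativity and increases the Poisson likelihood.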

  16. GPU-computing in econophysics and statistical physics

    Science.gov (United States)

    Preis, T.

    2011-03-01

    A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction into the field of GPU computing and includes examples. In particular computationally expensive analyses employed in financial market context are coded on a graphics card architecture which leads to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics - the Ising model - is ported to a graphics card architecture as well, resulting in large speedup values.
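
    For readers unfamiliar with the Ising benchmark mentioned above, a plain NumPy (CPU) sketch of a checkerboard Metropolis update is given below; it illustrates only the model, not the GPU port, and the lattice size, temperature and sweep count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_sweep(spins, beta):
    """One checkerboard Metropolis sweep of the 2D Ising model (J = 1).
    The two sublattices are updated separately so that no two neighbouring
    spins are flipped in the same vectorised step."""
    L = spins.shape[0]
    ii, jj = np.meshgrid(np.arange(L), np.arange(L), indexing="ij")
    for parity in (0, 1):
        mask = (ii + jj) % 2 == parity
        neighbours = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
                      np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
        dE = 2.0 * spins * neighbours                  # energy cost of a flip
        accept = (rng.random(spins.shape) < np.exp(-beta * dE)) & mask
        spins = np.where(accept, -spins, spins)
    return spins

L, beta = 64, 0.5                                      # below T_c (beta_c ~ 0.44)
spins = rng.choice([-1, 1], size=(L, L))
for _ in range(200):
    spins = metropolis_sweep(spins, beta)
print("magnetisation per spin:", abs(spins.mean()))
```

    On a GPU, essentially the same checkerboard update is assigned to one thread per lattice site, which is what makes the large speedups reported in the article possible.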

  17. Emergent computation a festschrift for Selim G. Akl

    CERN Document Server

    2017-01-01

    This book is dedicated to Professor Selim G. Akl to honour his groundbreaking research achievements in computer science over four decades. The book is an intellectually stimulating excursion into emergent computing paradigms, architectures and implementations. World top experts in computer science, engineering and mathematics overview exciting and intriguing topics of musical rhythms generation algorithms, analyse the computational power of random walks, dispelling a myth of computational universality, computability and complexity at the microscopic level of synchronous computation, descriptional complexity of error detection, quantum cryptography, context-free parallel communicating grammar systems, fault tolerance of hypercubes, finite automata theory of bulk-synchronous parallel computing, dealing with silent data corruptions in high-performance computing, parallel sorting on graphics processing units, mining for functional dependencies in relational databases, cellular automata optimisation of wireless se...

  18. The neural processing of voluntary completed, real and virtual violent and nonviolent computer game scenarios displaying predefined actions in gamers and nongamers.

    Science.gov (United States)

    Regenbogen, Christina; Herrmann, Manfred; Fehr, Thorsten

    2010-01-01

    Studies investigating the effects of violent computer and video game playing have resulted in heterogeneous outcomes. It has been assumed that there is a decreased ability to differentiate between virtuality and reality in people that play these games intensively. FMRI data of a group of young males with (gamers) and without (controls) a history of long-term violent computer game playing experience were obtained during the presentation of computer game and realistic video sequences. In gamers the processing of real violence in contrast to nonviolence produced activation clusters in right inferior frontal, left lingual and superior temporal brain regions. Virtual violence activated a network comprising bilateral inferior frontal, occipital, postcentral, right middle temporal, and left fusiform regions. Control participants showed extended left frontal, insula and superior frontal activations during the processing of real, and posterior activations during the processing of virtual violent scenarios. The data suggest that the ability to differentiate automatically between real and virtual violence has not been diminished by a long-term history of violent video game play, nor have gamers' neural responses to real violence in particular been subject to desensitization processes. However, analyses of individual data indicated that group-related analyses reflect only a small part of actual individual different neural network involvement, suggesting that the consideration of individual learning history is sufficient for the present discussion.

  19. Kaizen practice in healthcare: a qualitative analysis of hospital employees' suggestions for improvement

    OpenAIRE

    Mazzocato, Pamela; Stenfors-Hayes, Terese; von Thiele Schwarz, Ulrica; Hasson, Henna; Nystr?m, Monica Elisabeth

    2016-01-01

    OBJECTIVES: Kaizen, or continuous improvement, lies at the core of lean. Kaizen is implemented through practices that enable employees to propose ideas for improvement and solve problems. The aim of this study is to describe the types of issues and improvement suggestions that hospital employees feel empowered to address through kaizen practices in order to understand when and how kaizen is used in healthcare. METHODS: We analysed 186 structured kaizen documents containing improvement suggest...

  20. VIPRE modeling of VVER-1000 reactor core for DNB analyses

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Y.; Nguyen, Q. [Westinghouse Electric Corporation, Pittsburgh, PA (United States); Cizek, J. [Nuclear Research Institute, Prague, (Czech Republic)

    1995-09-01

    Based on the one-pass modeling approach, the hot channels and the VVER-1000 reactor core can be modeled in 30 channels for DNB analyses using the VIPRE-01/MOD02 (VIPRE) code (VIPRE is owned by Electric Power Research Institute, Palo Alto, California). The VIPRE one-pass model does not compromise any accuracy in the hot channel local fluid conditions. Extensive qualifications include sensitivity studies of radial noding and crossflow parameters and comparisons with the results from THINC and CALOPEA subchannel codes. The qualifications confirm that the VIPRE code with the Westinghouse modeling method provides good computational performance and accuracy for VVER-1000 DNB analyses.

  1. Multi Scale Finite Element Analyses By Using SEM-EBSD Crystallographic Modeling and Parallel Computing

    International Nuclear Information System (INIS)

    Nakamachi, Eiji

    2005-01-01

    A crystallographic homogenization procedure is introduced into the conventional static-explicit and dynamic-explicit finite element formulations to develop a multi-scale - double-scale - analysis code to predict the plastic-strain-induced texture evolution, yield loci and formability of sheet metal. The double-scale structure consists of a crystal aggregation - the micro-structure - and a macroscopic elastic-plastic continuum. First, we measure crystal morphologies using an SEM-EBSD apparatus and define a unit cell of the micro-structure that satisfies the periodicity condition at the real scale of the polycrystal. Next, this crystallographic homogenization FE code is applied to 3N pure-iron and 'Benchmark' aluminum A6022 polycrystal sheets. It reveals that the initial crystal orientation distribution - the texture - strongly affects the plastic-strain-induced texture evolution, the anisotropic hardening evolution and the sheet deformation. Since the multi-scale finite element analysis requires a large computation time, a parallel computing technique using a PC cluster is developed for quick calculation. In this parallelization scheme, a dynamic workload balancing technique is introduced for quick and efficient calculations

  2. Risk assessment of computer-controlled safety systems for fusion reactors

    International Nuclear Information System (INIS)

    Fryer, M.O.; Bruske, S.Z.

    1983-01-01

    The complexity of fusion reactor systems and the need to display, analyze, and react promptly to large amounts of information during reactor operation will require a number of safety systems in the fusion facilities to be computer controlled. Computer software, therefore, must be included in the reactor safety analyses. Unfortunately, the science of integrating computer software into safety analyses is in its infancy. Combined plant hardware and computer software systems are often treated by making simple assumptions about software performance. This method is not acceptable for assessing risks in the complex fusion systems, and a new technique for risk assessment of combined plant hardware and computer software systems has been developed. This technique is an extension of the traditional fault tree analysis and uses structured flow charts of the software in a manner analogous to wiring or piping diagrams of hardware. The software logic determines the form of much of the fault trees

  3. 3D DEM analyses of the 1963 Vajont rock slide

    Science.gov (United States)

    Boon, Chia Weng; Houlsby, Guy; Utili, Stefano

    2013-04-01

    The 1963 Vajont rock slide has been modelled using the distinct element method (DEM). The open-source DEM code, YADE (Kozicki & Donzé, 2008), was used together with the contact detection algorithm proposed by Boon et al. (2012). The critical sliding friction angle at the slide surface was sought using a strength reduction approach. A shear-softening contact model was used to model the shear resistance of the clayey layer at the slide surface. The results suggest that the critical sliding friction angle can be conservative if stability analyses are calculated based on the peak friction angles. The water table was assumed to be horizontal and the pore pressure at the clay layer was assumed to be hydrostatic. The influence of reservoir filling was marginal, increasing the sliding friction angle by only 1.6˚. The results of the DEM calculations were found to be sensitive to the orientations of the bedding planes and cross-joints. Finally, the failure mechanism was investigated and arching was found to be present at the bend of the chair-shaped slope. References Boon C.W., Houlsby G.T., Utili S. (2012). A new algorithm for contact detection between convex polygonal and polyhedral particles in the discrete element method. Computers and Geotechnics, vol 44, 73-82, doi.org/10.1016/j.compgeo.2012.03.012. Kozicki, J., & Donzé, F. V. (2008). A new open-source software developed for numerical simulations using discrete modeling methods. Computer Methods in Applied Mechanics and Engineering, 197(49-50), 4429-4443.
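
    The strength-reduction search mentioned above can be summarised, very schematically, as a bisection on the friction angle of the sliding surface. The sketch below is generic and hypothetical: `run_slope_model` is a placeholder for a full DEM (e.g. YADE) run and is replaced here by a trivial stand-in, so the code illustrates only the search logic, not the Vajont model.

```python
import math

def strength_reduction_search(run_slope_model, phi_peak_deg, tol=0.01):
    """Bracket the critical sliding friction angle by a strength-reduction
    (bisection) search. `run_slope_model(phi_deg)` stands in for a full DEM
    run and returns True if the slope is stable at that friction angle."""
    lo, hi = 0.0, phi_peak_deg          # assume failure at 0 deg, stability at peak
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if run_slope_model(mid):
            hi = mid                    # still stable: the critical angle is lower
        else:
            lo = mid                    # failed: the critical angle is above mid
    return 0.5 * (lo + hi)

# Placeholder "model": stable whenever tan(phi) exceeds an assumed demand.
demand = math.tan(math.radians(17.5))
stable = lambda phi: math.tan(math.radians(phi)) >= demand
print(f"critical friction angle ~ {strength_reduction_search(stable, 40.0):.2f} deg")
```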

  4. Ordinateur et communication (Computer and Communication).

    Science.gov (United States)

    Mangenot, Francois

    1994-01-01

    Because use of computers in second-language classrooms may tend to decrease interpersonal interaction, and therefore communication, ways to promote interaction are offered. These include small group computer projects, and suggestions are made for use with various computer functions and features: tutorials, word processing, voice recording,…

  5. Computation at the edge of chaos: Phase transition and emergent computation

    International Nuclear Information System (INIS)

    Langton, C.

    1990-01-01

    In order for computation to emerge spontaneously and become an important factor in the dynamics of a system, the material substrate must support the primitive functions required for computation: the transmission, storage, and modification of information. Under what conditions might we expect physical systems to support such computational primitives? This paper presents research on Cellular Automata which suggests that the optimal conditions for the support of information transmission, storage, and modification, are achieved in the vicinity of a phase transition. We observe surprising similarities between the behaviors of computations and systems near phase-transitions, finding analogs of computational complexity classes and the Halting problem within the phenomenology of phase-transitions. We conclude that there is a fundamental connection between computation and phase-transitions, and discuss some of the implications for our understanding of nature if such a connection is borne out. 31 refs., 16 figs
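
    The 'vicinity of a phase transition' in cellular automata is often located with Langton's lambda parameter, the fraction of rule-table entries that map to a non-quiescent state. The toy one-dimensional, two-state automaton below only illustrates that bookkeeping; the way rules are sampled and the lattice size are arbitrary choices, not those of the paper.

```python
import random

K, N = 2, 3                      # binary states, 3-cell neighbourhood (elementary CA)
QUIESCENT = 0

def lambda_parameter(rule_table):
    """Langton's lambda: fraction of neighbourhood configurations that map
    to a non-quiescent state."""
    non_quiescent = sum(1 for out in rule_table.values() if out != QUIESCENT)
    return non_quiescent / len(rule_table)

def random_rule(lam, rng=random):
    """Build a random rule table with (approximately) the requested lambda."""
    neighbourhoods = [(a, b, c) for a in range(K) for b in range(K) for c in range(K)]
    return {nb: (rng.choice(range(1, K)) if rng.random() < lam else QUIESCENT)
            for nb in neighbourhoods}

def step(cells, rule_table):
    """One synchronous update of a 1D CA with periodic boundaries."""
    n = len(cells)
    return [rule_table[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

rule = random_rule(0.45)
cells = [random.randint(0, K - 1) for _ in range(80)]
for _ in range(40):
    cells = step(cells, rule)
print("lambda =", lambda_parameter(rule), " live cells:", sum(c != 0 for c in cells))
```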

  6. Employee Resistance to Computer Technology.

    Science.gov (United States)

    Ewert, Alan

    1984-01-01

    The introduction of computers to the work place may cause employee stress. Aggressive, protective, and avoidance behaviors are forms of staff resistance. The development of good training programs will enhance productivity. Suggestions for evaluating computer systems are offered. (DF)

  7. Gamma spectrometric analyses of environmental samples at PINSTECH

    International Nuclear Information System (INIS)

    Faruq, M.U.; Parveen, N.; Ahmed, B.; Aziz, A.

    1979-01-01

    Gamma spectrometric analyses of air and other environmental samples from PINSTECH were carried out. Air particulate samples were analyzed by a Ge(Li) detector on a computer-based multichannel analyzer. Other environmental samples were analyzed by a NaI(Tl) scintillation detector spectrometer and a multichannel analyzer with manual analysis. Concentrations of radionuclides in the media were determined and the sources of their production were identified. The age of the fallout was estimated from the ratios of the fission products. (authors)

  8. Open to Suggestion.

    Science.gov (United States)

    Journal of Reading, 1987

    1987-01-01

    Offers (1) suggestions for improving college students' study skills; (2) a system for keeping track of parent, teacher, and community contacts; (3) suggestions for motivating students using tic tac toe; (4) suggestions for using etymology to improve word retention; (5) a word search grid; and (6) suggestions for using postcards in remedial reading…

  9. Conceptual metaphors in computer networking terminology ...

    African Journals Online (AJOL)

    Conceptual metaphor theory (Lakoff & Johnson, 1980) is used as a basic framework for analysing and explaining the occurrence of metaphor in the terminology used by computer networking professionals in the information technology (IT) industry. An analysis of linguistic ...

  10. A study of computer-related upper limb discomfort and computer vision syndrome.

    Science.gov (United States)

    Sen, A; Richardson, Stanley

    2007-12-01

    Personal computers are one of the commonest office tools in Malaysia today. Their usage, even for three hours per day, leads to a health risk of developing Occupational Overuse Syndrome (OOS), Computer Vision Syndrome (CVS), low back pain, tension headaches and psychosocial stress. The study was conducted to investigate how a multiethnic society in Malaysia is coping with these problems, which are increasing at a phenomenal rate in the west. This study investigated computer usage, awareness of ergonomic modifications of computer furniture and peripherals, symptoms of CVS and risk of developing OOS. A cross-sectional questionnaire study of 136 computer users was conducted on a sample population of university students and office staff. A 'Modified Rapid Upper Limb Assessment (RULA) for office work' technique was used for evaluation of OOS. The prevalence of CVS was surveyed incorporating a 10-point scoring system for each of its various symptoms. It was found that many were using a standard keyboard and mouse without any ergonomic modifications. Around 50% of those with some low back pain did not have an adjustable backrest. Many users had higher RULA scores for the wrist and neck, suggesting an increased risk of developing OOS and a need for further intervention. Many (64%) were using refractive corrections and still had high CVS scores, commonly including eye fatigue, headache and burning sensation. CVS scores (reflecting more subjective symptoms) increased with longer spells of computer usage. It was concluded that further onsite studies are needed to follow up this survey and to decrease the risks of developing CVS and OOS amongst young computer users.

  11. Computer Self-Efficacy: A Practical Indicator of Student Computer Competency in Introductory IS Courses

    Directory of Open Access Journals (Sweden)

    Rex Karsten

    1998-01-01

    Full Text Available Students often receive their first college-level computer training in introductory information systems courses. Students and faculty frequently expect this training to develop a level of student computer competence that will support computer use in future courses. In this study, we applied measures of computer self-efficacy to students in a typical introductory IS course. The measures provided useful evidence that student perceptions of their ability to use computers effectively in the future significantly improved as a result of their training experience. The computer self-efficacy measures also provided enhanced insight into course-related factors of practical concern to IS educators. Study results also suggest computer self-efficacy measures may be a practical and informative means of assessing computer-training outcomes in the introductory IS course context

  12. Professional Problems Experienced by Information Technology Teachers and Suggested Solutions: Longitudinal Survey

    Directory of Open Access Journals (Sweden)

    Hafize Keser

    2013-02-01

    Full Text Available The study aimed to determine the opinions of teacher candidates in the fourth year of the Computer Education & Instructional Technologies (CEIT) department on the problems experienced by Information Technology (IT) teachers and suggested solutions. It was designed as a case study within the qualitative research tradition, following a longitudinal survey model. The final-year IT teacher candidates studying in the CEIT department of the Faculty of Educational Sciences at Ankara University in the 2011-2012 and 2012-2013 academic years formed the study group of the research (N=123). The data, obtained through an open-ended questionnaire, were analysed and interpreted using inductive coding, frequency analysis and descriptive content analysis. The study found that the IT teacher candidates addressed the problems experienced by IT teachers in two sub-dimensions - problems concerning the courses IT teachers teach and problems concerning their professional lives - and suggested solutions in line with these. The leading problems concerning the courses are that the courses are elective and not graded, course hours are few, the significance of IT is not well understood by executives, teachers, parents and students, and the physical facilities of IT classes and course resources are inadequate. The main problems concerning the professional lives of IT teachers are that their duties, authority and responsibilities are not made sufficiently clear, difficulties in formative teacher practice, the course hours that IT teachers have to complete not being attainable, the lack of permanent staff positions, courses that should be taught by IT teachers being taught by teachers from other branches, and the lack of trained executives and experts to supervise IT and formative teachers. And, the suggested leading

  13. Electromagnetic Compatibility Design of the Computer Circuits

    Science.gov (United States)

    Zitai, Hong

    2018-02-01

    Computers and the Internet have gradually penetrated every aspect of people's daily work. With the growing sophistication of electronic equipment and electrical systems, however, the electromagnetic environment has become much more complex, and electromagnetic interference has become an important factor hindering the normal operation of electronic equipment. In order to analyse the electromagnetic compatibility of computer circuits, this paper starts from computer electromagnetics and the concept of electromagnetic compatibility. Then, through an analysis of the main electromagnetic compatibility problems of computer circuits and systems, approaches for designing computer circuits for electromagnetic compatibility are discussed. Finally, the basic contents and methods of EMC testing are presented in order to ensure the electromagnetic compatibility of equipment.

  14. Cluster analysis of Helicobacter pylori genomic DNA fingerprints suggests gastroduodenal disease-specific associations.

    Science.gov (United States)

    Go, M F; Chan, K Y; Versalovic, J; Koeuth, T; Graham, D Y; Lupski, J R

    1995-07-01

    Helicobacter pylori infection is now accepted as the most common cause of chronic active gastritis and peptic ulcer disease. The etiologies of many infectious diseases have been attributed to specific or clonal strains of bacterial pathogens. Polymerase chain reaction (PCR) amplification of DNA between repetitive DNA sequences, REP elements (REP-PCR), has been utilized to generate DNA fingerprints to examine similarity among strains within a bacterial species. Genomic DNA from H. pylori isolates obtained from 70 individuals (39 duodenal ulcers and 31 simple gastritis) was PCR-amplified using consensus probes to repetitive DNA elements. The H. pylori DNA fingerprints were analyzed for similarity and correlated with disease presentation using the NTSYS-pc computer program. Each H. pylori strain had a distinct DNA fingerprint except for two pairs. Single-colony DNA fingerprints of H. pylori from the same patient were identical, suggesting that each patient harbors a single strain. Computer-assisted cluster analysis of the REP-PCR DNA fingerprints showed two large clusters of isolates, one associated with simple gastritis and the other with duodenal ulcer disease. Cluster analysis of REP-PCR DNA fingerprints of H. pylori strains suggests that duodenal ulcer isolates, as a group, are more similar to one another and different from gastritis isolates. These results suggest that disease-specific strains may exist.
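
    A rough modern analogue of the fingerprint clustering described above can be sketched with SciPy: score each isolate's REP-PCR bands as present or absent, compute pairwise band-sharing dissimilarities, and build a UPGMA-style tree. The band matrix below is invented for illustration, and the NTSYS-pc analysis itself is not reproduced.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical band-presence matrix: rows = isolates, columns = REP-PCR band
# positions (True = band present). Real fingerprints would be scored from gel
# images; these values are made up for illustration.
fingerprints = np.array([
    [1, 1, 0, 1, 0, 1, 0, 0],   # isolate 1 (e.g. duodenal ulcer)
    [1, 1, 0, 1, 1, 1, 0, 0],   # isolate 2
    [1, 0, 0, 1, 0, 1, 0, 1],   # isolate 3
    [0, 1, 1, 0, 1, 0, 1, 1],   # isolate 4 (e.g. gastritis)
    [0, 1, 1, 0, 1, 0, 1, 0],   # isolate 5
    [0, 0, 1, 0, 1, 1, 1, 1],   # isolate 6
], dtype=bool)

distances = pdist(fingerprints, metric="jaccard")      # band-sharing dissimilarity
tree = linkage(distances, method="average")            # UPGMA-style clustering
clusters = fcluster(tree, t=2, criterion="maxclust")   # cut the tree into 2 clusters
print("cluster assignment per isolate:", clusters)
```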

  15. Analysing Simple Electric Motors in the Classroom

    Science.gov (United States)

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  16. Consumer hypnotic-like suggestibility: possible mechanism in compulsive purchasing.

    Science.gov (United States)

    Prete, M Irene; Guido, Gianluigi; Pichierri, Marco

    2013-08-01

    The authors hypothesize a concept, Consumer Hypnotic-Like Suggestibility (CHLS), defined as an altered state of consciousness, as a state causing a tendency to respond positively to messages aimed at inducing consumers to make unplanned purchases. This study aims to investigate the associations of CHLS with interpersonal variables and compulsive purchasing--a frequent and uncontrollable preoccupation with buying or impulses to buy. A study was conducted on a sample of 232 subjects (n = 111 men; M age = 41 yr.), through the administration of a questionnaire, which measured: CHLS, compulsive purchasing, consumer susceptibility to interpersonal influence (the necessity to enhance one's image in the opinion of others through the consumption of products), and consumer atmospherics, i.e., environmental stimuli known to influence purchasing decisions. Modeling and mediation analyses suggested that internal and external drivers--Consumer Susceptibility to Interpersonal Influence and atmospherics--are positively related to CHLS which affects compulsive purchasing.

  17. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    International Nuclear Information System (INIS)

    Moore, Kevin L.; Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy

  18. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California 92093 (United States); Kagadis, George C. [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 26504 (Greece); McNutt, Todd R. [Department of Radiation Oncology and Molecular Radiation Science, School of Medicine, Johns Hopkins University, Baltimore, Maryland 21231 (United States); Mutic, Sasa [Department of Radiation Oncology, Washington University in St. Louis, St. Louis, Missouri 63110 (United States)

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  19. Vision 20/20: Automation and advanced computing in clinical radiation oncology.

    Science.gov (United States)

    Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  20. Applications of neural network to numerical analyses

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki; Fukuhara, Makoto; Ma, Xiao-Feng; Liaqat, Ali

    1999-01-01

    Applications of a multi-layer neural network to numerical analyses are described. We are mainly concerned with computed tomography and the solution of differential equations. In both cases, residuals of the integral equation or of the differential equations were employed as the objective functions for training the neural network. This differs from conventional neural network training, where the sum of squared errors of the output values is adopted as the objective function. For model problems both methods gave satisfactory results, and they are considered promising for certain kinds of problems. (author)
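
    A minimal sketch of the residual-as-objective idea, assuming PyTorch and a toy problem (u' = -u with u(0) = 1) rather than the authors' networks or equations:

```python
import torch

# Solve u'(x) = -u(x), u(0) = 1 on [0, 2] by minimising the residual of the
# differential equation, mirroring the residual-as-objective training idea.
torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
optimiser = torch.optim.Adam(net.parameters(), lr=1e-3)
x = torch.linspace(0.0, 2.0, 101).reshape(-1, 1).requires_grad_(True)

for step in range(3000):
    u = net(x)
    du_dx = torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u),
                                create_graph=True)[0]
    residual = du_dx + u                               # u' + u should vanish
    loss = (residual ** 2).mean() + (net(torch.zeros(1, 1)) - 1.0).pow(2).mean()
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()

with torch.no_grad():
    print(net(torch.tensor([[1.0]])).item(),
          "vs exact", torch.exp(torch.tensor(-1.0)).item())
```

    The loss penalises the differential-equation residual at collocation points plus the boundary condition, instead of the usual sum of squared output errors.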

  1. Angular analyses in relativistic quantum mechanics; Analyses angulaires en mecanique quantique relativiste

    Energy Technology Data Exchange (ETDEWEB)

    Moussa, P [Commissariat a l' Energie Atomique, 91 - Saclay (France). Centre d' Etudes Nucleaires

    1968-06-01

    This work describes the angular analysis of reactions between particles with spin in a fully relativistic fashion. One-particle states are introduced, following Wigner's method, as representations of the inhomogeneous Lorentz group. In order to perform the angular analyses, the reduction of the product of two representations of the inhomogeneous Lorentz group is studied. Clebsch-Gordan coefficients are computed for the following couplings: l-s coupling, helicity coupling, multipolar coupling, and symmetric coupling for more than two particles. Massless and massive particles are handled simultaneously. On the way we construct spinorial amplitudes and free fields; we recall how to establish convergence theorems for angular expansions from analyticity hypotheses. Finally we substitute these hypotheses for the idea of 'potential radius', which gives at low energy the usual 'centrifugal barrier' factors. The presence of such factors had never been deduced from hypotheses compatible with relativistic invariance. (author)

  2. Good enough practices in scientific computing.

    Science.gov (United States)

    Wilson, Greg; Bryan, Jennifer; Cranston, Karen; Kitzes, Justin; Nederbragt, Lex; Teal, Tracy K

    2017-06-01

    Computers are now essential in all branches of science, but most researchers are never taught the equivalent of basic lab skills for research computing. As a result, data can get lost, analyses can take much longer than necessary, and researchers are limited in how effectively they can work with software and data. Computing workflows need to follow the same practices as lab projects and notebooks, with organized data, documented steps, and the project structured for reproducibility, but researchers new to computing often don't know where to start. This paper presents a set of good computing practices that every researcher can adopt, regardless of their current level of computational skill. These practices, which encompass data management, programming, collaborating with colleagues, organizing projects, tracking work, and writing manuscripts, are drawn from a wide variety of published sources from our daily lives and from our work with volunteer organizations that have delivered workshops to over 11,000 people since 2010.

  3. Use of cloud computing in biomedicine.

    Science.gov (United States)

    Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil

    2016-12-01

    Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of using ICT. The aim of this paper is to explore the concept of cloud computing and its use in the area of biomedicine. The authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layers, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing aspects of the cloud computing paradigm are analysed. Finally, the potential and current use of this technology in biomedical scientific research is discussed.

  4. Neutronic analyses and tools development efforts in the European DEMO programme

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, U., E-mail: ulrich.fischer@kit.edu [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Bachmann, C. [European Fusion Development Agreement (EFDA), Garching (Germany); Bienkowska, B. [Association IPPLM-Euratom, IPPLM Warsaw/INP Krakow (Poland); Catalan, J.P. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Drozdowicz, K.; Dworak, D. [Association IPPLM-Euratom, IPPLM Warsaw/INP Krakow (Poland); Leichtle, D. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Fusion for Energy (F4E), Barcelona (Spain); Lengar, I. [MESCS-JSI, Ljubljana (Slovenia); Jaboulay, J.-C. [CEA, DEN, Saclay, DM2S, SERMA, F-91191 Gif-sur-Yvette (France); Lu, L. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Moro, F. [Associazione ENEA-Euratom, ENEA Fusion Division, Frascati (Italy); Mota, F. [Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT), Madrid (Spain); Sanz, J. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Szieberth, M. [Budapest University of Technology and Economics (BME), Budapest (Hungary); Palermo, I. [Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT), Madrid (Spain); Pampin, R. [Fusion for Energy (F4E), Barcelona (Spain); Porton, M. [Euratom/CCFE Fusion Association, Culham Science Centre for Fusion Energy (CCFE), Culham (United Kingdom); Pereslavtsev, P. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Ogando, F. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Rovni, I. [Budapest University of Technology and Economics (BME), Budapest (Hungary); and others

    2014-10-15

    Highlights: •Evaluation of neutronic tools for application to DEMO nuclear analyses. •Generation of a DEMO model for nuclear analyses based on MC calculations. •Nuclear analyses of the DEMO reactor equipped with a HCLL-type blanket. -- Abstract: The European Fusion Development Agreement (EFDA) recently launched a programme on Power Plant Physics and Technology (PPPT) with the aim to develop a conceptual design of a fusion demonstration reactor (DEMO) addressing key technology and physics issues. A dedicated part of the PPPT programme is devoted to the neutronics which, among others, has to define and verify requirements and boundary conditions for the DEMO systems. The quality of the provided data depends on the capabilities and the reliability of the computational tools. Accordingly, the PPPT activities in the area of neutronics include both DEMO nuclear analyses and development efforts on neutronic tools including their verification and validation. This paper reports on first neutronics studies performed for DEMO, and on the evaluation and further development of neutronic tools.

  5. Neutronic analyses and tools development efforts in the European DEMO programme

    International Nuclear Information System (INIS)

    Fischer, U.; Bachmann, C.; Bienkowska, B.; Catalan, J.P.; Drozdowicz, K.; Dworak, D.; Leichtle, D.; Lengar, I.; Jaboulay, J.-C.; Lu, L.; Moro, F.; Mota, F.; Sanz, J.; Szieberth, M.; Palermo, I.; Pampin, R.; Porton, M.; Pereslavtsev, P.; Ogando, F.; Rovni, I.

    2014-01-01

    Highlights: •Evaluation of neutronic tools for application to DEMO nuclear analyses. •Generation of a DEMO model for nuclear analyses based on MC calculations. •Nuclear analyses of the DEMO reactor equipped with a HCLL-type blanket. -- Abstract: The European Fusion Development Agreement (EFDA) recently launched a programme on Power Plant Physics and Technology (PPPT) with the aim to develop a conceptual design of a fusion demonstration reactor (DEMO) addressing key technology and physics issues. A dedicated part of the PPPT programme is devoted to the neutronics which, among others, has to define and verify requirements and boundary conditions for the DEMO systems. The quality of the provided data depends on the capabilities and the reliability of the computational tools. Accordingly, the PPPT activities in the area of neutronics include both DEMO nuclear analyses and development efforts on neutronic tools including their verification and validation. This paper reports on first neutronics studies performed for DEMO, and on the evaluation and further development of neutronic tools

  6. Nature, computation and complexity

    International Nuclear Information System (INIS)

    Binder, P-M; Ellis, G F R

    2016-01-01

    The issue of whether the unfolding of events in the world can be considered a computation is explored in this paper. We come to different conclusions for inert and for living systems (‘no’ and ‘qualified yes’, respectively). We suggest that physical computation as we know it exists only as a tool of complex biological systems: us. (paper)

  7. Factor structure of suggestibility revisited: new evidence for direct and indirect suggestibility

    OpenAIRE

    Romuald Polczyk

    2016-01-01

    Background Yielding to suggestions can be viewed as a relatively stable individual trait, called suggestibility. It has been long proposed that there are two kinds of suggestible influence, and two kinds of suggestibility corresponding to them: direct and indirect. Direct suggestion involves overt unhidden influence, while indirect suggestion concerns influence that is hidden, and the participant does not know that the suggestibility is being measured. So far however, empirical evidence ...

  8. CFD analyses of steam and hydrogen distribution in a nuclear power plant

    International Nuclear Information System (INIS)

    Siccama, N.B.; Houkema, M.; Komen, E.M.J.

    2003-01-01

    A detailed three-dimensional Computational Fluid Dynamics (CFD) model of the containment of the nuclear power plant has been prepared in order to assess possible multidimensional phenomena. In a first code-to-code comparison step, the CFD model has been used to compute a reference accident scenario which has been analysed earlier with the lumped parameter code SPECTRA. The CFD results compare qualitatively well with the SPECTRA results. Subsequently, the actual steam jet from the primary system has been modelled in the CFD code in order to determine the hydrogen distribution for this realistically modelled source term. Based on the computed hydrogen distributions, it has been determined when use of lumped parameter codes is allowed and when use of CFD codes is required. (author)

  9. Analyse des erreurs dans les calculs sur ordinateurs Error Analysis in Computing

    Directory of Open Access Journals (Sweden)

    Vignes J.

    2006-11-01

    Full Text Available This paper describes a new method for evaluating the error in the results of an algorithm, the error being due to the limited-precision arithmetic of the machine. The basic idea underlying the method is that while in algebra a given algorithm provides a single result r, the same algorithm carried out on a computer provides a set R of numerical results that are all representative of the exact algebraic result r. The permutation-perturbation method described here can be used to obtain the elements of R. The perturbation acts on the data and results of each elementary operation, and the permutation acts on the order in which operations are carried out. A statistical analysis of the elements of R is performed to determine the error committed. In practice, 2 to 4 elements of R are sufficient for determining the error.
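
    A much-simplified sketch of the permutation-perturbation idea: rerun the computation a few times with the operand order shuffled and the data perturbed by about one unit in the last place, then estimate the number of trustworthy digits from the scatter of the results. Real implementations perturb every elementary operation, not just the inputs, so this is only an illustration of the principle.

```python
import math
import random

def estimate_significant_digits(algorithm, data, n_runs=3):
    """Simplified permutation-perturbation estimate: rerun the computation with
    the operand order shuffled and the data perturbed by roughly one unit in
    the last place, then compare the scattered results."""
    results = []
    for _ in range(n_runs):
        perturbed = [x * (1.0 + random.choice((-1.0, 1.0)) * 2.0 ** -52) for x in data]
        random.shuffle(perturbed)                   # permute the operation order
        results.append(algorithm(perturbed))
    mean = sum(results) / n_runs
    spread = max(abs(r - mean) for r in results)
    if spread == 0.0:
        return results, float("inf")
    return results, math.log10(abs(mean) / spread)  # ~ number of significant digits

# Ill-conditioned toy sum: heavy cancellation leaves few trustworthy digits.
data = [1e16, 3.14159, -1e16, 2.71828, 1.41421]
results, digits = estimate_significant_digits(sum, data, n_runs=3)
print(results, "estimated significant decimal digits ~", round(digits, 1))
```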

  10. Contracting for Computer Software in Standardized Computer Languages

    Science.gov (United States)

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  11. Computers in Schools: White Boys Only?

    Science.gov (United States)

    Hammett, Roberta F.

    1997-01-01

    Discusses the role of computers in today's world and the construction of computer use attitudes, such as gender gaps. Suggests how schools might close the gaps. Includes a brief explanation about how facility with computers is important for women in their efforts to gain equitable treatment in all aspects of their lives. (PA)

  12. Applications of RETRAN-3D for nuclear power plant transient analyses

    International Nuclear Information System (INIS)

    Paulsen, M.P.; Gose, G.C.; McFadden, J.H.; Agee, L.J.

    1996-01-01

    The RETRAN-3D computer program has been developed to analyze reactor events for which nonequilibrium thermodynamics, multidimensional neutron kinetics, or the presence of noncondensable gases are important items for consideration. This paper summarizes the features of RETRAN-3D and the analyses that have been performed to provide the verification and validation of the program

  13. Maintaining SCALE as a reliable computational system for criticality safety analysis

    International Nuclear Information System (INIS)

    Bowmann, S.M.; Parks, C.V.; Martin, S.K.

    1995-01-01

    Accurate and reliable computational methods are essential for nuclear criticality safety analyses. The SCALE (Standardized Computer Analyses for Licensing Evaluation) computer code system was originally developed at Oak Ridge National Laboratory (ORNL) to enable users to easily set up and perform criticality safety analyses, as well as shielding, depletion, and heat transfer analyses. Over the fifteen-year life of SCALE, the mainstay of the system has been the criticality safety analysis sequences that have featured the KENO-IV and KENO-V.A Monte Carlo codes and the XSDRNPM one-dimensional discrete-ordinates code. The criticality safety analysis sequences provide automated material and problem-dependent resonance processing for each criticality calculation. This report details configuration management, which is essential because SCALE consists of more than 25 computer codes (referred to as modules) that share libraries of commonly used subroutines. Changes to a single subroutine in some cases affect almost every module in SCALE! Controlled access to program source and executables and accurate documentation of modifications are essential to maintaining SCALE as a reliable code system. The modules and subroutine libraries in SCALE are programmed by a staff of approximately ten Code Managers. The SCALE Software Coordinator maintains the SCALE system and is the only person who modifies the production source, executables, and data libraries. All modifications must be authorized by the SCALE Project Leader prior to implementation

  14. Analysis of Sci-Hub downloads of computer science papers

    Directory of Open Access Journals (Sweden)

    Andročec Darko

    2017-07-01

    Full Text Available Scientific knowledge is disseminated through research papers. Most of the research literature is copyrighted by publishers and available only through paywalls. Recently, some websites offer most of the recent content for free. One of them is the controversial website Sci-Hub, which enables access to more than 47 million pirated research papers. In April 2016, Science Magazine published an article on Sci-Hub activity over a period of six months and publicly released Sci-Hub's server log data. The mentioned paper aggregates all downloads across all fields of study, but such aggregate findings might hide interesting patterns within computer science. The Sci-Hub log data was used in this paper to analyse downloads of computer science papers based on DBLP's list of computer science publications. The top downloads of computer science papers were analysed, together with the geographical location of Sci-Hub users, the most downloaded publishers, types of papers downloaded, and downloads of computer science papers per publication year. The results of this research can be used to improve legal access to the most relevant scientific repositories or journals for the computer science field.
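
    The kind of analysis described above can be reproduced, in outline, with a few lines of pandas once the released log and a list of DBLP DOIs are at hand; the file and column names below are assumptions about the data layout, not the actual release format.

```python
import pandas as pd

# Assumed file layout: the released server log as CSV with 'doi' and 'country'
# columns, and a text file of DOIs harvested from DBLP (one per line).
logs = pd.read_csv("scihub_log.csv", usecols=["doi", "country"])
with open("dblp_dois.txt") as fh:
    cs_dois = {line.strip().lower() for line in fh if line.strip()}

# Keep only downloads whose DOI appears in the DBLP-derived computer science list.
cs_downloads = logs[logs["doi"].str.lower().isin(cs_dois)]
print("computer science share of downloads:",
      round(len(cs_downloads) / len(logs), 3))
print(cs_downloads.groupby("country").size()
      .sort_values(ascending=False).head(10))        # top downloading countries
```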

  15. Heads in the Cloud: A Primer on Neuroimaging Applications of High Performance Computing

    Science.gov (United States)

    Shatil, Anwar S.; Younas, Sohail; Pourreza, Hossein; Figley, Chase R.

    2015-01-01

    With larger data sets and more sophisticated analyses, it is becoming increasingly common for neuroimaging researchers to push (or exceed) the limitations of standalone computer workstations. Nonetheless, although high-performance computing platforms such as clusters, grids and clouds are already in routine use by a small handful of neuroimaging researchers to increase their storage and/or computational power, the adoption of such resources by the broader neuroimaging community remains relatively uncommon. Therefore, the goal of the current manuscript is to: 1) inform prospective users about the similarities and differences between computing clusters, grids and clouds; 2) highlight their main advantages; 3) discuss when it may (and may not) be advisable to use them; 4) review some of their potential problems and barriers to access; and finally 5) give a few practical suggestions for how interested new users can start analyzing their neuroimaging data using cloud resources. Although the aim of cloud computing is to hide most of the complexity of the infrastructure management from end-users, we recognize that this can still be an intimidating area for cognitive neuroscientists, psychologists, neurologists, radiologists, and other neuroimaging researchers lacking a strong computational background. Therefore, with this in mind, we have aimed to provide a basic introduction to cloud computing in general (including some of the basic terminology, computer architectures, infrastructure and service models, etc.), a practical overview of the benefits and drawbacks, and a specific focus on how cloud resources can be used for various neuroimaging applications. PMID:27279746

  16. Heads in the Cloud: A Primer on Neuroimaging Applications of High Performance Computing

    Directory of Open Access Journals (Sweden)

    Anwar S. Shatil

    2015-01-01

    Full Text Available With larger data sets and more sophisticated analyses, it is becoming increasingly common for neuroimaging researchers to push (or exceed) the limitations of standalone computer workstations. Nonetheless, although high-performance computing platforms such as clusters, grids and clouds are already in routine use by a small handful of neuroimaging researchers to increase their storage and/or computational power, the adoption of such resources by the broader neuroimaging community remains relatively uncommon. Therefore, the goal of the current manuscript is to: 1) inform prospective users about the similarities and differences between computing clusters, grids and clouds; 2) highlight their main advantages; 3) discuss when it may (and may not) be advisable to use them; 4) review some of their potential problems and barriers to access; and finally 5) give a few practical suggestions for how interested new users can start analyzing their neuroimaging data using cloud resources. Although the aim of cloud computing is to hide most of the complexity of the infrastructure management from end-users, we recognize that this can still be an intimidating area for cognitive neuroscientists, psychologists, neurologists, radiologists, and other neuroimaging researchers lacking a strong computational background. Therefore, with this in mind, we have aimed to provide a basic introduction to cloud computing in general (including some of the basic terminology, computer architectures, infrastructure and service models, etc.), a practical overview of the benefits and drawbacks, and a specific focus on how cloud resources can be used for various neuroimaging applications.

  17. clubber: removing the bioinformatics bottleneck in big data analyses

    Science.gov (United States)

    Miller, Maximilian; Zhu, Chengsheng; Bromberg, Yana

    2018-01-01

    With the advent of modern day high-throughput technologies, the bottleneck in biological discovery has shifted from the cost of doing experiments to that of analyzing results. clubber is our automated cluster-load balancing system developed for optimizing these “big data” analyses. Its plug-and-play framework encourages re-use of existing solutions for bioinformatics problems. clubber’s goals are to reduce computation times and to facilitate use of cluster computing. The first goal is achieved by automating the balance of parallel submissions across available high performance computing (HPC) resources. Notably, the latter can be added on demand, including cloud-based resources, and/or featuring heterogeneous environments. The second goal of making HPCs user-friendly is facilitated by an interactive web interface and a RESTful API, allowing for job monitoring and result retrieval. We used clubber to speed up our pipeline for annotating molecular functionality of metagenomes. Here, we analyzed the Deepwater Horizon oil-spill study data to quantitatively show that the beach sands have not yet entirely recovered. Further, our analysis of the CAMI-challenge data revealed that microbiome taxonomic shifts do not necessarily correlate with functional shifts. These examples (21 metagenomes processed in 172 min) clearly illustrate the importance of clubber in the everyday computational biology environment. PMID:28609295

  18. clubber: removing the bioinformatics bottleneck in big data analyses.

    Science.gov (United States)

    Miller, Maximilian; Zhu, Chengsheng; Bromberg, Yana

    2017-06-13

    With the advent of modern day high-throughput technologies, the bottleneck in biological discovery has shifted from the cost of doing experiments to that of analyzing results. clubber is our automated cluster-load balancing system developed for optimizing these "big data" analyses. Its plug-and-play framework encourages re-use of existing solutions for bioinformatics problems. clubber's goals are to reduce computation times and to facilitate use of cluster computing. The first goal is achieved by automating the balance of parallel submissions across available high performance computing (HPC) resources. Notably, the latter can be added on demand, including cloud-based resources, and/or featuring heterogeneous environments. The second goal of making HPCs user-friendly is facilitated by an interactive web interface and a RESTful API, allowing for job monitoring and result retrieval. We used clubber to speed up our pipeline for annotating molecular functionality of metagenomes. Here, we analyzed the Deepwater Horizon oil-spill study data to quantitatively show that the beach sands have not yet entirely recovered. Further, our analysis of the CAMI-challenge data revealed that microbiome taxonomic shifts do not necessarily correlate with functional shifts. These examples (21 metagenomes processed in 172 min) clearly illustrate the importance of clubber in the everyday computational biology environment.

  19. clubber: removing the bioinformatics bottleneck in big data analyses

    Directory of Open Access Journals (Sweden)

    Miller Maximilian

    2017-06-01

    Full Text Available With the advent of modern day high-throughput technologies, the bottleneck in biological discovery has shifted from the cost of doing experiments to that of analyzing results. clubber is our automated cluster-load balancing system developed for optimizing these “big data” analyses. Its plug-and-play framework encourages re-use of existing solutions for bioinformatics problems. clubber’s goals are to reduce computation times and to facilitate use of cluster computing. The first goal is achieved by automating the balance of parallel submissions across available high performance computing (HPC) resources. Notably, the latter can be added on demand, including cloud-based resources, and/or featuring heterogeneous environments. The second goal of making HPCs user-friendly is facilitated by an interactive web interface and a RESTful API, allowing for job monitoring and result retrieval. We used clubber to speed up our pipeline for annotating molecular functionality of metagenomes. Here, we analyzed the Deepwater Horizon oil-spill study data to quantitatively show that the beach sands have not yet entirely recovered. Further, our analysis of the CAMI-challenge data revealed that microbiome taxonomic shifts do not necessarily correlate with functional shifts. These examples (21 metagenomes processed in 172 min) clearly illustrate the importance of clubber in the everyday computational biology environment.

  20. Identifying controlling variables for math computation fluency through experimental analysis: the interaction of stimulus control and reinforcing consequences.

    Science.gov (United States)

    Hofstadter-Duke, Kristi L; Daly, Edward J

    2015-03-01

    This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made. © The Author(s) 2014.

  1. Hypnotism as a Function of Trance State Effects, Expectancy, and Suggestibility: An Italian Replication.

    Science.gov (United States)

    Pekala, Ronald J; Baglio, Francesca; Cabinio, Monia; Lipari, Susanna; Baglio, Gisella; Mendozzi, Laura; Cecconi, Pietro; Pugnetti, Luigi; Sciaky, Riccardo

    2017-01-01

    Previous research using stepwise regression analyses found self-reported hypnotic depth (srHD) to be a function of suggestibility, trance state effects, and expectancy. This study sought to replicate and expand that research using a general state measure of hypnotic responsivity, the Phenomenology of Consciousness Inventory: Hypnotic Assessment Procedure (PCI-HAP). Ninety-five participants completed an Italian translation of the PCI-HAP, with srHD scores predicted from the PCI-HAP assessment items. The regression analysis replicated the previous research results. Additionally, stepwise regression analyses were able to predict the srHD score equally well using only the PCI dimension scores. These results not only replicated prior research but suggest how this methodology to assess hypnotic responsivity, when combined with more traditional neurophysiological and cognitive-behavioral methodologies, may allow for a more comprehensive understanding of that enigma called hypnosis.

  2. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

    Directory of Open Access Journals (Sweden)

    Leanne M. Hirshfield

    2014-01-01

    Full Text Available In today’s technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users’ cognitive, emotional, and behavioral responses. An experiment was conducted in which participants performed a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions, which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users’ perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users’ self-reported levels of suspicion and trust, and these findings in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.

  3. Review of accident analyses performed at Mochovce NPP

    International Nuclear Information System (INIS)

    Siko, D.

    2000-01-01

    In this paper a review of the accident analyses performed for NPP Mochovce V-1 is presented. The scope of these safety measures was defined and developed in the 'SSM for NPP Mochovce Nuclear Safety Improvements Report' issued in July 1995. The main objectives of these safety measures were the following: (a) to establish the criteria for selection and classification of accidental events, as well as defining the list of initiating events to be analysed. Accident classification into the individual groups must be performed in accordance with RG 1.70 and the IAEA recommendations 'Guidelines for Accident Analysis of WWER NPP' (IAEA-EBR-WWER-01), in order to select bounding cases to be calculated from the scope of initiating events; (b) to elaborate the accident analysis methodology, which also includes acceptance criteria for the evaluation of results, initial and boundary conditions, assumptions related to the application of the single failure criterion, requirements on analysis quality, the computer codes used, as well as NPP models and input data for the accident analysis; (c) to perform the accident analysis for the Pre-operational Safety Report (POSAR); (d) to provide a synthesis report addressing the validity range of code models and correlations, the assessment against relevant test results, the evidence of user qualification, the modelling and nodalisation scheme for the plant, and the justification of the computer codes used. The analysis results showed that all acceptance criteria were met with satisfactory margins and that the design of NPP Mochovce is adequate. (author)

  4. Cloud computing: a new business paradigm for biomedical information sharing.

    Science.gov (United States)

    Rosenthal, Arnon; Mork, Peter; Li, Maya Hao; Stanford, Jean; Koester, David; Reynolds, Patti

    2010-04-01

    We examine how the biomedical informatics (BMI) community, especially consortia that share data and applications, can take advantage of a new resource called "cloud computing". Clouds generally offer resources on demand. In most clouds, charges are pay per use, based on large farms of inexpensive, dedicated servers, sometimes supporting parallel computing. Substantial economies of scale potentially yield costs much lower than dedicated laboratory systems or even institutional data centers. Overall, even with conservative assumptions, for applications that are not I/O intensive and do not demand a fully mature environment, the numbers suggested that clouds can sometimes provide major improvements, and should be seriously considered for BMI. Methodologically, it was very advantageous to formulate analyses in terms of component technologies; focusing on these specifics enabled us to bypass the cacophony of alternative definitions (e.g., exactly what does a cloud include) and to analyze alternatives that employ some of the component technologies (e.g., an institution's data center). Relative analyses were another great simplifier. Rather than listing the absolute strengths and weaknesses of cloud-based systems (e.g., for security or data preservation), we focus on the changes from a particular starting point, e.g., individual lab systems. We often find a rough parity (in principle), but one needs to examine individual acquisitions--is a loosely managed lab moving to a well managed cloud, or a tightly managed hospital data center moving to a poorly safeguarded cloud? 2009 Elsevier Inc. All rights reserved.

  5. Efficient Multi-Party Computation over Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Fehr, Serge; Ishai, Yuval

    2003-01-01

    Secure multi-party computation (MPC) is an active research area, and a wide range of literature can be found nowadays suggesting improvements and generalizations of existing protocols in various directions. However, all current techniques for secure MPC apply to functions that are represented by (boolean or arithmetic) circuits over finite fields. We are motivated by two limitations of these techniques: – Generality. Existing protocols do not apply to computation over more general algebraic structures (except via a brute-force simulation of computation in these structures). – Efficiency. The best ... We demonstrate the usefulness of the above results by presenting a novel application of MPC over (non-field) rings to the round-efficient secure computation of the maximum function. Basic Research in Computer Science (www.brics.dk), funded by the Danish National Research Foundation.
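
    To make "computation over a non-field ring" concrete, the sketch below shows passively secure additive secret sharing over the ring Z_2^32, in which shared values can be added locally without communication. This is only a toy Python illustration of the setting the paper generalises; it is not the protocol proposed in the paper.

        import secrets

        MOD = 2**32  # the ring Z_2^32; not a field, since e.g. 2 has no inverse


        def share(secret: int, n_parties: int) -> list[int]:
            """Split `secret` into n additive shares that sum to it modulo 2^32."""
            shares = [secrets.randbelow(MOD) for _ in range(n_parties - 1)]
            shares.append((secret - sum(shares)) % MOD)
            return shares


        def reconstruct(shares: list[int]) -> int:
            return sum(shares) % MOD


        # Each party holds one share of x and one of y; adding shares pairwise
        # yields an additive sharing of x + y without any communication.
        x, y = 1234, 5678
        sx, sy = share(x, 3), share(y, 3)
        sz = [(a + b) % MOD for a, b in zip(sx, sy)]
        assert reconstruct(sz) == (x + y) % MOD
        print(reconstruct(sz))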

  6. Analysing Trust Transitivity and The Effects of Unknown Dependence

    Directory of Open Access Journals (Sweden)

    Touhid Bhuiyan

    2010-03-01

    Full Text Available Trust can be used to improve online automated recommendation within a given domain, and trust transitivity is used to make it successful. But trust transitivity has different interpretations. Trust and trust transitivity are both human mental phenomena, and for this reason there is no such thing as objective transitivity. Trust transitivity and trust fusion are both important elements in computational trust. This paper analyses the parameter dependence problem in trust transitivity and proposes some definitions that take the effects of the base rate into account. In addition, it also proposes belief functions based on subjective logic to analyse trust transitivity in three specified cases with sensitive and insensitive base rates. Finally, it presents a quantitative analysis of the effects of the unknown dependence problem in an interconnected network environment such as the Internet.
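
    Trust transitivity in subjective logic is commonly expressed with a discounting operator that propagates A's opinion about B onto B's opinion about C. The Python sketch below implements one textbook form of that operator as a baseline; the paper's own definitions refine how the base rate and dependence are handled, so this illustrates the general idea rather than the author's proposed operators.

        from dataclasses import dataclass


        @dataclass
        class Opinion:
            """Subjective-logic opinion: belief, disbelief, uncertainty, base rate."""
            b: float
            d: float
            u: float
            a: float


        def discount(ab: Opinion, bc: Opinion) -> Opinion:
            """Textbook trust-transitivity (discounting) of B's opinion about C
            through A's opinion about B."""
            return Opinion(
                b=ab.b * bc.b,
                d=ab.b * bc.d,
                u=ab.d + ab.u + ab.b * bc.u,
                a=bc.a,
            )


        # A trusts B fairly strongly; B has a moderate opinion about C.
        a_b = Opinion(b=0.8, d=0.1, u=0.1, a=0.5)
        b_c = Opinion(b=0.6, d=0.2, u=0.2, a=0.5)
        print(discount(a_b, b_c))   # A's derived opinion about C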

  7. Analysing population numbers of the house sparrow in the Netherlands with a matrix model and suggestions for conservation measures

    NARCIS (Netherlands)

    Klok, C.; Holtkamp, R.; Apeldoorn, van R.C.; Visser, M.E.; Hemerik, L.

    2006-01-01

    The House Sparrow (Passer domesticus), formerly a common bird species, has shown a rapid decline in Western Europe over recent decades. In The Netherlands, its decline is apparent from 1990 onwards. Many causes have been suggested for this decline, all of which decrease the vital rates, i.e. survival and reproduction.
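
    A matrix model of this kind projects a stage-structured population vector forward in time, and the dominant eigenvalue of the projection matrix gives the asymptotic growth rate. The Python sketch below illustrates the computation with made-up vital rates; the actual parameter values estimated for the Dutch House Sparrow population are those reported in the paper.

        import numpy as np

        # Two-stage (first-year / adult) projection matrix with illustrative,
        # made-up vital rates: fecundities in the first row, survival below.
        A = np.array([
            [0.9, 1.8],   # offspring per first-year bird, per adult
            [0.2, 0.5],   # survival: first-year -> adult, adult -> adult
        ])

        # Asymptotic population growth rate = dominant eigenvalue (lambda).
        eigenvalues, eigenvectors = np.linalg.eig(A)
        lam = max(eigenvalues.real)
        print(f"lambda = {lam:.3f} ({'growing' if lam > 1 else 'declining'} population)")

        # Projecting an initial population vector forward a few years.
        n = np.array([100.0, 100.0])
        for year in range(5):
            n = A @ n
        print(n)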

  8. Sobol method application in dimensional sensitivity analyses of different AFM cantilevers for biological particles

    Science.gov (United States)

    Korayem, M. H.; Taheri, M.; Ghahnaviyeh, S. D.

    2015-08-01

    Due to the more delicate nature of biological micro/nanoparticles, it is necessary to compute the critical force of manipulation. The modeling and simulation of reactions and nanomanipulator dynamics in a precise manipulation process require an exact modeling of cantilever stiffness, especially the stiffness of dagger cantilevers, because the previous model is not useful for this investigation. The stiffness values for V-shaped cantilevers can be obtained through several methods. One of them is the PBA method. In another approach, the cantilever is divided into two sections: a triangular head section and two slanted rectangular beams. Then, deformations along different directions are computed and used to obtain the stiffness values in different directions. The stiffness formulations of the dagger cantilever are needed for these sensitivity analyses, so the formulations have been derived first and the sensitivity analyses carried out afterwards. In examining the stiffness of the dagger-shaped cantilever, the micro-beam has been divided into a triangular and a rectangular section, and by computing the displacements along different directions and using the existing relations, the stiffness values for the dagger cantilever have been obtained. In this paper, after investigating the stiffness of common types of cantilevers, Sobol sensitivity analyses of the effects of various geometric parameters on the stiffness of these types of cantilevers have been carried out. Also, the effects of different cantilevers on the dynamic behavior of nanoparticles have been studied, and the dagger-shaped cantilever has been deemed more suitable for the manipulation of biological particles.
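
    Sobol sensitivity analysis apportions the variance of a model output, here a stiffness value, among the input parameters. The Python sketch below illustrates the workflow with the SALib package and a stand-in rectangular-beam stiffness formula, k = E·w·t^3/(4·L^3); the parameter ranges and the stiffness model are placeholders, not the dagger-cantilever formulations derived in the paper.

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        # Placeholder stiffness model: simple rectangular cantilever bending
        # stiffness, used only to demonstrate the Sobol workflow.
        def stiffness(E, w, t, L):
            return E * w * t**3 / (4.0 * L**3)

        problem = {
            "num_vars": 4,
            "names": ["E", "w", "t", "L"],
            "bounds": [[150e9, 200e9],      # Young's modulus (Pa)
                       [20e-6, 40e-6],      # width (m)
                       [0.5e-6, 2e-6],      # thickness (m)
                       [100e-6, 300e-6]],   # length (m)
        }

        X = saltelli.sample(problem, 1024)            # Saltelli sampling design
        Y = np.array([stiffness(*row) for row in X])  # evaluate the model
        Si = sobol.analyze(problem, Y)                # first-order and total indices

        for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
            print(f"{name}: S1={s1:.2f}  ST={st:.2f}")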

  9. SRM Internal Flow Tests and Computational Fluid Dynamic Analysis. Volume 2; CFD RSRM Full-Scale Analyses

    Science.gov (United States)

    2001-01-01

    This document presents the full-scale analyses of the CFD RSRM. The RSRM model was developed with a 20 second burn time. The following are presented as part of the full-scale analyses: (1) RSRM embedded inclusion analysis; (2) RSRM igniter nozzle design analysis; (3) Nozzle Joint 4 erosion anomaly; (4) RSRM full motor port slag accumulation analysis; (5) RSRM motor analysis of two-phase flow in the aft segment/submerged nozzle region; (6) Completion of 3-D Analysis of the hot air nozzle manifold; (7) Bates Motor distributed combustion test case; and (8) Three Dimensional Polysulfide Bump Analysis.

  10. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  11. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  12. Insights from random vibration analyses using multiple earthquake components

    International Nuclear Information System (INIS)

    DebChaudhury, A.; Gasparini, D.A.

    1981-01-01

    The behavior of multi-degree-of-freedom systems subjected to multiple earthquake components is studied by the use of random vibration dynamic analyses. A linear system which has been decoupled into modes and has both translational and rotational degrees of freedom is analyzed. The seismic excitation is modelled as a correlated or uncorrelated, vector-valued, non-stationary random process having a Kanai-Tajimi type of frequency content. Non-stationarity is achieved by using a piecewise linear strength function. Therefore, almost any type of evolution and decay of an earthquake may be modelled. Also, in general, the components of the excitation have different frequency contents and strength functions, i.e. intensities and durations, and the correlations between components can vary with time. A state-space, modal, random vibration approach is used. Exact analytical expressions for both the state transition matrix and the evolutionary modal covariance matrix are utilized to compute time histories of modal RMS responses. Desired responses are then computed by modal superposition. Specifically, relative displacement, relative velocity and absolute acceleration responses are studied. An important advantage of such analyses is that RMS responses vary smoothly in time; therefore, large time intervals may be used to generate response time histories. The modal superposition is exact; that is, all cross correlation terms between modal responses are included. (orig./RW)
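
    The Kanai-Tajimi model referred to above shapes white noise through a damped ground filter, giving a power spectral density of the form S(w) = S0·[1 + 4·zg^2·(w/wg)^2] / {[1 - (w/wg)^2]^2 + 4·zg^2·(w/wg)^2}. A minimal Python sketch for evaluating this spectrum is shown below; the filter parameters are generic illustrative values, not those used in the paper, and the non-stationary strength function is omitted.

        import numpy as np

        def kanai_tajimi_psd(omega, s0=1.0, omega_g=15.0, zeta_g=0.6):
            """Kanai-Tajimi power spectral density of ground acceleration.

            s0      : intensity of the bedrock white-noise excitation
            omega_g : ground filter frequency (rad/s)
            zeta_g  : ground filter damping ratio
            """
            r = (omega / omega_g) ** 2
            num = 1.0 + 4.0 * zeta_g**2 * r
            den = (1.0 - r) ** 2 + 4.0 * zeta_g**2 * r
            return s0 * num / den

        # Evaluate the spectrum over a frequency range typical for earthquakes.
        omega = np.linspace(0.1, 60.0, 600)   # rad/s
        S = kanai_tajimi_psd(omega)
        print(f"peak PSD ~ {S.max():.2f} at omega ~ {omega[np.argmax(S)]:.1f} rad/s")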

  13. Disentangling the complex evolutionary history of the Western Palearctic blue tits (Cyanistes spp.) - phylogenomic analyses suggest radiation by multiple colonization events and subsequent isolation.

    Science.gov (United States)

    Stervander, Martin; Illera, Juan Carlos; Kvist, Laura; Barbosa, Pedro; Keehnen, Naomi P; Pruisscher, Peter; Bensch, Staffan; Hansson, Bengt

    2015-05-01

    Isolated islands and their often unique biota continue to play key roles for understanding the importance of drift, genetic variation and adaptation in the process of population differentiation and speciation. One island system that has inspired and intrigued evolutionary biologists is the blue tit complex (Cyanistes spp.) in Europe and Africa, in particular the complex evolutionary history of the multiple genetically distinct taxa of the Canary Islands. Understanding Afrocanarian colonization events is of particular importance because of recent unconventional suggestions that these island populations acted as source of the widespread population in mainland Africa. We investigated the relationship between mainland and island blue tits using a combination of Sanger sequencing at a population level (20 loci; 12 500 nucleotides) and next-generation sequencing of single population representatives (>3 200 000 nucleotides), analysed in coalescence and phylogenetic frameworks. We found (i) that Afrocanarian blue tits are monophyletic and represent four major clades, (ii) that the blue tit complex has a continental origin and that the Canary Islands were colonized three times, (iii) that all island populations have low genetic variation, indicating low long-term effective population sizes and (iv) that populations on La Palma and in Libya represent relicts of an ancestral North African population. Further, demographic reconstructions revealed (v) that the Canary Islands, conforming to traditional views, hold sink populations, which have not served as source for back colonization of the African mainland. Our study demonstrates the importance of complete taxon sampling and an extensive multimarker study design to obtain robust phylogeographical inferences. © 2015 John Wiley & Sons Ltd.

  14. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    Science.gov (United States)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  15. [Suggestions for stress relief at the workplace: opinion of postgraduate nurses].

    Science.gov (United States)

    Martins, L M; Bronzatti, J A; Vieira, C S; Parra, S H; da Silva, Y B

    2000-03-01

    Work overload, relationship and communication problems, institutional characteristics, and environmental pollution were the organizational stress agents with the highest scores in this study of 30 nurses. Work planning, work humanization, adequate human resources, improved communication, and continuing education were the suggestions given by the nurses to minimize these organizational stress agents. Among the extra-organizational stress agents, economic and family problems, distance to work, and transportation to work were those most frequently cited.

  16. The Use of Computer Tools in the Design Process of Students’ Architectural Projects. Case Studies in Algeria

    Science.gov (United States)

    Saighi, Ouafa; Salah Zerouala, Mohamed

    2017-12-01

    This paper deals particularly with the way in which computer tools are used by students in their design studio projects. Four institutions of architecture education in Algeria are considered as a case study to evaluate the impact of such tools on the students' design process. The aim is to examine such use in depth and to identify its advantages and shortcomings in order to suggest some solutions. A field survey was undertaken on a sample of students and their teachers at the same institutions. The analysed results mainly show that computer tools are used chiefly to improve the quality of drawings and images, seeking observers' satisfaction and hence influencing their decisions. Some teachers are not very keen to overuse the computer during the design phase; they prefer the "traditional" approach. This is the present situation facing the Algerian university, which leads to conflict and disagreement between students and teachers. Meanwhile, there is no doubt that computer tools have effectively contributed to improving the competitive level among students.

  17. Computational fluid dynamics analyses of lateral heat conduction, coolant azimuthal mixing and heat transfer predictions in a BR2 fuel assembly geometry

    International Nuclear Information System (INIS)

    Tzanos, C.P.; Dionne, B.

    2011-01-01

    To support the analyses related to the conversion of the BR2 core from highly-enriched (HEU) to low-enriched (LEU) fuel, the thermal-hydraulics codes PLTEMP and RELAP-3D are used to evaluate the safety margins during steady-state operation (PLTEMP), as well as after a loss-of-flow, loss-of-pressure, or loss-of-coolant event (RELAP). In the 1-D PLTEMP and RELAP simulations, conduction in the azimuthal and axial directions is not accounted for. The very good thermal conductivity of the cladding and the fuel meat and significant temperature gradients in the lateral directions (axial and azimuthal) could lead to a heat flux distribution that is significantly different from the power distribution. To evaluate the significance of the lateral heat conduction, 3-D computational fluid dynamics (CFD) simulations, using the CFD code STAR-CD, were performed. Safety margin calculations are typically performed for a hot stripe, i.e., an azimuthal region of the fuel plates/coolant channel containing the power peak. In a RELAP model, for example, a channel between two plates could be divided into a number of RELAP channels (stripes) in the azimuthal direction. In a PLTEMP model, the effect of azimuthal power peaking could be taken into account by using engineering factors. However, if the thermal mixing in the azimuthal direction of a coolant channel is significant, a striping approach could be overly conservative by not taking this mixing into account. STAR-CD simulations were also performed to study the thermal mixing in the coolant. Section II of this document presents the results of the analyses of the lateral heat conduction and azimuthal thermal mixing in a coolant channel. Finally, PLTEMP and RELAP simulations rely on the use of correlations to determine heat transfer coefficients. Previous analyses showed that the Dittus-Boelter correlation gives significantly more conservative (lower) predictions than the correlations of Sieder-Tate and Petukhov. STAR-CD 3-D
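
    The gap between the heat-transfer correlations named above is easy to see numerically: Dittus-Boelter (Nu = 0.023·Re^0.8·Pr^0.4 for heating) versus Sieder-Tate (Nu = 0.027·Re^0.8·Pr^(1/3)·(mu_bulk/mu_wall)^0.14). The short Python sketch below compares them for an illustrative turbulent-flow condition; the property values are placeholders, not BR2 coolant conditions.

        def nu_dittus_boelter(re, pr):
            """Dittus-Boelter correlation for heating: Nu = 0.023 Re^0.8 Pr^0.4."""
            return 0.023 * re**0.8 * pr**0.4

        def nu_sieder_tate(re, pr, mu_bulk, mu_wall):
            """Sieder-Tate correlation with viscosity-ratio correction."""
            return 0.027 * re**0.8 * pr**(1.0 / 3.0) * (mu_bulk / mu_wall) ** 0.14

        # Illustrative turbulent-flow conditions (placeholders, not BR2 values).
        re, pr = 5.0e4, 4.0
        mu_bulk, mu_wall = 6.5e-4, 4.5e-4   # Pa.s; heated wall, so lower wall viscosity

        nu_db = nu_dittus_boelter(re, pr)
        nu_st = nu_sieder_tate(re, pr, mu_bulk, mu_wall)
        print(f"Dittus-Boelter Nu = {nu_db:.0f}")
        print(f"Sieder-Tate    Nu = {nu_st:.0f} ({100*(nu_st/nu_db - 1):.0f}% higher)")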

  18. The NEA computer program library: a possible GDMS application

    International Nuclear Information System (INIS)

    Schuler, W.

    1978-01-01

    The NEA Computer Program Library maintains a series of eleven sequential computer files, used for linked applications in managing its stock of computer codes for nuclear reactor calculations, storing index and program abstract information, and administering its service to requesters. The high data redundancy between the files suggests that a data base approach would be valid, and this paper suggests a possible 'schema' for a CODASYL GDMS

  19. Analyses of criticality and reactivity for TRACY experiments based on JENDL-3.3 data library

    International Nuclear Information System (INIS)

    Sono, Hiroki; Miyoshi, Yoshinori; Nakajima, Ken

    2003-01-01

    The parameters on criticality and reactivity employed for computational simulations of the TRACY supercritical experiments were analyzed using a recently revised nuclear data library, JENDL-3.3. The parameters based on the JENDL-3.3 library were compared to those based on two formerly used libraries, JENDL-3.2 and ENDF/B-VI. In the analyses, the computational codes MVP, MCNP version 4C and TWOTRAN were used. The following conclusions were obtained from the analyses: (1) The computational biases of the effective neutron multiplication factor attributable to the nuclear data libraries and to the computational codes do not depend on the TRACY experimental conditions, such as fuel conditions. (2) The fractional discrepancies in the kinetic parameters and coefficients of reactivity are within ∼5% between the three libraries. By comparison between calculations and measurements of the parameters, the JENDL-3.3 library is expected to give values closer to the measurements than the JENDL-3.2 and ENDF/B-VI libraries. (3) While the reactivity worth of the transient rods expressed in the $ unit shows a ∼5% discrepancy between the three libraries, according to their respective β_eff values, there is little discrepancy in that expressed in the Δk/k unit. (author)
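
    The distinction drawn in conclusion (3) follows directly from the definitions: the reactivity rho = (k_eff - 1)/k_eff is independent of β_eff, whereas reactivity in dollars, rho($) = rho/β_eff, inherits any discrepancy in β_eff between libraries. A small numerical illustration in Python with made-up values follows.

        def reactivity(k_eff: float) -> float:
            """Reactivity in absolute (delta-k/k) units."""
            return (k_eff - 1.0) / k_eff

        def reactivity_dollars(k_eff: float, beta_eff: float) -> float:
            """Reactivity expressed in dollars, i.e. in units of beta_eff."""
            return reactivity(k_eff) / beta_eff

        # Made-up example: same k_eff, but beta_eff differing by ~5% between
        # two nuclear data libraries, as in conclusion (3) above.
        k_eff = 1.005
        for name, beta in [("library A", 0.0072), ("library B", 0.0076)]:
            rho = reactivity(k_eff)
            print(f"{name}: rho = {rho:.5f} dk/k = {reactivity_dollars(k_eff, beta):.3f} $")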

  20. TEACHERS’ COMPUTER SELF-EFFICACY AND THEIR USE OF EDUCATIONAL TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Vehbi TUREL

    2014-10-01

    Full Text Available This study examined the use of educational technology by primary and subject teachers (i.e. secondary and high school teachers) in a small town in the eastern part of Turkey in the spring of 2012. The study examined the primary, secondary and high school teachers' (a) personal and computer-related (demographic) characteristics, (b) computer self-efficacy perceptions, (c) level of computer use in certain software, (d) frequency of computer use for teaching, administrative and communication objectives, and (e) preferences in the use of educational technology for preparation and teaching purposes. In this study, all primary, secondary and high school teachers in the small town were given the questionnaires to complete. 158 teachers (n=158) completed and returned them. The study was mostly quantitative and partly qualitative. The quantitative results were analysed with SPSS (i.e. mean, standard deviation, frequency, percentage, ANOVA). The qualitative data were analysed by examining the participants' responses gathered from the open-ended questions and focussing on the shared themes among the responses. The results reveal that the teachers think that they have good computer self-efficacy perceptions, their level in certain programs is good, and they often use computers for a wide range of purposes. There are also statistically significant differences in their computer self-efficacy perceptions, frequency of computer use for certain purposes, and computer level in certain programs in terms of different independent variables.

  1. Sensitivity and uncertainty analyses applied to criticality safety validation. Volume 2

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Parks, C.V.

    1999-01-01

    This report presents the application of sensitivity and uncertainty (S/U) analysis methodologies developed in Volume 1 to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the existing S/U computational capabilities such that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently in use by criticality safety practitioners. After complete development, simplified tools are expected to be released for general use. The methods for application of S/U and generalized linear-least-square methodology (GLLSM) tools to the criticality safety validation procedures were described in Volume 1 of this report. Volume 2 of this report presents the application of these procedures to the validation of criticality safety analyses supporting uranium operations where enrichments are greater than 5 wt %. Specifically, the traditional k_eff trending analyses are compared with newly developed k_eff trending procedures, utilizing the D and c_k coefficients described in Volume 1. These newly developed procedures are applied to a family of postulated systems involving U(11)O2 fuel, with H/X values ranging from 0 to 1,000. These analyses produced a series of guidance and recommendations for the general usage of these various techniques. Recommendations for future work are also detailed

  2. Young children reorient by computing layout geometry, not by matching images of the environment.

    Science.gov (United States)

    Lee, Sang Ah; Spelke, Elizabeth S

    2011-02-01

    Disoriented animals from ants to humans reorient in accord with the shape of the surrounding surface layout: a behavioral pattern long taken as evidence for sensitivity to layout geometry. Recent computational models suggest, however, that the reorientation process may not depend on geometrical analyses but instead on the matching of brightness contours in 2D images of the environment. Here we test this suggestion by investigating young children's reorientation in enclosed environments. Children reoriented by extremely subtle geometric properties of the 3D layout: bumps and ridges that protruded only slightly off the floor, producing edges with low contrast. Moreover, children failed to reorient by prominent brightness contours in continuous layouts with no distinctive 3D structure. The findings provide evidence that geometric layout representations support children's reorientation.

  3. Conceptual and computational basis for the quantification of margins and uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon Craig

    2009-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainty (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. Topics considered include (1) the role of aleatory and epistemic uncertainty in QMU, (2) the representation of uncertainty with probability, (3) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, (4) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty, (5) procedures for sampling-based uncertainty and sensitivity analysis, (6) the representation of uncertainty with alternatives to probability such as interval analysis, possibility theory and evidence theory, (7) the representation of uncertainty with alternatives to probability in QMU analyses involving only epistemic uncertainty, and (8) the representation of uncertainty with alternatives to probability in QMU analyses involving aleatory and epistemic uncertainty. Concepts and computational procedures are illustrated with both notional examples and examples from reactor safety and radioactive waste disposal.

  4. LHCb computing model

    CERN Document Server

    Frank, M; Pacheco, Andreu

    1998-01-01

    This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10^6 MIPS. This will be installed at the experiment and will be reused during non data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10^6 MIPS) and physics analysis (0.5 × 10^6 MIPS): CPU resources may either be located at the physicist's home lab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu...
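
    The quoted rates are easy to cross-check: 4.2 MB/s sustained over a month is indeed on the order of 11 TB. A quick back-of-the-envelope calculation in Python, assuming a 30-day month and decimal terabytes:

        rate_mb_per_s = 4.2                      # group analysis output rate
        seconds_per_month = 30 * 24 * 3600       # assuming a 30-day month

        monthly_tb = rate_mb_per_s * seconds_per_month / 1e6   # MB -> TB (decimal)
        print(f"{monthly_tb:.1f} TB per month")  # ~10.9 TB, consistent with "11 TB"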

  5. Students Computer Skills in Faculty of Education

    Directory of Open Access Journals (Sweden)

    Mehmet Caglar

    2010-09-01

    Full Text Available Nowadays, the usage of technology is not a privilege but an obligation. Technological developments influence the structures and functions of educational institutions. It is also expected of teachers that they integrate technology in their lessons in order to educate the individuals of the information society. This research covered 145 (68 female, 78 male) students studying in the Near East University Faculty of Education. The Computer Skills Scale developed by Güçlü (2010) was used as the data collecting tool. Data were analysed using the SPSS software program. In this study, students' computer skills were investigated; the variations in the relationships between computer skills and (a) gender, (b) family's net monthly income, (c) presence of computers at home, (d) presence of a computer laboratory at school and (e) parents' computer skills were examined. Frequency analysis, percentage and mean calculations were used. In addition, t-tests and multivariate analysis were used to look at the relationship between different variables. As a result of this study, a statistically significant relationship was found between the computer skills of students who had a computer at home and the computer skills of those who didn't have a computer at home.

  6. Understanding the Critics of Educational Technology: Gender Inequities and Computers 1983-1993.

    Science.gov (United States)

    Mangione, Melissa

    Although many view computers purely as technological tools to be utilized in the classroom and workplace, attention has been drawn to the social differences computers perpetuate, including those of race, class, and gender. This paper focuses on gender and computing by examining recent analyses in regards to content, form, and usage concerns. The…

  7. Analyses of transient plant response under emergency situations

    Energy Technology Data Exchange (ETDEWEB)

    Koyama, Kazuya [Advanced Reactor Technology, Co. Ltd., Engineering Department, Tokyo (Japan); Shimakawa, Yoshio; Hishida, Masahiko [Mitsubishi Heavy Industry, Ltd., Reactor Core Engineering and Safety Engineering Department, Tokyo (Japan)

    1999-03-01

    In order to support development of the dynamic reliability analysis program DYANA, analyses were made on the event sequences anticipated under emergency situations using the plant dynamics simulation computer code Super-COPD. The analytical models were developed for Super-COPD such as the guard vessel, the maintenance cooling system, the sodium overflow and makeup system, etc. in order to apply the code to the simulation of the emergency situations. The input data were prepared for the analyses. About 70 sequences were analyzed, which are categorized into the following events: (1) PLOHS (Protected Loss of Heat Sink), (2) LORL (Loss of Reactor Level)-J: failure of sodium makeup by the primary sodium overflow and makeup system, (3) LORL-G : failure of primary coolant pump trip, (4) LORL-I: failure of the argon cover gas isolation, and (5) heat removal only using the ventilation system of the primary cooling system rooms. The results were integrated into an input file for preparing the functions for the neural network simulation. (author)

  8. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  9. Cloud Computing Adoption in Organisations: Review of Empirical Literature

    Directory of Open Access Journals (Sweden)

    Hassan Haslinda

    2017-01-01

    Full Text Available This study reviews the literature on cloud computing adoption in organisations to identify its influential factors and its operationalisation in prior literature. We classify the factors that influence cloud computing adoption using the three contexts suggested by the Technology-Organisation-Environment (TOE) framework, namely technology, organisation, and environment. The findings suggest that the influences of these factors vary across studies and that most studies have operationalised cloud computing adoption using the intention to adopt cloud computing or a binary variable, rather than the actual use of the technology.

  10. Designing for deeper learning in a blended computer science course for middle school students

    Science.gov (United States)

    Grover, Shuchi; Pea, Roy; Cooper, Stephen

    2015-04-01

    The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. Prior computing experiences (as measured by a pretest) and math ability were

  11. Fluid dynamics computer programs for NERVA turbopump

    Science.gov (United States)

    Brunner, J. J.

    1972-01-01

    During the design of the NERVA turbopump, numerous computer programs were developed for the analyses of fluid dynamic problems within the machine. Program descriptions, example cases, users instructions, and listings for the majority of these programs are presented.

  12. Insights from Severe Accident Analyses for Verification of VVER SAMG

    Energy Technology Data Exchange (ETDEWEB)

    Gaikwad, A. J.; Rao, R. S.; Gupta, A.; Obaidurrahaman, K., E-mail: avinashg@aerb.gov.in [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai (India)

    2014-10-15

    The severe accident analyses of a simultaneous rupture of all four steam lines (case-a), simultaneous occurrence of LOCA with SBO (case-b) and station blackout (case-c) were performed with the computer code ASTEC V2r2 for a typical VVER-1000. The results obtained will be used for the verification of severe accident provisions and Severe Accident Management Guidelines (SAMG). Auxiliary feed water and emergency core cooling systems are modelled as boundary conditions. The ICARE module is used to simulate the reactor core, which is divided into five radial regions by grouping similarly powered fuel assemblies together. Initially, the CESAR module computes the thermal hydraulics in the primary and secondary circuits. As soon as core uncovery begins, the ICARE module is actuated based on certain parameters, and after this, the ICARE module computes the thermal hydraulics in the core, bypass, downcomer and the lower plenum. CESAR handles the remaining components in the primary and secondary loops. The CPA module is used to simulate the containment and to predict the thermal-hydraulic and hydrogen behaviour in the containment. The accident sequences were selected in such a way that they cover low/high pressure and slow/fast core damage progression events. The events simulated included slow progression events with high pressure and fast accident progression with low primary pressure. Analysis was also carried out for the case of SBO with the opening of the PORVs when the core exit temperature exceeds a certain value, as part of the SAMG. A time step sensitivity study was carried out for LOCA with SBO. In general, the trends and magnitudes of the parameters are as expected. The key results of the above analyses are presented in this paper. (author)

  13. Analyse de données en apprentissage d’une L2 en situation d’autonomie dans un environnement multimédia [Data analysis in L2 learning in autonomous settings in a multimedia environment]

    Directory of Open Access Journals (Sweden)

    Lise Duquette

    2002-03-01

    Full Text Available This article proposes a methodological framework for dealing with learning and assessment in Computer-Assisted Language Learning (CALL) in the context of learner autonomy. In the absence of a consensus paradigm, the author proposes an analysis methodology drawn from problem solving in the cognitive sciences, which should be generalizable to the treatment of data in the context of autonomy in CALL. The analysis framework comprises three general strategies corresponding to prior knowledge - regulation, implementation, and evaluation - which make it possible to distinguish effective from ineffective learners in autonomous settings, since these strategies draw on prior knowledge. These general strategies are crossed with six problem-solving steps: reading, analysis, exploration, planning, implementation, and verification. Through two empirical studies she conducted, the author shows that the problem-solving framework works well and makes it possible to evaluate learners' effectiveness in autonomous settings within the CALL environment. She suggests that this problem-solving framework be used in further studies of learning in different computing environments in order to give it robustness and greater generality, and thus to allow the field to adopt the cognitive paradigm, which would then become consensual.

  14. A new ImageJ plug-in "ActogramJ" for chronobiological analyses.

    Science.gov (United States)

    Schmid, Benjamin; Helfrich-Förster, Charlotte; Yoshii, Taishi

    2011-10-01

    While the rapid development of personal computers and high-throughput recording systems for circadian rhythms allow chronobiologists to produce huge amounts of data, the software to analyze them often lags behind. Here, we announce newly developed chronobiology software that is easy to use, compatible with many different systems, and freely available. Our system can perform the most frequently used analyses: actogram drawing, periodogram analysis, and waveform analysis. The software is distributed as a pure Java plug-in for ImageJ and so works on the 3 main operating systems: Linux, Macintosh, and Windows. We believe that this free software raises the speed of data analyses and makes studying chronobiology accessible to newcomers. © 2011 The Author(s)
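
    ActogramJ itself is an ImageJ plug-in written in Java, but the kind of periodogram analysis it performs can be prototyped in a few lines. The Python sketch below estimates a free-running period from synthetic activity data using SciPy's Lomb-Scargle periodogram; it is an illustration of the analysis, not code taken from ActogramJ, and the 23.5 h rhythm is simulated.

        import numpy as np
        from scipy.signal import lombscargle

        # Synthetic activity record: 10 days sampled every 10 min, with a
        # free-running period of 23.5 h plus noise.
        t = np.arange(0, 10 * 24, 1 / 6)                       # time in hours
        activity = 1 + np.sin(2 * np.pi * t / 23.5) + 0.5 * np.random.randn(t.size)

        # Scan candidate periods between 20 h and 28 h.
        periods = np.linspace(20, 28, 400)
        ang_freqs = 2 * np.pi / periods                        # rad per hour
        power = lombscargle(t, activity - activity.mean(), ang_freqs)

        best = periods[np.argmax(power)]
        print(f"estimated free-running period: {best:.2f} h")  # ~23.5 h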

  15. Small calcified lesions suggestive of neurocysticercosis are associated with mesial temporal sclerosis

    Directory of Open Access Journals (Sweden)

    Marcos C. B. Oliveira

    2014-07-01

    Full Text Available Recent studies have suggested a possible relationship between temporal lobe epilepsy with mesial temporal sclerosis (MTS and neurocysticercosis (NC. We performed a case-control study to evaluate the association of NC and MTS. Method: We randomly selected patients with different epilepsy types, including: MTS, primary generalized epilepsy (PGE and focal symptomatic epilepsy (FSE. Patients underwent a structured interview, followed by head computed tomography (CT. A neuroradiologist evaluated the scan for presence of calcified lesions suggestive of NC. CT results were matched with patients’ data. Results: More patients in the MTS group displayed calcified lesions suggestive of NC than patients in the other groups (p=0.002. On multivariate analysis, MTS was found to be an independent predictor of one or more calcified NC lesions (p=0.033. Conclusion: After controlling for confounding factors, we found an independent association between NC calcified lesions and MTS.

  16. DISTANCE LEARNERS' PERCEPTIONS OF COMPUTER MEDIATED COMMUNICATION

    OpenAIRE

    Mujgan Bozkaya; Irem Erdem Aydin

    2011-01-01

    In this study, perspectives of the first year students in the completely online Information Management Associate Degree Program at Anadolu University regarding computer as a communication medium were investigated. Students' perspectives on computer-mediated communications were analyzed in the light of three different views in the area of computer-mediated communications: The first view suggests that face-to-face settings are better communication environments compared to computer-mediated envi...

  17. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing and policy planners with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles of the Scopus database from Elsevier covering the time period 2004-2013. We ranked authors in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
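
    A minimal sketch of the kind of co-authorship network construction described above, using Python with networkx. The input format, the toy records and the ranking criterion (paper counts) are assumptions for illustration, not the authors' Scopus-based pipeline.

      # Sketch: build a co-authorship network from a list of papers and rank authors
      # by paper count; illustrative only, not the authors' Scopus-based workflow.
      from collections import Counter
      from itertools import combinations
      import networkx as nx

      papers = [  # hypothetical records: each paper is a list of author names
          ["Kim, A", "Lee, B", "Park, C"],
          ["Kim, A", "Lee, B"],
          ["Park, C", "Cho, D"],
      ]

      paper_counts = Counter(a for authors in papers for a in authors)

      G = nx.Graph()
      for authors in papers:
          for a, b in combinations(sorted(set(authors)), 2):
              if G.has_edge(a, b):
                  G[a][b]["weight"] += 1          # number of co-authored papers
              else:
                  G.add_edge(a, b, weight=1)

      top_authors = [a for a, _ in paper_counts.most_common(2)]
      neighbours = [n for a in top_authors for n in G.neighbors(a)]
      subgraph = G.subgraph(top_authors + neighbours)

      print("Author rank by papers:", paper_counts.most_common())
      print("Degree centrality:", nx.degree_centrality(subgraph))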

  18. Responding to hypnotic and nonhypnotic suggestions: performance standards, imaginative suggestibility, and response expectancies.

    Science.gov (United States)

    Meyer, Eric C; Lynn, Steven Jay

    2011-07-01

    This study examined the relative impact of hypnotic inductions and several other variables on hypnotic and nonhypnotic responsiveness to imaginative suggestions. The authors examined how imaginative suggestibility, response expectancies, motivation to respond to suggestions, and hypnotist-induced performance standards affected participants' responses to both hypnotic and nonhypnotic suggestions and their suggestion-related experiences. Suggestions were administered to 5 groups of participants using a test-retest design: (a) stringent performance standards; (b) lenient performance standards; (c) hypnosis test-retest; (d) no-hypnosis test-retest; and (e) no-hypnosis/hypnosis control. The authors found no support for the influence of a hypnotic induction or performance standards on responding to suggestions but found considerable support for the role of imaginative suggestibility and response expectancies in predicting responses to both hypnotic and nonhypnotic suggestions.

  19. How computational models can help unlock biological systems.

    Science.gov (United States)

    Brodland, G Wayne

    2015-12-01

    With computational models playing an ever-increasing role in the advancement of science, it is important that researchers understand what it means to model something; recognize the implications of the conceptual, mathematical and algorithmic steps of model construction; and comprehend what models can and cannot do. Here, we use examples to show that models can serve a wide variety of roles, including hypothesis testing, generating new insights, deepening understanding, suggesting and interpreting experiments, tracing chains of causation, doing sensitivity analyses, integrating knowledge, and inspiring new approaches. We show that models can bring together information of different kinds and do so across a range of length scales, as they do in multi-scale, multi-faceted embryogenesis models, some of which connect gene expression, the cytoskeleton, cell properties, tissue mechanics, morphogenetic movements and phenotypes. Models cannot replace experiments nor can they prove that particular mechanisms are at work in a given situation. But they can demonstrate whether or not a proposed mechanism is sufficient to produce an observed phenomenon. Although the examples in this article are taken primarily from the field of embryo mechanics, most of the arguments and discussion are applicable to any form of computational modelling. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  20. Translator-computer interaction in action

    DEFF Research Database (Denmark)

    Bundgaard, Kristine; Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

    Though we lack empirically-based knowledge of the impact of computer-aided translation (CAT) tools on translation processes, it is generally agreed that all professional translators are now involved in some kind of translator-computer interaction (TCI), using O’Brien’s (2012) term. Taking a TCI perspective, this paper investigates the relationship between machines and humans in the field of translation, analysing a CAT process in which machine-translation (MT) technology was integrated into a translation-memory (TM) suite. After a review of empirical research into the impact of CAT tools ..., the study indicates that the tool helps the translator conform to project and customer requirements.

  1. Computational studies of tokamak plasmas

    International Nuclear Information System (INIS)

    Takizuka, Tomonori; Tsunematsu, Toshihide; Tokuda, Shinji

    1981-02-01

    Computational studies of tokamak plasmas have advanced extensively. Many computational codes have been developed using several kinds of models, i.e., the finite element formulation of the MHD equations, the time dependent multidimensional fluid model, and the particle model with the Monte-Carlo method. These codes are applied to the analyses of the equilibrium of an axisymmetric toroidal plasma (SELENE), the time evolution of the high-beta tokamak plasma (APOLLO), the low-n MHD stability (ERATO-J) and high-n ballooning mode stability (BOREAS) in the INTOR tokamak, the nonlinear MHD stability, such as the positional instability (AEOLUS-P), resistive internal mode (AEOLUS-I) etc., and the divertor functions. (author)

  2. Biochemical and computational analyses of two phenotypically related GALT mutations (S222N and S135L) that lead to atypical galactosemia

    Directory of Open Access Journals (Sweden)

    Benjamin Cocanougher

    2015-06-01

    Full Text Available Galactosemia is a metabolic disorder caused by mutations in the GALT gene [1,2]. We encountered a patient heterozygous for a known pathogenic H132Q mutation and a novel S222N variant of unknown significance [3]. Reminiscent of patients with the S135L mutation, our patient had loss of GALT enzyme activity in erythrocytes but a very mild clinical phenotype [3–8]. We performed splicing experiments and computational structural analyses to investigate the role of the novel S222N variant. Alamut software data predicted loss of splicing enhancers for the S222N and S135L mutations [9,10]. A cDNA library was generated from our patient's RNA to investigate for splicing errors, but no change in transcript length was seen [3]. In silico structural analysis was performed to investigate enzyme stability and attempt to understand the mechanism of the atypical galactosemia phenotype. Stability results are publicly available in the GALT Protein Database 2.0 [11–14]. Animations were created to give the reader a dynamic view of the enzyme structure and mutation locations. Protein database files and python scripts are included for further investigation.

  3. Computer code for quantitative ALARA evaluations

    International Nuclear Information System (INIS)

    Voilleque, P.G.

    1984-01-01

    A FORTRAN computer code has been developed to simplify the determination of whether dose reduction actions meet the as low as is reasonably achievable (ALARA) criterion. The calculations are based on the methodology developed for the Atomic Industrial Forum. The code is used for analyses of eight types of dose reduction actions, characterized as follows: reduce dose rate, reduce job frequency, reduce productive working time, reduce crew size, increase administrative dose limit for the task, and increase the workers' time utilization and dose utilization through (a) improved working conditions, (b) basic skill training, or (c) refresher training for special skills. For each type of action, two analysis modes are available. The first is a generic analysis in which the program computes potential benefits (in dollars) for a range of possible improvements, e.g., for a range of lower dose rates. Generic analyses are most useful in the planning stage and for evaluating the general feasibility of alternative approaches. The second is a specific analysis in which the potential annual benefits of a specific level of improvement and the annual implementation cost are compared. The potential benefits reflect savings in operational and societal costs that can be realized if occupational radiation doses are reduced. Because the potential benefits depend upon many variables which characterize the job, the workplace, and the workers, there is no unique relationship between the potential dollar savings and the dose savings. The computer code permits rapid quantitative analyses of alternatives and is a tool that supplements the health physicist's professional judgment. The program output provides a rational basis for decision-making and a record of the assumptions employed
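
    The core comparison the abstract describes - potential monetary benefit of a dose-saving action versus its implementation cost - can be sketched as below. The dollars-per-person-rem value and all example numbers are illustrative assumptions, not values from the Atomic Industrial Forum methodology or from the FORTRAN code itself.

      # Hedged sketch of an ALARA cost-benefit comparison for one dose-reduction
      # action; the $/person-rem value and all inputs are illustrative assumptions.

      def annual_collective_dose(dose_rate_rem_per_h, hours_per_job, jobs_per_year, crew_size):
          return dose_rate_rem_per_h * hours_per_job * jobs_per_year * crew_size

      def alara_evaluation(dose_before, dose_after, dollars_per_person_rem, annual_cost):
          """Return (net benefit, decision) for a specific dose-reduction action."""
          benefit = (dose_before - dose_after) * dollars_per_person_rem
          net = benefit - annual_cost
          return net, "justified" if net > 0 else "not justified"

      if __name__ == "__main__":
          before = annual_collective_dose(0.05, 8.0, 20, 4)   # person-rem/yr, assumed job
          after = annual_collective_dose(0.02, 8.0, 20, 4)    # after a dose-rate reduction
          net, decision = alara_evaluation(before, after,
                                           dollars_per_person_rem=2000.0,
                                           annual_cost=1500.0)
          print(f"Dose saved: {before - after:.1f} person-rem/yr, "
                f"net benefit ${net:.0f} ({decision})")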

  4. Suicidality and interrogative suggestibility.

    Science.gov (United States)

    Pritchard-Boone, Lea; Range, Lillian M

    2005-01-01

    All people are subject to memory suggestibility, but suicidal individuals may be especially so. The link between suicidality and suggestibility is unclear given mixed findings and methodological weaknesses of past research. To test the link between suicidality and interrogative suggestibility, 149 undergraduates answered questions about suicidal thoughts and reasons for living, and participated in a direct suggestibility procedure. As expected, suggestibility correlated with suicidality but accounted for little overall variance (4%). Mental health professionals might be able to take advantage of client suggestibility by directly telling suicidal persons to refrain from suicidal thoughts or actions.

  5. Benchmarking severe accident computer codes for heavy water reactor applications

    Energy Technology Data Exchange (ETDEWEB)

    Choi, J.H. [International Atomic Energy Agency, Vienna (Austria)

    2010-07-01

    Consideration of severe accidents at a nuclear power plant (NPP) is an essential component of the defence in depth approach used in nuclear safety. Severe accident analysis involves very complex physical phenomena that occur sequentially during various stages of accident progression. Computer codes are essential tools for understanding how the reactor and its containment might respond under severe accident conditions. International cooperative research programmes are established by the IAEA in areas that are of common interest to a number of Member States. These co-operative efforts are carried out through coordinated research projects (CRPs), typically 3 to 6 years in duration, and often involving experimental activities. Such CRPs allow a sharing of efforts on an international basis, foster team-building and benefit from the experience and expertise of researchers from all participating institutes. The IAEA is organizing a CRP on benchmarking severe accident computer codes for heavy water reactor (HWR) applications. The CRP scope includes defining the severe accident sequence and conducting benchmark analyses for HWRs, evaluating the capabilities of existing computer codes to predict important severe accident phenomena, and suggesting necessary code improvements and/or new experiments to reduce uncertainties. The CRP has been planned on the advice and with the support of the IAEA Nuclear Energy Department's Technical Working Groups on Advanced Technologies for HWRs. (author)

  6. Computer Based Road Accident Reconstruction Experiences

    Directory of Open Access Journals (Sweden)

    Milan Batista

    2005-03-01

    Full Text Available Since road accident analyses and reconstructions are increasingly based on specific computer software for simulation of vehicle driving dynamics and collision dynamics, and for simulation of a set of trial runs from which the model that best describes a real event can be selected, the paper presents an overview of some computer software and methods available to accident reconstruction experts. Besides being time-saving, when properly used such computer software can provide more authentic and more trustworthy accident reconstruction, therefore practical experiences while using computer software tools for road accident reconstruction obtained in the Transport Safety Laboratory at the Faculty for Maritime Studies and Transport of the University of Ljubljana are presented and discussed. This paper addresses also software technology for extracting maximum information from the accident photo-documentation to support accident reconstruction based on the simulation software, as well as the field work of reconstruction experts or police on the road accident scene defined by this technology.

  7. Metagenomic analyses of bacteria on human hairs: a qualitative assessment for applications in forensic science.

    Science.gov (United States)

    Tridico, Silvana R; Murray, Dáithí C; Addison, Jayne; Kirkbride, Kenneth P; Bunce, Michael

    2014-01-01

    Mammalian hairs are one of the most ubiquitous types of trace evidence collected in the course of forensic investigations. However, hairs that are naturally shed or that lack roots are problematic substrates for DNA profiling; these hair types often contain insufficient nuclear DNA to yield short tandem repeat (STR) profiles. Whilst there have been a number of initial investigations evaluating the value of metagenomics analyses for forensic applications (e.g. examination of computer keyboards), there have been no metagenomic evaluations of human hairs-a substrate commonly encountered during forensic practice. This present study attempts to address this forensic capability gap, by conducting a qualitative assessment into the applicability of metagenomic analyses of human scalp and pubic hair. Forty-two DNA extracts obtained from human scalp and pubic hairs generated a total of 79,766 reads, yielding 39,814 reads post control and abundance filtering. The results revealed the presence of unique combinations of microbial taxa that can enable discrimination between individuals and signature taxa indigenous to female pubic hairs. Microbial data from a single co-habiting couple added an extra dimension to the study by suggesting that metagenomic analyses might be of evidentiary value in sexual assault cases when other associative evidence is not present. Of all the data generated in this study, the next-generation sequencing (NGS) data generated from pubic hair held the most potential for forensic applications. Metagenomic analyses of human hairs may provide independent data to augment other forensic results and possibly provide association between victims of sexual assault and offender when other associative evidence is absent. Based on results garnered in the present study, we believe that with further development, bacterial profiling of hair will become a valuable addition to the forensic toolkit.

  8. Numerically-analysed Multiwavelet Transform computations ...

    African Journals Online (AJOL)

    pc

    2018-03-05


  9. General-purpose parallel simulator for quantum computing

    International Nuclear Information System (INIS)

    Niwa, Jumpei; Matsumoto, Keiji; Imai, Hiroshi

    2002-01-01

    With current technologies, it seems to be very difficult to implement quantum computers with many qubits. It is therefore of importance to simulate quantum algorithms and circuits on the existing computers. However, for a large-size problem, the simulation often requires more computational power than is available from sequential processing. Therefore, simulation methods for parallel processors are required. We have developed a general-purpose simulator for quantum algorithms/circuits on the parallel computer (Sun Enterprise4500). It can simulate algorithms/circuits with up to 30 qubits. In order to test efficiency of our proposed methods, we have simulated Shor's factorization algorithm and Grover's database search, and we have analyzed robustness of the corresponding quantum circuits in the presence of both decoherence and operational errors. The corresponding results, statistics, and analyses are presented in this paper
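
    To give a flavour of the kind of state-vector simulation described above, here is a minimal NumPy sketch of Grover's search on a dense state vector. It is unrelated to the authors' parallel simulator and, unlike their study, ignores decoherence and operational errors; the qubit count and marked item are arbitrary.

      # Minimal state-vector simulation of Grover's search; illustrative sketch only,
      # not the parallel simulator described in the abstract (no noise model).
      import numpy as np

      def grover_search(n_qubits, marked):
          dim = 2 ** n_qubits
          state = np.full(dim, 1.0 / np.sqrt(dim))        # uniform superposition
          iterations = int(np.floor(np.pi / 4 * np.sqrt(dim)))
          for _ in range(iterations):
              state[marked] *= -1.0                       # oracle: phase flip on target
              mean = state.mean()
              state = 2 * mean - state                    # diffusion: inversion about mean
          return np.argmax(np.abs(state) ** 2), np.abs(state[marked]) ** 2

      if __name__ == "__main__":
          found, prob = grover_search(n_qubits=10, marked=613)
          print(f"Most probable outcome: {found}, success probability {prob:.3f}")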

  10. Computer-Aided Qualitative Data Analysis with Word

    Directory of Open Access Journals (Sweden)

    Bruno Nideröst

    2002-05-01

    Full Text Available Despite some fragmentary references in the literature about qualitative methods, it is fairly unknown that Word can be successfully used for computer-aided Qualitative Data Analyses (QDA. Based on several Word standard operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to have first experiences with computer-aided analysis before investing time and money in a specialized QDA Program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations towards computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225

  11. Analysis of the reasons of recently some radioactive source accidents and suggestions for management countermeasures

    International Nuclear Information System (INIS)

    Su Yongjie; Feng Youcai; Song Chenxiu; Gao Huibin; Xing Jinsong; Pang Xinxin; Wang Xiaoqing; Wei Hong

    2007-01-01

    The article introduces some recent radioactive source accidents in China and analyses the reasons for these accidents. Some important issues that arose in the process of implementing the new regulation are summarized, and some suggestions for managing radioactive sources are made. (authors)

  12. Computer stress study of bone with computed tomography

    International Nuclear Information System (INIS)

    Linden, M.J.; Marom, S.A.; Linden, C.N.

    1986-01-01

    A computer processing tool has been developed which, together with a finite element program, determines the stress-deformation pattern in a long bone, utilizing Computed Tomography (CT) data files for the geometry and radiographic density information. The geometry, together with mechanical properties and boundary conditions (loads and displacements), comprises the input of the finite element (FE) computer program. The output of the program is the stresses and deformations in the bone. The processor is capable of developing an accurate three-dimensional finite element model from a scanned human long bone due to the high pixel resolution of CT and the local mechanical properties determined from the radiographic densities of the scanned bone. The processor, together with the finite element program, serves first as an analysis tool towards improved understanding of bone function and remodelling. In this first stage, actual long bones may be scanned and analyzed under applied loads and displacements, determined from existing gait analyses. The stress-deformation patterns thus obtained may be used for studying the biomechanical behavior of particular long bones such as bones with implants and with osteoporosis. As a second stage, this processor may serve as a diagnostic tool for analyzing the biomechanical response of a specific patient's long bone under applied loading by utilizing a CT data file of the specific bone as an input to the processor with the FE program.
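
    A common ingredient of such CT-based models is the mapping from voxel radiographic density to a local elastic modulus for the finite element mesh. The Python sketch below shows one frequently used power-law form of that mapping; the calibration constants are illustrative assumptions and are not the values used by the processor described here.

      # Sketch: map CT Hounsfield units to a local Young's modulus for FE material
      # assignment; the linear HU-to-density calibration and the Carter-Hayes-type
      # power law use illustrative constants, not the authors' values.
      import numpy as np

      def hu_to_density(hu, slope=0.0008, intercept=0.3):
          """Assumed linear calibration from Hounsfield units to apparent density [g/cm^3]."""
          return np.clip(slope * np.asarray(hu, dtype=float) + intercept, 0.01, None)

      def density_to_modulus(rho, c=3790.0, exponent=3.0):
          """Assumed power-law relation E [MPa] = c * rho^exponent."""
          return c * rho ** exponent

      if __name__ == "__main__":
          hu_voxels = np.array([200.0, 800.0, 1400.0])     # cancellous to cortical range
          rho = hu_to_density(hu_voxels)
          E = density_to_modulus(rho)
          for h, r, e in zip(hu_voxels, rho, E):
              print(f"HU {h:6.0f} -> rho {r:.2f} g/cm^3 -> E {e:8.0f} MPa")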

  13. Mississippi Curriculum Framework for Computer Information Systems Technology. Computer Information Systems Technology (Program CIP: 52.1201--Management Information Systems & Business Data). Computer Programming (Program CIP: 52.1201). Network Support (Program CIP: 52.1290--Computer Network Support Technology). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…

  14. A Developmental Scale of Mental Computation with Part-Whole Numbers

    Science.gov (United States)

    Callingham, Rosemary; Watson, Jane

    2004-01-01

    In this article, data from a study of the mental computation competence of students in grades 3 to 10 are presented. Students responded to mental computation items, presented orally, that included operations applied to fractions, decimals and percents. The data were analysed using Rasch modelling techniques, and a six-level hierarchy of part-whole…

  15. Accurate measurement of surface areas of anatomical structures by computer-assisted triangulation of computed tomography images

    Energy Technology Data Exchange (ETDEWEB)

    Allardice, J.T.; Jacomb-Hood, J.; Abulafi, A.M.; Williams, N.S. (Royal London Hospital (United Kingdom)); Cookson, J.; Dykes, E.; Holman, J. (London Hospital Medical College (United Kingdom))

    1993-05-01

    There is a need for accurate surface area measurement of internal anatomical structures in order to define light dosimetry in adjunctive intraoperative photodynamic therapy (AIOPDT). The authors investigated whether computer-assisted triangulation of serial sections generated by computed tomography (CT) scanning can give an accurate assessment of the surface area of the walls of the true pelvis after anterior resection and before colorectal anastomosis. They show that the technique of paper density tessellation is an acceptable method of measuring the surface areas of phantom objects, with a maximum error of 0.5%, and is used as the gold standard. Computer-assisted triangulation of CT images of standard geometric objects and accurately-constructed pelvic phantoms gives a surface area assessment with a maximum error of 2.5% compared with the gold standard. The CT images of 20 patients' pelves have been analysed by computer-assisted triangulation and this shows the surface area of the walls varies from 143 cm² to 392 cm². (Author).
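
    Once the CT contours have been triangulated, the surface-area computation reduces to summing triangle areas over the mesh. A minimal sketch, assuming the triangulation step has already produced vertex and face arrays (the tetrahedron in the example is synthetic, not patient data):

      # Sketch: surface area of a triangulated surface as the sum of triangle areas;
      # the vertex/face arrays would come from triangulating the CT contours.
      import numpy as np

      def mesh_surface_area(vertices, faces):
          v = np.asarray(vertices, dtype=float)
          f = np.asarray(faces, dtype=int)
          a = v[f[:, 1]] - v[f[:, 0]]
          b = v[f[:, 2]] - v[f[:, 0]]
          return 0.5 * np.linalg.norm(np.cross(a, b), axis=1).sum()

      if __name__ == "__main__":
          # Unit tetrahedron as a stand-in for a triangulated pelvic surface.
          verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
          tris = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
          print(f"Surface area: {mesh_surface_area(verts, tris):.4f}")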

  16. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  17. New Computer Terms in Bloggers’ Language

    Directory of Open Access Journals (Sweden)

    Vilija Celiešienė

    2012-06-01

    Full Text Available The article presents an analysis of new words in computer terminology that make their way into blogs and analyzes how the official neologisms and computer terms, especially the equivalents of barbarisms, are employed in everyday use. The article also discusses the ways of including the new computer terms into texts. Blogs on information technology topics are the objects of the research. The analysis of the aforementioned blogs allowed highlighting certain trends in the use of new computer terms. An observation was made that even though the authors of the blogs could freely choose their writing style and were not bound by the standards of literary language, so that their language was full of non-standard vocabulary, a degree of self-control regarding the language used could still be noticed. An interest in novelties of computer terminology and a tendency to accept some of the suggested new Lithuanian and loaned computer terms were also noticed. When using the new words the bloggers frequently employed specific graphical elements and/or comments. The graphical elements were often chosen by bloggers to express doubt regarding the suitability of the suggested loanword. When attempting to explain the meaning of a new word to the readers, the bloggers tended to post comments about the new computer terms.

  18. Heterogeneous compute in computer vision: OpenCL in OpenCV

    Science.gov (United States)

    Gasparakis, Harris

    2014-02-01

    We explore the relevance of Heterogeneous System Architecture (HSA) in Computer Vision, both as a long term vision, and as a near term emerging reality via the recently ratified OpenCL 2.0 Khronos standard. After a brief review of OpenCL 1.2 and 2.0, including HSA features such as Shared Virtual Memory (SVM) and platform atomics, we identify what genres of Computer Vision workloads stand to benefit by leveraging those features, and we suggest a new mental framework that replaces GPU compute with hybrid HSA APU compute. As a case in point, we discuss, in some detail, popular object recognition algorithms (part-based models), emphasizing the interplay and concurrent collaboration between the GPU and CPU. We conclude by describing how OpenCL has been incorporated in OpenCV, a popular open source computer vision library, emphasizing recent work on the Transparent API, to appear in OpenCV 3.0, which unifies the native CPU and OpenCL execution paths under a single API, allowing the same code to execute either on CPU or on a OpenCL enabled device, without even recompiling.
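
    The Transparent API mentioned above is also exposed in OpenCV's Python bindings through cv2.UMat, so the same pipeline runs on the CPU or on an OpenCL device when one is available. A minimal sketch (the input file name is a placeholder, and this is only an illustration of the T-API idea, not the part-based object recognition workloads discussed in the paper):

      # Sketch of OpenCV's Transparent API from Python: wrapping an image in cv2.UMat
      # lets the same calls run on an OpenCL device when one is available.
      import cv2

      print("OpenCL available:", cv2.ocl.haveOpenCL())
      cv2.ocl.setUseOpenCL(True)                     # request the OpenCL execution path

      img = cv2.imread("frame.png")                  # placeholder input image
      if img is None:
          raise SystemExit("frame.png not found")

      u = cv2.UMat(img)                              # hand the data to the T-API
      gray = cv2.cvtColor(u, cv2.COLOR_BGR2GRAY)     # runs via OpenCL if enabled
      edges = cv2.Canny(gray, 50, 150)
      result = edges.get()                           # download back to a NumPy array

      print("Edge image shape:", result.shape)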

  19. A data management program for the Electra 800 automatic analyser.

    Science.gov (United States)

    Cambus, J P; Nguyen, F; de Graeve, J; Aragon, B; Valdiguie, P

    1994-10-01

    The Electra 800 automatic coagulation analyser rapidly performs most chronometric coagulation tests with high precision. To facilitate data handling, software, adaptable to any PC running under MS-DOS, was written to manage the analyser. Data are automatically collected via the RS232 interface or can be manually input. The software can handle 64 different analyses, all entirely 'user defined'. An 'electronic worksheet' presents the results in pages of ten patients. This enables the operator to assess the data and to perform verifications or complementary tests if necessary. All results outside a predetermined range can be flagged and results can be deleted, modified or added. A patient's previous files can be recalled as the data are archived at the end of the day. A 120 Mb disk can store approximately 130,000 patient files. A daily archive function can print the day's work in alphabetical order. A communication protocol allows connection to a mainframe computer. This program and the user's manual are available on request, free of charge, from the authors.

  20. A real-time transfer function analyser program for PFR

    International Nuclear Information System (INIS)

    McWilliam, D.

    1980-03-01

    A transfer function analyser software package has been produced which is believed to constitute a significant advance over others reported in the literature. The main advantages of the system are its operating speed, especially at low frequencies, which is due to its use of part-cycle integration and its high degree of interactive operator control. The driving sine wave, the return signals and the computed vector diagrams are displayed on TV type visual display units. Data output is by means of an incremental graph plotter or an IBM typewriter. (author)

  1. Analyses of transient plant response under emergency situations. 2

    International Nuclear Information System (INIS)

    Koyama, Kazuya; Hishida, Masahiko

    2000-03-01

    In order to support development of the dynamic reliability analysis program DYANA, analyses were made of the event sequences anticipated under emergency situations using the plant dynamics simulation computer code Super-COPD. In this work 9 sequences were analyzed and integrated into an input file for preparing the functions for DYANA, using the analytical model and input data which were developed for Super-COPD in the previous work. These sequences, which are categorized as PLOHS (Protected Loss of Heat Sink) events, could not be analyzed in the previous work. (author)

  2. Heat Transfer treatment in computer codes for safety analysis

    International Nuclear Information System (INIS)

    Jerele, A.; Gregoric, M.

    1984-01-01

    The increased number of operating nuclear power plants has stressed the importance of nuclear safety evaluation. For this reason, in accordance with regulatory commission requirements, safety analyses are performed with computer codes. In this paper the part of the thermohydraulic models dealing with wall-to-fluid heat transfer correlations in the computer codes TRAC-PF1, RELAP4/MOD5, RELAP5/MOD1 and COBRA-IV is discussed. (author)
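
    As one concrete example of the class of wall-to-fluid correlations such codes contain, the sketch below evaluates the classic Dittus-Boelter correlation for single-phase turbulent forced convection. It is shown for illustration only and is not claimed to be the specific correlation set of TRAC-PF1, RELAP or COBRA-IV; the channel conditions are assumed numbers.

      # Illustrative single-phase forced-convection correlation (Dittus-Boelter):
      # Nu = 0.023 Re^0.8 Pr^n, h = Nu * k / D. Not the codes' actual correlation set.

      def dittus_boelter_htc(re, pr, k_fluid, d_hydraulic, heating=True):
          """Wall-to-fluid heat transfer coefficient [W/m^2K] for turbulent pipe flow."""
          n = 0.4 if heating else 0.3          # exponent depends on heating vs cooling
          nu = 0.023 * re ** 0.8 * pr ** n     # Nusselt number
          return nu * k_fluid / d_hydraulic

      if __name__ == "__main__":
          # Rough PWR-like channel conditions (assumed numbers for illustration).
          h = dittus_boelter_htc(re=5.0e5, pr=1.0, k_fluid=0.55, d_hydraulic=0.012)
          print(f"h = {h / 1000:.1f} kW/m^2K")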

  3. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

    Full Text Available Abstract Background The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results We have developed IDEA (Interactive Display for Evolutionary Analyses, an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  4. Computing several eigenpairs of Hermitian problems by conjugate gradient iterations

    International Nuclear Information System (INIS)

    Ovtchinnikov, E.E.

    2008-01-01

    The paper is concerned with algorithms for computing several extreme eigenpairs of Hermitian problems based on the conjugate gradient method. We analyse computational strategies employed by various algorithms of this kind reported in the literature and identify their limitations. Our criticism is illustrated by numerical tests on a set of problems from electronic structure calculations and acoustics
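
    For readers who want to experiment with conjugate-gradient-type eigensolvers for several extreme eigenpairs, SciPy's LOBPCG implementation is a convenient, widely available stand-in. The sketch below is not one of the algorithms analysed in the paper; the test matrix (a real symmetric tridiagonal with a well-separated spectrum) is an arbitrary assumption.

      # Sketch: several extreme eigenpairs of a sparse Hermitian (here real symmetric)
      # matrix with LOBPCG, a CG-type block method; not the paper's algorithms.
      import numpy as np
      from scipy.sparse import diags
      from scipy.sparse.linalg import lobpcg

      n, k = 1000, 4
      main = np.arange(1.0, n + 1.0)                  # well-separated spectrum ~1..n
      off = -0.1 * np.ones(n - 1)
      A = diags([main, off, off], offsets=[0, -1, 1], format="csr")

      rng = np.random.default_rng(0)
      X = rng.standard_normal((n, k))                 # block of initial guess vectors

      # largest=False asks for the smallest eigenvalues (the other end of the spectrum).
      eigvals, eigvecs = lobpcg(A, X, largest=False, tol=1e-6, maxiter=200)
      print("Smallest eigenvalues:", np.sort(eigvals))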

  5. Genome-wide and expression-profiling analyses suggest the main cytochrome P450 genes related to pyrethroid resistance in the malaria vector, Anopheles sinensis (Diptera: Culicidae).

    Science.gov (United States)

    Yan, Zheng-Wen; He, Zheng-Bo; Yan, Zhen-Tian; Si, Feng-Ling; Zhou, Yong; Chen, Bin

    2018-02-02

    Anopheles sinensis is one of the major malaria vectors. However, pyrethroid resistance in An. sinensis is threatening malaria control. Cytochrome P450-mediated detoxification is an important pyrethroid resistance mechanism that has been unexplored in An. sinensis. In this study, we performed a comprehensive analysis of the An. sinensis P450 gene superfamily with special attention to their role in pyrethroid resistance using bioinformatics and molecular approaches. Our data revealed the presence of 112 individual P450 genes in An. sinensis, which were classified into four major clans (mitochondrial, CYP2, CYP3 and CYP4), 18 families and 50 subfamilies. Sixty-seven genes formed nine gene clusters, and genes within the same cluster and the same gene family had a similar gene structure. Phylogenetic analysis showed that most of An. sinensis P450s (82/112) had very close 1: 1 orthology with Anopheles gambiae P450s. Five genes (AsCYP6Z2, AsCYP6P3v1, AsCYP6P3v2, AsCYP9J5 and AsCYP306A1) were significantly upregulated in three pyrethroid-resistant populations in both RNA-seq and RT-qPCR analyses, suggesting that they could be the most important P450 genes involved in pyrethroid resistance in An. sinensis. Our study provides insight on the diversity of An. sinensis P450 superfamily and basis for further elucidating pyrethroid resistance mechanism in this mosquito species. © 2018 Society of Chemical Industry. © 2018 Society of Chemical Industry.

  6. Computation of water hammer protection of modernized pumping station

    Directory of Open Access Journals (Sweden)

    Himr Daniel

    2014-03-01

    Finally, a pump trip test was performed to verify that the system worked correctly. The test showed that the pressure pulsations are lower (better) than the computation predicted. This discrepancy was further analysed.

  7. Complex of two-dimensional multigroup programs for neutron-physical computations of nuclear reactor

    International Nuclear Information System (INIS)

    Karpov, V.A.; Protsenko, A.N.

    1975-01-01

    The mathematical aspects of the two-dimensional multigroup method for neutron-physical computation of a nuclear reactor are briefly stated. Problems of algorithmization and BESM-6 computer realisation of the multigroup diffusion approximation in hexagonal and rectangular calculation lattices are analysed. The results of the computation of a fast critical assembly having a complicated core composition are given. The accuracy of the computed criticality, neutron field distributions and efficiency of absorbing rods obtained with the developed computer programs is estimated. (author)

  8. Immunotoxicity of nanoparticles: a computational study suggests that CNTs and C60 fullerenes might be recognized as pathogens by Toll-like receptors

    Science.gov (United States)

    Turabekova, M.; Rasulev, B.; Theodore, M.; Jackman, J.; Leszczynska, D.; Leszczynski, J.

    2014-03-01

    Over the last decade, a great deal of attention has been devoted to studying the inflammatory response upon exposure to multi/single-walled carbon nanotubes (CNTs) and different fullerene derivatives. In particular, carbon nanoparticles are reported to provoke substantial inflammation in alveolar and bronchial epithelial cells, epidermal keratinocytes, cultured monocyte-macrophage cells, etc. We suggest a hypothetical model providing the potential mechanistic explanation for immune and inflammatory responses observed upon exposure to carbon nanoparticles. Specifically, we performed a theoretical study to analyze CNT and C60 fullerene interactions with the available X-ray structures of Toll-like receptor (TLR) homo- and hetero-dimer extracellular domains. This assumption was based on the fact that, similar to the known TLR ligands, both CNTs and fullerenes induce, in cells, the secretion of certain inflammatory protein mediators, such as interleukins and chemokines. These proteins are observed within inflammation downstream processes resulting from the ligand-molecule-dependent inhibition or activation of TLR-induced signal transduction. Our computational studies have shown that the internal hydrophobic pockets of some TLRs might be capable of binding small-sized carbon nanostructures (5,5 armchair SWCNTs containing 11 carbon atom layers and C60 fullerene). High binding scores and minor structural alterations induced in TLR ectodomains upon binding C60 and CNTs further supported our hypothesis. Additionally, the proposed hypothesis is strengthened by the indirect experimental findings indicating that CNTs and fullerenes induce an excessive expression of specific cytokines and chemokines (i.e. IL-8 and MCP1).

  9. Skills and the appreciation of computer art

    Science.gov (United States)

    Boden, Margaret A.

    2016-04-01

    The appreciation of art normally includes recognition of the artist's skills in making it. Most people cannot appreciate computer art in that way, because they know little or nothing about coding. Various suggestions are made about how computer artists and/or curators might design and present computer art in such a way as to make the relevant making-skills more intelligible.

  10. Use of electronic computers for processing of spectrometric data in instrument neutron activation analysis

    International Nuclear Information System (INIS)

    Vyropaev, V.Ya.; Zlokazov, V.B.; Kul'kina, L.I.; Maslov, O.D.; Fefilov, B.V.

    1977-01-01

    A computer program is described for processing gamma spectra in the instrumental activation analysis of multicomponent objects. Structural diagrams of various variants of connection with the computer are presented. The possibility of using a mini-computer as an analyser and for preliminary processing of gamma spectra is considered

  11. TV time but not computer time is associated with cardiometabolic risk in Dutch young adults.

    Science.gov (United States)

    Altenburg, Teatske M; de Kroon, Marlou L A; Renders, Carry M; Hirasing, Remy; Chinapaw, Mai J M

    2013-01-01

    TV time and total sedentary time have been positively related to biomarkers of cardiometabolic risk in adults. We aim to examine the association of TV time and computer time separately with cardiometabolic biomarkers in young adults. Additionally, the mediating role of waist circumference (WC) is studied. Data of 634 Dutch young adults (18-28 years; 39% male) were used. Cardiometabolic biomarkers included indicators of overweight, blood pressure, blood levels of fasting plasma insulin, cholesterol, glucose, triglycerides and a clustered cardiometabolic risk score. Linear regression analyses were used to assess the cross-sectional association of self-reported TV and computer time with cardiometabolic biomarkers, adjusting for demographic and lifestyle factors. Mediation by WC was checked using the product-of-coefficient method. TV time was significantly associated with triglycerides (B = 0.004; CI = [0.001;0.05]) and insulin (B = 0.10; CI = [0.01;0.20]). Computer time was not significantly associated with any of the cardiometabolic biomarkers. We found no evidence for WC to mediate the association of TV time or computer time with cardiometabolic biomarkers. We found a significantly positive association of TV time with cardiometabolic biomarkers. In addition, we found no evidence for WC as a mediator of this association. Our findings suggest a need to distinguish between TV time and computer time within future guidelines for screen time.
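
    For readers unfamiliar with the product-of-coefficients method used to test mediation by waist circumference, the sketch below shows the two regression steps on synthetic data with statsmodels. The variable names, data and model form are invented for illustration and do not reproduce the study's models or covariate set.

      # Sketch of the product-of-coefficients mediation test (a*b) with statsmodels,
      # on synthetic data; variables are invented stand-ins for TV time (X), waist
      # circumference (M) and a cardiometabolic biomarker (Y), without covariates.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 634
      tv_time = rng.normal(2.0, 1.0, n)                     # hours/day (synthetic)
      waist = 80 + 1.5 * tv_time + rng.normal(0, 5, n)      # mediator
      biomarker = 1.0 + 0.10 * waist + 0.02 * tv_time + rng.normal(0, 1, n)

      # Path a: does X predict the mediator M?
      a_model = sm.OLS(waist, sm.add_constant(tv_time)).fit()
      a = a_model.params[1]

      # Path b: does M predict Y when controlling for X?
      Xmat = sm.add_constant(np.column_stack([tv_time, waist]))
      b_model = sm.OLS(biomarker, Xmat).fit()
      b = b_model.params[2]

      indirect = a * b                                      # product of coefficients
      print(f"a = {a:.3f}, b = {b:.3f}, indirect effect a*b = {indirect:.3f}")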

  12. Emergency response guide-B ECCS guideline evaluation analyses for N reactor

    International Nuclear Information System (INIS)

    Chapman, J.C.; Callow, R.A.

    1989-07-01

    INEL conducted two ECCS analyses for Westinghouse Hanford. Both analyses will assist in the evaluation of proposed changes to the N Reactor Emergency Response Guide-B (ERG-B) Emergency Core Cooling System (ECCS) guideline. The analyses were a sensitivity study for reduced-ECCS flow rates and a mechanistically determined confinement steam source for a delayed-ECCS LOCA sequence. The reduced-ECCS sensitivity study established the maximum allowable reduction in ECCS flow as a function of time after core refill for a large break loss-of-coolant accident (LOCA) sequence in the N Reactor. The maximum allowable ECCS flow reduction is defined as the maximum flow reduction for which ECCS continues to provide adequate core cooling. The delayed-ECCS analysis established the liquid and steam break flows and enthalpies during the reflood of a hot core following a delayed ECCS injection LOCA sequence. A simulation of a large, hot leg manifold break with a seven-minute ECCS injection delay was used as a representative LOCA sequence. Both analyses were performed using the RELAP5/MOD2.5 transient computer code. 13 refs., 17 figs., 3 tabs

  13. Computer Aided Design in FE. Some Suggestions on the Inclusion of CAD Topics in Mechanical Engineering Courses. An Occasional Paper.

    Science.gov (United States)

    Ingham, P. C.

    This report investigates the feasibility of including computer aided design (CAD) materials in engineering courses. Section 1 briefly discusses the inevitability of CAD being adopted widely by British industry and the consequent need for its inclusion in engineering syllabi at all levels. A short description of what is meant by CAD follows in…

  14. PALSfit3: A software package for analysing positron lifetime spectra

    DEFF Research Database (Denmark)

    Kirkegaard, Peter; Olsen, Jens V.; Eldrup, Morten Mostgaard

    The present report describes a Windows based computer program called PALSfit3. The purpose of the program is to carry out analyses of spectra that have been measured by positron annihilation lifetime spectroscopy (PALS). PALSfit3 is based on the well tested PATFIT and PALS fit programs, which hav...... in a text window. PALSfit3 is verified on Windows XP and Windows 7, 8 and 10. The PALSfit3 software can be acquired from the Technical University of Denmark (http://PALSfit.dk)...
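
    As a toy illustration of the kind of model a PALS analysis involves, the sketch below fits a sum of two exponential decay components plus a constant background to a simulated lifetime histogram with SciPy. Unlike PALSfit3 it ignores the convolution with the instrument resolution function and source corrections, and all numbers are synthetic.

      # Toy positron-lifetime fit: sum of two exponential decays plus background,
      # fitted with SciPy; unlike PALSfit3 this ignores the instrument resolution
      # function and source corrections, so it is only a conceptual illustration.
      import numpy as np
      from scipy.optimize import curve_fit

      def spectrum(t, i1, tau1, i2, tau2, bg):
          return i1 * np.exp(-t / tau1) + i2 * np.exp(-t / tau2) + bg

      rng = np.random.default_rng(2)
      t = np.arange(0, 10.0, 0.025)                      # ns, channel centres
      true = (8000.0, 0.16, 1500.0, 0.40, 20.0)          # intensities, lifetimes (ns), bg
      counts = rng.poisson(spectrum(t, *true)).astype(float)

      p0 = (5000.0, 0.2, 1000.0, 0.5, 10.0)              # rough starting guess
      popt, pcov = curve_fit(spectrum, t, counts, p0=p0,
                             sigma=np.sqrt(np.clip(counts, 1, None)))
      print("Fitted lifetimes (ns):", popt[1], popt[3])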

  15. Some neutronics and thermal-hydraulics codes for reactor analysis using personal computers

    International Nuclear Information System (INIS)

    Woodruff, W.L.

    1990-01-01

    Some neutronics and thermal-hydraulics codes formerly available only for main frame computers may now be run on personal computers. Brief descriptions of the codes are provided. Running times for some of the codes are compared for an assortment of personal and main frame computers. With some limitations in detail, personal computer versions of the codes can be used to solve many problems of interest in reactor analyses at very modest costs. 11 refs., 4 tabs

  16. Are Dysphoric Individuals More Suggestible or Less Suggestible Than Nondysphoric Individuals?

    Science.gov (United States)

    MacFarland, Wendy L.; Morris, Steven J.

    1998-01-01

    Dysphoric individuals are shown to be susceptible to interrogative suggestion, whether in the form of leading questions or interrogative pressure. The association of a clinically relevant condition of dysphoria (depression) with relatively high levels of suggestibility was investigated in a college student population (N=139). Applicability to…

  17. The Efficient Use of Vector Computers with Emphasis on Computational Fluid Dynamics : a GAMM-Workshop

    CERN Document Server

    Gentzsch, Wolfgang

    1986-01-01

    The GAMM Committee for Numerical Methods in Fluid Mechanics organizes workshops which should bring together experts of a narrow field of computational fluid dynamics (CFD) to exchange ideas and experiences in order to speed-up the development in this field. In this sense it was suggested that a workshop should treat the solution of CFD problems on vector computers. Thus we organized a workshop with the title "The efficient use of vector computers with emphasis on computational fluid dynamics". The workshop took place at the Computing Centre of the University of Karlsruhe, March 13-15, 1985. Participation had been restricted to 22 people from 7 countries. 18 papers have been presented. In the announcement of the workshop we wrote: "Fluid mechanics has actively stimulated the development of superfast vector computers like the CRAY's or CYBER 205. Now these computers in their turn stimulate the development of new algorithms which result in a high degree of vectorization (scalar/vectorized execution-time). But w...

  18. [Computer program "PANCREAS"].

    Science.gov (United States)

    Jakubowicz, J; Jankowski, M; Szomański, B; Switka, S; Zagórowicz, E; Pertkiewicz, M; Szczygieł, B

    1998-01-01

    Contemporary computer technology allows precise and fast analysis of large databases. Widespread and common use depends on appropriate, user-friendly software, which is usually lacking for specialized medical applications. The aim of this work was to develop an integrated system designed to store, explore and analyze data of patients treated for pancreatic cancer. For that purpose the database administration system MS Visual FoxPro 3.0 was used and a special application, conforming to the ISO 9000 series, was developed. The system works under MS Windows 95, with the possibility of easy adaptation to MS Windows 3.11 or MS Windows NT through the graphical user interface. The system stores personal data, laboratory results, imaging and histological findings, and information on the treatment course and complications. It archives these data and enables the preparation of reports according to individual and statistical needs. Help functions and security settings also allow the system to be used by those not familiar with computer science.

  19. Input/output routines for a hybrid computer

    International Nuclear Information System (INIS)

    Izume, Akitada; Yodo, Terutaka; Sakama, Iwao; Sakamoto, Akira; Miyake, Osamu

    1976-05-01

    This report is concerned with data processing programs for a hybrid computer system. In particular, the pre-processing of magnetic tapes recorded by the FACOM 270/25 data logging system during dynamic experiments in the 50 MW steam generator test facility is described in detail. Magnetic tape is a most effective recording medium for data logging, but recording formats differ between data logging systems. In our section, the final data analyses are performed on data held on the disk of the EAI-690 hybrid computer system, and to transfer all the required information from magnetic tapes to the disk, magnetic tape editing and data transfer are carried out by the NEAC-3200 sub-computer system. This report is written as a manual and reference handbook for pre-processing of data between different types of computers. (auth.)

  20. Physicist or computer specialist?

    Energy Technology Data Exchange (ETDEWEB)

    Clifton, J S [University College Hospital, London (United Kingdom)

    1966-06-15

    Since to most clinicians physical and computer science are two of the great mysteries of the world, the physicist in a hospital is expected by clinicians to be fully conversant with, and competent to make profound pronouncements on, all methods of computing, specific computing problems, and the suitability of computing machinery ranging from desk calculators to Atlas. This is not surprising, since the proportion of the syllabus devoted to physics and mathematics in an M.B. degree is indeed meagre, and the word 'computer' has been surrounded with an aura of mysticism which suggests that it is some fantastic piece of electronic gadgetry comprehensible only to a veritable genius. The clinician consequently turns to the only scientific colleague with whom he has direct contact - the medical physicist - and expects him to be an authority. The physicist is thus thrust, however unwillingly, into the forefront of the advance of computer assistance to scientific medicine. It is therefore essential for him to acquire sufficient knowledge of computing science to enable him to provide satisfactory answers to the clinicians' queries, to proffer more detailed advice on programming, and to convince clinicians that the computer is really a 'simpleton' which can only add and subtract, and even that only under instruction.

  1. The concept of computer software designed to identify and analyse logistics costs in agricultural enterprises

    Directory of Open Access Journals (Sweden)

    Karol Wajszczyk

    2009-01-01

    Full Text Available The study comprised research, development and computer programming work on a concept for an IT tool to be used in the identification and analysis of logistics costs in agricultural enterprises in terms of the process-based approach. As a result of the research and programming work, an overall functional and IT concept of software for the identification and analysis of logistics costs in agricultural enterprises was developed.

  2. Amdel on-line analyser at Rooiberg Tin Limited

    International Nuclear Information System (INIS)

    Owen, T.V.

    1987-01-01

    An Amdel on-line analysis system was installed on the 'A' mine tin flotation plant at Rooiberg in April 1984. The installation was motivated by the large variations in the feed grade to the plant and the resulting need for rapid operational adjustments to control concentrate grades and thereby maximise the financial returns. An on-line analyser system presented itself as a suitable alternative to the existing control method based on smaller laboratory x-ray fluorescence analysers. In the system as installed at Rooiberg, two probes were fitted in each analysis zone, viz. a density probe using high energy gamma radiation from a cesium-137 source and a specific element absorption probe using low energy gamma radiation from an americium-241 source. The signals received from the probes are fed to a line receiver unit in the control room, where a microcomputer does the processing and prints out the information as required. Several advantages of this type of installation were gained at Rooiberg Tin Limited.

  3. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  4. A Privacy-by-Design Contextual Suggestion System for Tourism

    Directory of Open Access Journals (Sweden)

    Pavlos S. Efraimidis

    2016-05-01

    Full Text Available We focus on personal data generated by the sensors and through the everyday usage of smart devices and take advantage of these data to build a non-invasive contextual suggestion system for tourism. The system, which we call Pythia, exploits the computational capabilities of modern smart devices to offer high quality personalized POI (point of interest) recommendations. To protect user privacy, we apply a privacy by design approach within all of the steps of creating Pythia. The outcome is a system that comprises important architectural and operational innovations. The system is designed to process sensitive personal data, such as location traces, browsing history and web searches (query logs), to automatically infer user preferences and build corresponding POI-based user profiles. These profiles are then used by a contextual suggestion engine to anticipate user choices and make POI recommendations for tourists. Privacy leaks are minimized by implementing an important part of the system functionality at the user side, either as a mobile app or as a client-side web application, and by taking additional precautions, like data generalization, wherever necessary. As a proof of concept, we present a prototype that implements the aforementioned mechanisms on the Android platform accompanied with certain web applications. Even though the current prototype focuses only on location data, the results from the evaluation of the contextual suggestion algorithms and the user experience feedback from volunteers who used the prototype are very positive.
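
    One of the client-side precautions mentioned above, data generalization of location traces, can be as simple as snapping coordinates to a coarse grid before anything leaves the device. A minimal sketch follows; the 0.01-degree cell size (roughly 1 km) and the example fixes are arbitrary assumptions, not Pythia's actual policy.

      # Sketch of location generalization: snap GPS fixes to a coarse grid so only
      # approximate positions leave the device; the 0.01-degree cell size is an
      # arbitrary choice for illustration, not Pythia's actual policy.

      def generalize(lat, lon, cell_deg=0.01):
          """Return the centre of the grid cell containing the point."""
          def snap(x):
              return round(round(x / cell_deg) * cell_deg, 6)
          return snap(lat), snap(lon)

      if __name__ == "__main__":
          trace = [(40.63512, 22.94412), (40.63777, 22.94104)]   # hypothetical fixes
          print([generalize(lat, lon) for lat, lon in trace])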

  5. Accelerating Astronomy & Astrophysics in the New Era of Parallel Computing: GPUs, Phi and Cloud Computing

    Science.gov (United States)

    Ford, Eric B.; Dindar, Saleh; Peters, Jorg

    2015-08-01

    The realism of astrophysical simulations and statistical analyses of astronomical data are set by the available computational resources. Thus, astronomers and astrophysicists are constantly pushing the limits of computational capabilities. For decades, astronomers benefited from massive improvements in computational power that were driven primarily by increasing clock speeds and required relatively little attention to details of the computational hardware. For nearly a decade, increases in computational capabilities have come primarily from increasing the degree of parallelism, rather than increasing clock speeds. Further increases in computational capabilities will likely be led by many-core architectures such as Graphical Processing Units (GPUs) and Intel Xeon Phi. Successfully harnessing these new architectures requires significantly more understanding of the hardware architecture, cache hierarchy, compiler capabilities and network characteristics. I will provide an astronomer's overview of the opportunities and challenges provided by modern many-core architectures and elastic cloud computing. The primary goal is to help an astronomical audience understand what types of problems are likely to yield more than an order of magnitude speed-ups and which problems are unlikely to parallelize sufficiently efficiently to be worth the development time and/or costs. I will draw on my experience leading a team in developing the Swarm-NG library for parallel integration of large ensembles of small n-body systems on GPUs, as well as several smaller software projects. I will share lessons learned from collaborating with computer scientists, including both technical and soft skills. Finally, I will discuss the challenges of training the next generation of astronomers to be proficient in this new era of high-performance computing, drawing on experience teaching a graduate class on High-Performance Scientific Computing for Astrophysics and organizing a 2014 advanced summer

  6. What is needed to eliminate new pediatric HIV infections: The contribution of model-based analyses

    Science.gov (United States)

    Doherty, Katie; Ciaranello, Andrea

    2013-01-01

    Purpose of Review Computer simulation models can identify key clinical, operational, and economic interventions that will be needed to achieve the elimination of new pediatric HIV infections. In this review, we summarize recent findings from model-based analyses of strategies for prevention of mother-to-child HIV transmission (MTCT). Recent Findings In order to achieve elimination of MTCT (eMTCT), model-based studies suggest that scale-up of services will be needed in several domains: uptake of services and retention in care (the PMTCT “cascade”), interventions to prevent HIV infections in women and reduce unintended pregnancies (the “four-pronged approach”), efforts to support medication adherence through long periods of pregnancy and breastfeeding, and strategies to make breastfeeding safer and/or shorter. Models also project the economic resources that will be needed to achieve these goals in the most efficient ways to allocate limited resources for eMTCT. Results suggest that currently recommended PMTCT regimens (WHO Option A, Option B, and Option B+) will be cost-effective in most settings. Summary Model-based results can guide future implementation science, by highlighting areas in which additional data are needed to make informed decisions and by outlining critical interventions that will be necessary in order to eliminate new pediatric HIV infections. PMID:23743788

  7. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provided dose response curves which showed linearity by the use of the logistic transformation (Rodbard). This transformation, which is applicable to radioimmunoassay, should be useful for the computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double antibody technique was made by Dr. Kawamata. This approach proved useful for the automatic computation of data derived from the double antibody assays of insulin and C-peptides. Automatic, corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)
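
    The Rodbard logistic transformation is commonly written as a four-parameter logistic (4PL) dose-response model. A minimal curve-fitting sketch is given below; the standard-curve numbers are illustrative placeholders, not data from this study:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = response at zero dose, d = response at infinite dose,
    c = dose giving half-maximal response (ED50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

dose = np.array([0.5, 1, 2, 5, 10, 20, 50, 100])                      # standard concentrations
counts = np.array([9200, 8600, 7600, 5900, 4300, 2900, 1700, 1100])   # bound counts (placeholder)

params, _ = curve_fit(four_pl, dose, counts, p0=[9500, 1.0, 8.0, 800])
a, b, c, d = params

def dose_from_counts(y):
    """Invert the fitted curve to read an unknown sample dose from its counts."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print(dose_from_counts(5000.0))
```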

  8. Computational Physics as a Path for Physics Education

    Science.gov (United States)

    Landau, Rubin H.

    2008-04-01

    Evidence and arguments will be presented that modifications in the undergraduate physics curriculum are necessary to maintain the long-term relevance of physics. Suggested will be a balance of analytic, experimental, computational, and communication skills that in many cases will require an increased inclusion of computation and its associated skill set in the undergraduate physics curriculum. The general arguments will be followed by a detailed enumeration of suggested subjects and student learning outcomes, many of which have already been adopted or advocated by the computational science community, and which permit high performance computing and communication. Several alternative models for how these computational topics can be incorporated into the undergraduate curriculum will be discussed. This includes enhanced topics in the standard existing courses, as well as stand-alone courses. Applications and demonstrations will be presented throughout the talk, as well as prototype video-based materials and electronic books.

  9. Simultaneous detection of P300 and steady-state visually evoked potentials for hybrid brain-computer interface.

    Science.gov (United States)

    Combaz, Adrien; Van Hulle, Marc M

    2015-01-01

    We study the feasibility of a hybrid Brain-Computer Interface (BCI) combining simultaneous visual oddball and Steady-State Visually Evoked Potential (SSVEP) paradigms, where both types of stimuli are superimposed on a computer screen. Potentially, such a combination could result in a system being able to operate faster than a purely P300-based BCI and encode more targets than a purely SSVEP-based BCI. We analyse the interactions between the brain responses of the two paradigms, and assess the possibility of simultaneously detecting the brain activity evoked by both paradigms, in a series of 3 experiments where EEG data are analysed offline. Despite differences in the shape of the P300 response between the pure oddball and hybrid conditions, we observe that the classification accuracy of this P300 response is not affected by the SSVEP stimulation. Nor do we observe any effect of the oddball stimulation on the power of the SSVEP response at the stimulation frequency. Finally, results from the last experiment show the possibility of detecting both types of brain responses simultaneously and suggest not only the feasibility of such a hybrid BCI but also a gain over pure oddball- and pure SSVEP-based BCIs in terms of communication rate.
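
    A minimal sketch of the SSVEP side of such an analysis: estimating power at the stimulation frequency from an EEG epoch with Welch's method (a synthetic signal and assumed sampling rate stand in for the recorded EEG; this is not the authors' pipeline):

```python
import numpy as np
from scipy.signal import welch

fs = 256.0                       # sampling rate in Hz (assumed)
stim_freq = 15.0                 # SSVEP flicker frequency in Hz (assumed)
t = np.arange(0, 4.0, 1.0 / fs)  # one 4-second epoch

# Synthetic EEG: an SSVEP component buried in broadband noise
eeg = 2.0 * np.sin(2 * np.pi * stim_freq * t) + np.random.randn(t.size) * 5.0

freqs, psd = welch(eeg, fs=fs, nperseg=512)
band = (freqs > stim_freq - 0.5) & (freqs < stim_freq + 0.5)
neighbours = (freqs > stim_freq - 3) & (freqs < stim_freq + 3) & ~band

# Simple signal-to-noise ratio: power at the target frequency vs. nearby bins
snr = psd[band].mean() / psd[neighbours].mean()
print(f"SSVEP SNR at {stim_freq} Hz: {snr:.1f}")
```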

  10. Feedback Loops in Communication and Human Computing

    NARCIS (Netherlands)

    op den Akker, Hendrikus J.A.; Heylen, Dirk K.J.; Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas S.

    Building systems that are able to analyse communicative behaviours or take part in conversations requires a sound methodology in which the complex organisation of conversations is understood and tested on real-life samples. The data-driven approaches to human computing not only have a value for the

  11. Computational Psychiatry and the Challenge of Schizophrenia

    Science.gov (United States)

    Murray, John D.; Chekroud, Adam M.; Corlett, Philip R.; Yang, Genevieve; Wang, Xiao-Jing; Anticevic, Alan

    2017-01-01

    Abstract Schizophrenia research is plagued by enormous challenges in integrating and analyzing large datasets and difficulties developing formal theories related to the etiology, pathophysiology, and treatment of this disorder. Computational psychiatry provides a path to enhance analyses of these large and complex datasets and to promote the development and refinement of formal models for features of this disorder. This presentation introduces the reader to the notion of computational psychiatry and describes discovery-oriented and theory-driven applications to schizophrenia involving machine learning, reinforcement learning theory, and biophysically-informed neural circuit models. PMID:28338845

  12. Computer aided probabilistic assessment of containment integrity

    International Nuclear Information System (INIS)

    Tsai, J.C.; Touchton, R.A.

    1984-01-01

    In the probabilistic risk assessment (PRA) of a nuclear power plant, there are three probability-based techniques which are widely used for event sequence frequency quantification (including nodal probability estimation). These three techniques are the event tree analysis, the fault tree analysis and the Bayesian approach for database development. In the barrier analysis for assessing radionuclide release to the environment in a PRA study, these techniques are employed to a greater extent in estimating conditions which could lead to failure of the fuel cladding and the reactor coolant system (RCS) pressure boundary, but to a lesser degree in the containment pressure boundary failure analysis. The main reason is that containment issues are currently still in a state of flux. In this paper, the authors describe briefly the computer programs currently used by the nuclear industry to do event tree analyses, fault tree analyses and the Bayesian update. The authors discuss how these computer aided probabilistic techniques might be adopted for failure analysis of the containment pressure boundary
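
    As a toy illustration of the Bayesian approach to database development mentioned above (not one of the industry codes themselves), a conjugate Beta-Binomial update of a failure-on-demand probability looks like this; the prior and evidence values are hypothetical:

```python
from scipy.stats import beta

# Generic prior for failure-on-demand probability, e.g. from industry-wide data (assumed values)
prior_alpha, prior_beta = 1.5, 300.0

# Plant-specific evidence: 2 failures observed in 480 demands (hypothetical)
failures, demands = 2, 480

# Conjugate update: Beta prior + Binomial likelihood -> Beta posterior
post_alpha = prior_alpha + failures
post_beta = prior_beta + (demands - failures)

posterior = beta(post_alpha, post_beta)
print("posterior mean :", posterior.mean())
print("90% interval   :", posterior.ppf([0.05, 0.95]))
```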

  13. Analyses, algorithms, and computations for models of high-temperature superconductivity. Final report

    International Nuclear Information System (INIS)

    Du, Q.

    1997-01-01

    Under the sponsorship of the Department of Energy, the authors have achieved significant progress in the modeling, analysis, and computation of superconducting phenomena. The work so far has focused on mesoscale models as typified by the celebrated Ginzburg-Landau equations; these models are intermediate between the microscopic models (that can be used to understand the basic structure of superconductors and of the atomic and sub-atomic behavior of these materials) and the macroscale, or homogenized, models (that can be of use for the design of devices). The models they have considered include a time dependent Ginzburg-Landau model, a variable thickness thin film model, models for high values of the Ginzburg-Landau parameter, models that account for normal inclusions and fluctuations and Josephson effects, and the anisotropic Ginzburg-Landau and Lawrence-Doniach models for layered superconductors, including those with high critical temperatures. In each case, they have developed or refined the models, derived rigorous mathematical results that enhance the state of understanding of the models and their solutions, and developed, analyzed, and implemented finite element algorithms for the approximate solution of the model equations
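
    For reference, a standard textbook form of the Ginzburg-Landau free energy underlying the mesoscale models mentioned above (the generic isotropic form in Gaussian units, not the specific variants developed in this project) is:

```latex
F[\psi,\mathbf{A}] = \int_\Omega \Big[\, \alpha|\psi|^2 + \tfrac{\beta}{2}|\psi|^4
  + \frac{1}{2m^*}\Big|\Big(-i\hbar\nabla - \frac{e^*}{c}\mathbf{A}\Big)\psi\Big|^2
  + \frac{|\nabla\times\mathbf{A}|^2}{8\pi} \Big]\, \mathrm{d}\Omega ,
\qquad
\eta\,\frac{\partial\psi}{\partial t} = -\frac{\delta F}{\delta\psi^*}
\quad \text{(time-dependent model as a gradient flow)} .
```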

  14. Thermal hydraulic reactor safety analyses and experiments

    International Nuclear Information System (INIS)

    Holmstroem, H.; Eerikaeinen, L.; Kervinen, T.; Kilpi, K.; Mattila, L.; Miettinen, J.; Yrjoelae, V.

    1989-04-01

    The report introduces the results of the thermal hydraulic reactor safety research performed in the Nuclear Engineering Laboratory of the Technical Research Centre of Finland (VTT) during the years 1972-1987. Also practical applications, i.e. analyses for the safety authorities and power companies, are presented. The emphasis is on description of the state-of-the-art know-how. The report describes VTT's most important computer codes, both those of foreign origin and those developed at VTT, and their assessment work, VTT's own experimental research, as well as international experimental projects and other forms of cooperation VTT has participated in. Appendix 8 contains a comprehensive list of the most important publications and technical reports produced. They present the content and results of the research in detail. (orig.)

  15. Target gene analyses of 39 amelogenesis imperfecta kindreds

    Science.gov (United States)

    Chan, Hui-Chen; Estrella, Ninna M. R. P.; Milkovich, Rachel N.; Kim, Jung-Wook; Simmer, James P.; Hu, Jan C-C.

    2012-01-01

    Previously, mutational analyses identified six disease-causing mutations in 24 amelogenesis imperfecta (AI) kindreds. We have since expanded the number of AI kindreds to 39, and performed mutation analyses covering the coding exons and adjoining intron sequences for the six proven AI candidate genes [amelogenin (AMELX), enamelin (ENAM), family with sequence similarity 83, member H (FAM83H), WD repeat containing domain 72 (WDR72), enamelysin (MMP20), and kallikrein-related peptidase 4 (KLK4)] and for ameloblastin (AMBN) (a suspected candidate gene). All four of the X-linked AI families (100%) had disease-causing mutations in AMELX, suggesting that AMELX is the only gene involved in the aetiology of X-linked AI. Eighteen families showed an autosomal-dominant pattern of inheritance. Disease-causing mutations were identified in 12 (67%): eight in FAM83H, and four in ENAM. No FAM83H coding-region or splice-junction mutations were identified in three probands with autosomal-dominant hypocalcification AI (ADHCAI), suggesting that a second gene may contribute to the aetiology of ADHCAI. Six families showed an autosomal-recessive pattern of inheritance, and disease-causing mutations were identified in three (50%): two in MMP20, and one in WDR72. No disease-causing mutations were found in 11 families with only one affected member. We conclude that mutation analyses of the current candidate genes for AI have about a 50% chance of identifying the disease-causing mutation in a given kindred. PMID:22243262

  16. Analysing and Rationalising Molecular and Materials Databases Using Machine-Learning

    Science.gov (United States)

    de, Sandip; Ceriotti, Michele

    Computational materials design promises to greatly accelerate the process of discovering new or more performant materials. Several collaborative efforts are contributing to this goal by building databases of structures, containing between thousands and millions of distinct hypothetical compounds, whose properties are computed by high-throughput electronic-structure calculations. The complexity and sheer amount of information has made manual exploration, interpretation and maintenance of these databases a formidable challenge, making it necessary to resort to automatic analysis tools. Here we will demonstrate how, starting from a measure of (dis)similarity between database items built from a combination of local environment descriptors, it is possible to apply hierarchical clustering algorithms, as well as dimensionality reduction methods such as sketchmap, to analyse, classify and interpret trends in molecular and materials databases, as well as to detect inconsistencies and errors. Thanks to the agnostic and flexible nature of the underlying metric, we will show how our framework can be applied transparently to different kinds of systems ranging from organic molecules and oligopeptides to inorganic crystal structures as well as molecular crystals. Funded by National Center for Computational Design and Discovery of Novel Materials (MARVEL) and Swiss National Science Foundation.
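
    A minimal sketch of the clustering step described above, assuming each database entry has already been reduced to a fixed-length descriptor vector (random vectors stand in for the local-environment descriptors; the sketch-map step is not reproduced):

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(200, 32))   # 200 structures x 32-dim descriptors (placeholder)

# Pairwise (dis)similarity between database items
dists = pdist(descriptors, metric="euclidean")

# Agglomerative (hierarchical) clustering on the condensed distance matrix
Z = linkage(dists, method="average")
labels = fcluster(Z, t=5, criterion="maxclust")   # ask for 5 clusters

print(np.bincount(labels)[1:])   # cluster sizes
```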

  17. Angry facial expressions bias gender categorization in children and adults: behavioral and computational evidence

    Science.gov (United States)

    Bayet, Laurie; Pascalis, Olivier; Quinn, Paul C.; Lee, Kang; Gentaz, Édouard; Tanaka, James W.

    2015-01-01

    Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up stimulus driven or top-down belief driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization toward “male” responding in children as young as 5–6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1–2). The developmental course of the angry-male bias, along with its extension to other-race faces, combine to suggest that it is not rooted in extensive experience, e.g., observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with stimulus-and stereotyped-belief driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling. PMID:25859238
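
    A minimal sketch of the kind of representation-plus-classifier simulation referred to in Experiment 3, pairing a PCA face representation with a linear classifier (random pixel data and labels stand in for the face images; this is not the authors' model):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
faces = rng.normal(size=(120, 64 * 64))      # 120 flattened face images (placeholder)
gender = rng.integers(0, 2, size=120)        # 0 = female, 1 = male (placeholder labels)

# Project faces onto principal components, then categorize gender from that representation
model = make_pipeline(PCA(n_components=40), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, faces, gender, cv=5)
print("cross-validated accuracy:", scores.mean())
```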

  18. Angry facial expressions bias gender categorization in children and adults: behavioral and computational evidence

    Directory of Open Access Journals (Sweden)

    Laurie Bayet

    2015-03-01

    Full Text Available Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up stimulus driven or top-down belief driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization towards male responding in children as young as 5-6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1-2). The developmental course of the angry-male bias, along with its extension to other-race faces, combine to suggest that it is not rooted in extensive experience, e.g. observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with stimulus- and stereotyped-belief driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling.

  19. Workstation computer systems for in-core fuel management

    International Nuclear Information System (INIS)

    Ciccone, L.; Casadei, A.L.

    1992-01-01

    The advancement of powerful engineering workstations has made it possible to have thermal-hydraulics and accident analysis computer programs operating efficiently with a significant performance/cost ratio compared to large mainframe computers. Today, nuclear utilities are acquiring independent engineering analysis capability for fuel management and safety analyses. Computer systems currently available to utility organizations vary widely, thus requiring that this software be operational on a number of computer platforms. Recognizing these trends, Westinghouse adopted a software development life cycle process for the software development activities which strictly controls the development, testing and qualification of design computer codes. In addition, software standards to ensure maximum portability were developed and implemented, including adherence to FORTRAN 77, and use of uniform system interface and auxiliary routines. A comprehensive test matrix was developed for each computer program to ensure that evolution of code versions preserves the licensing basis. In addition, the results of such test matrices establish the Quality Assurance basis and consistency for the same software operating on different computer platforms. (author). 4 figs

  20. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... special computer program processes this large volume of data to create two-dimensional cross-sectional images of ... Society of Urogenital Radiology note that the available data suggest that it is safe to continue breastfeeding ...

  1. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... special computer program processes this large volume of data to create two-dimensional cross-sectional images of ... Society of Urogenital Radiology note that the available data suggest that it is safe to continue breastfeeding ...

  2. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    International Nuclear Information System (INIS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-01-01

    Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
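
    A schematic of the strain-energy-density remodeling rule described above, reduced to a per-element density update loop (uniform starting density and placeholder stimulus values; the actual model recomputes the FE solution at every iteration, which is not reproduced here):

```python
import numpy as np

def remodel(density, sed, ref_stimulus=0.004, lazy_zone=0.1, rate=1.0,
            rho_min=0.01, rho_max=1.8):
    """One remodeling iteration: drive element density toward the reference stimulus.

    density : element apparent density (assumed units g/cm^3)
    sed     : element strain-energy density from the FE solution (placeholder here)
    """
    stimulus = sed / density                 # strain-energy density per unit mass
    error = stimulus - ref_stimulus
    # Only remodel outside a 'lazy zone' around the reference stimulus
    drive = np.where(np.abs(error) > lazy_zone * ref_stimulus, error, 0.0)
    new_density = density + rate * drive
    return np.clip(new_density, rho_min, rho_max)

density = np.full(1000, 0.8)                 # start from a homogeneous density
for _ in range(10):                          # 10 load iterations, as in the study
    sed = np.random.uniform(0.001, 0.006, size=density.size)  # stand-in for FE output
    density = remodel(density, sed)

print(density.min(), density.max())
```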

  3. Information-theoretic temporal Bell inequality and quantum computation

    International Nuclear Information System (INIS)

    Morikoshi, Fumiaki

    2006-01-01

    An information-theoretic temporal Bell inequality is formulated to contrast classical and quantum computations. Any classical algorithm satisfies the inequality, while quantum ones can violate it. Therefore, the violation of the inequality is an immediate consequence of the quantumness in the computation. Furthermore, this approach suggests a notion of temporal nonlocality in quantum computation

  4. Unconscious analyses of visual scenes based on feature conjunctions.

    Science.gov (United States)

    Tachibana, Ryosuke; Noguchi, Yasuki

    2015-06-01

    To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses could also work for stimuli unconsciously perceived. Here we show that our brain performs the unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) where 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although a conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system to construct coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved).

  5. High fidelity thermal-hydraulic analysis using CFD and massively parallel computers

    International Nuclear Information System (INIS)

    Weber, D.P.; Wei, T.Y.C.; Brewster, R.A.; Rock, Daniel T.; Rizwan-uddin

    2000-01-01

    Thermal-hydraulic analyses play an important role in design and reload analysis of nuclear power plants. These analyses have historically relied on early generation computational fluid dynamics capabilities, originally developed in the 1960s and 1970s. Over the last twenty years, however, dramatic improvements in both computational fluid dynamics codes in the commercial sector and in computing power have taken place. These developments offer the possibility of performing large scale, high fidelity, core thermal hydraulics analysis. Such analyses will allow a determination of the conservatism employed in traditional design approaches and possibly justify the operation of nuclear power systems at higher powers without compromising safety margins. The objective of this work is to demonstrate such a large scale analysis approach using a state of the art CFD code, STAR-CD, and the computing power of massively parallel computers, provided by IBM. A high fidelity representation of a current generation PWR was analyzed with the STAR-CD CFD code and the results were compared to traditional analyses based on the VIPRE code. Current design methodology typically involves a simplified representation of the assemblies, where a single average pin is used in each assembly to determine the hot assembly from a whole core analysis. After determining this assembly, increased refinement is used in the hot assembly, and possibly some of its neighbors, to refine the analysis for purposes of calculating DNBR. This latter calculation is performed with sub-channel codes such as VIPRE. The modeling simplifications that are used involve the approximate treatment of surrounding assemblies and coarse representation of the hot assembly, where the subchannel is the lowest level of discretization. In the high fidelity analysis performed in this study, both restrictions have been removed. Within the hot assembly, several hundred thousand to several million computational zones have been used, to

  6. Classification of hadith into positive suggestion, negative suggestion, and information

    Science.gov (United States)

    Faraby, Said Al; Riviera Rachmawati Jasin, Eliza; Kusumaningrum, Andina; Adiwijaya

    2018-03-01

    As one of the Muslim life guidelines, based on the meaning of its sentence(s), a hadith can be viewed as a suggestion for doing something, a suggestion for not doing something, or just information without any suggestion. In this paper, we tried to classify the Bahasa translation of hadith into the three categories using a machine learning approach. We tried stemming and stopword removal in preprocessing, and TF-IDF of unigrams, bigrams, and trigrams as the extracted features. As the classifier, we compared SVM and Neural Network. Since the categories are new, in order to compare the results of the previous pipelines, we created a baseline classifier using a simple rule-based string matching technique. The rule-based algorithm conditions on the occurrence of words such as “janganlah, sholatlah, and so on” to determine the category. The baseline method achieved an F1-Score of 0.69, while the best F1-Score from the machine learning approach was 0.88, produced by an SVM model with the linear kernel.
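
    A minimal sketch of the two approaches compared above: a rule-based keyword baseline and a TF-IDF n-gram pipeline with a linear SVM (the keyword list, example texts and labels are hypothetical illustrations, not the paper's data or rules):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

texts = ["sholatlah sebelum kamu disholatkan",      # positive suggestion (hypothetical)
         "janganlah kamu saling membenci",           # negative suggestion (hypothetical)
         "rasulullah pernah berkata tentang itu"]    # information (hypothetical)
labels = ["positive", "negative", "information"]

def rule_based(text):
    """Baseline: decide the category from the presence of cue words."""
    if "janganlah" in text:
        return "negative"
    if any(w.endswith("lah") for w in text.split()):
        return "positive"
    return "information"

# Machine-learning approach: TF-IDF over word 1-3 grams fed into a linear-kernel SVM
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 3)), LinearSVC())
model.fit(texts, labels)

print(rule_based("janganlah engkau marah"))
print(model.predict(["janganlah engkau marah"])[0])
```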

  7. Metagenome-based diversity analyses suggest a significant contribution of non-cyanobacterial lineages to carbonate precipitation in modern microbialites

    Directory of Open Access Journals (Sweden)

    Purificacion Lopez-Garcia

    2015-08-01

    Full Text Available Cyanobacteria are thought to play a key role in carbonate formation due to their metabolic activity, but other organisms carrying out oxygenic photosynthesis (photosynthetic eukaryotes) or other metabolisms (e.g. anoxygenic photosynthesis, sulfate reduction), may also contribute to carbonate formation. To obtain more quantitative information than that provided by more classical PCR-dependent methods, we studied the microbial diversity of microbialites from the Alchichica crater lake (Mexico) by mining for 16S/18S rRNA genes in metagenomes obtained by direct sequencing of environmental DNA. We studied samples collected at the Western (AL-W) and Northern (AL-N) shores of the lake and, at the latter site, along a depth gradient (1, 5, 10 and 15 m depth). The associated microbial communities were mainly composed of bacteria, most of which seemed heterotrophic, whereas archaea were negligible. Eukaryotes composed a relatively minor fraction dominated by photosynthetic lineages, diatoms in AL-W, influenced by Si-rich seepage waters, and green algae in AL-N samples. Members of the Gammaproteobacteria and Alphaproteobacteria classes of Proteobacteria, Cyanobacteria and Bacteroidetes were the most abundant bacterial taxa, followed by Planctomycetes, Deltaproteobacteria (Proteobacteria), Verrucomicrobia, Actinobacteria, Firmicutes and Chloroflexi. Community composition varied among sites and with depth. Although cyanobacteria were the most important bacterial group contributing to the carbonate precipitation potential, photosynthetic eukaryotes, anoxygenic photosynthesizers and sulfate reducers were also very abundant. Cyanobacteria affiliated to Pleurocapsales largely increased with depth. Scanning electron microscopy (SEM) observations showed considerable areas of aragonite-encrusted Pleurocapsa-like cyanobacteria at microscale. Multivariate statistical analyses showed a strong positive correlation of Pleurocapsales and Chroococcales with aragonite formation at

  8. Functional Genomics and Phylogenetic Evidence Suggest Genus-Wide Cobalamin Production by the Globally Distributed Marine Nitrogen Fixer Trichodesmium.

    Science.gov (United States)

    Walworth, Nathan G; Lee, Michael D; Suffridge, Christopher; Qu, Pingping; Fu, Fei-Xue; Saito, Mak A; Webb, Eric A; Sañudo-Wilhelmy, Sergio A; Hutchins, David A

    2018-01-01

    Only select prokaryotes can biosynthesize vitamin B12 (i.e., cobalamins), but these organic co-enzymes are required by all microbial life and can be vanishingly scarce across extensive ocean biomes. Although global ocean genome data suggest cyanobacteria to be a major euphotic source of cobalamins, recent studies have highlighted that >95% of cyanobacteria can only produce a cobalamin analog, pseudo-B12, due to the absence of the BluB protein that synthesizes the α ligand 5,6-dimethylbenzimidazole (DMB) required to biosynthesize cobalamins. Pseudo-B12 is substantially less bioavailable to eukaryotic algae, as only certain taxa can intracellularly remodel it to one of the cobalamins. Here we present phylogenetic, metagenomic, transcriptomic, proteomic, and chemical analyses providing multiple lines of evidence that the nitrogen-fixing cyanobacterium Trichodesmium transcribes and translates the biosynthetic, cobalamin-requiring BluB enzyme. Phylogenetic evidence suggests that the Trichodesmium DMB biosynthesis gene, bluB, is of ancient origin, which could have aided in its ecological differentiation from other nitrogen-fixing cyanobacteria. Additionally, orthologue analyses reveal two genes encoding iron-dependent B12 biosynthetic enzymes (cbiX and isiB), suggesting that iron availability may be linked not only to new nitrogen supplies from nitrogen fixation, but also to B12 inputs by Trichodesmium. These analyses suggest that Trichodesmium contains the genus-wide genomic potential for a previously unrecognized role as a source of cobalamins, which may prove to considerably impact marine biogeochemical cycles.

  9. Finite element simulation of nanoindentation tests using a macroscopic computational model

    International Nuclear Information System (INIS)

    Khelifa, Mourad; Fierro, Vanessa; Celzard, Alain

    2014-01-01

    The aim of this work was to develop a numerical procedure to simulate nanoindentation tests using a macroscopic computational model. Both theoretical and numerical aspects of the proposed methodology, based on the coupling of isotropic elasticity and anisotropic plasticity described with the quadratic criterion of Hill, are presented to model this behaviour. The anisotropic plastic behaviour accounts for mixed nonlinear hardening (isotropic and kinematic) under large plastic deformation. Nanoindentation tests were simulated to analyse the nonlinear mechanical behaviour of an aluminium alloy. The predicted results of the finite element (FE) modelling are in good agreement with the experimental data, thereby confirming the accuracy level of the suggested FE method of analysis. The effects of some technological and mechanical parameters known to have an influence during the nanoindentation tests were also investigated.
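
    For reference, the quadratic Hill (1948) criterion referred to above is commonly written in its textbook form as follows, where F, G, H, L, M, N are the anisotropy coefficients (this is the standard expression, not the specific calibration used in the study):

```latex
f(\boldsymbol{\sigma}) = F(\sigma_{yy}-\sigma_{zz})^2 + G(\sigma_{zz}-\sigma_{xx})^2
  + H(\sigma_{xx}-\sigma_{yy})^2 + 2L\,\sigma_{yz}^2 + 2M\,\sigma_{zx}^2 + 2N\,\sigma_{xy}^2 = 1 .
```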

  10. Study and realization of a beam analyser of high intensity (10^6 to 10^10)

    International Nuclear Information System (INIS)

    Perret-Gallix, D.

    1975-01-01

    A beam analyser working under high beam intensity in the range of 10^6 to 10^10 particles per burst and giving the position, profile and intensity of this beam is studied. The reasons for this study, the principle of measurement, the construction of the hardware and the different tests carried out on the chamber in order to evaluate its main features are described. The analyser is a multi-cellular ionisation chamber or stripe chamber; each cell, made by a copper stripe (0.25 mm wide) inserted between two high voltage planes (500 V), forms a small independent ionisation chamber. This system, working under the on-line control of a mini-computer, makes it possible to associate the instantaneous position and profile of the beam with each event or event group [fr

  11. Parametric Analyses of Dynamic Characteristic of the Cable-Stayed Pedestrian Bridge

    Directory of Open Access Journals (Sweden)

    Pańtak Marek

    2017-12-01

    Full Text Available The paper presents characteristics of the structural system and results of dynamic field tests and numerical parametric analyses of a three-span, two-pylon, cable-stayed pedestrian bridge with a steel-concrete composite deck and spans of 25.5 + 60.0 + 25.5 m. The footbridge is characterized by increased dynamic susceptibility of the elements of the suspension system, observed during the everyday operation of the structure. The analyses have shown that high amplitude vibrations of the pylon back-stay cables change the parameters of the structural system and consequently change the values of the natural vibration frequencies of the structure. The paper also presents the methodology for selecting the parameters of the computational model which allows the natural vibration frequencies of the footbridge to be determined correctly.
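
    A minimal sketch of how natural vibration frequencies follow from a computational model's mass and stiffness matrices (a toy two-degree-of-freedom system with assumed values stands in for the footbridge FE model):

```python
import numpy as np
from scipy.linalg import eigh

# Toy 2-DOF mass and stiffness matrices (placeholder values, not the bridge model)
M = np.diag([2.0e4, 1.5e4])                      # kg
K = np.array([[6.0e6, -2.0e6],
              [-2.0e6, 4.0e6]])                  # N/m

# Generalized eigenvalue problem K x = omega^2 M x
eigvals, _ = eigh(K, M)
frequencies_hz = np.sqrt(eigvals) / (2.0 * np.pi)
print(frequencies_hz)
```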

  12. Integrative analyses of leprosy susceptibility genes indicate a common autoimmune profile.

    Science.gov (United States)

    Zhang, Deng-Feng; Wang, Dong; Li, Yu-Ye; Yao, Yong-Gang

    2016-04-01

    Leprosy is an ancient chronic infection of the skin and peripheral nerves caused by Mycobacterium leprae. The development of leprosy depends on the genetic background and the immune status of the host. However, there is no systematic view focusing on the biological pathways, interaction networks and overall expression pattern of leprosy-related immune and genetic factors. To identify the hub genes at the center of the leprosy genetic network and to provide an insight into immune and genetic factors contributing to leprosy, we retrieved all reported leprosy-related genes and performed integrative analyses covering gene expression profiling, pathway analysis, protein-protein interaction networks, and evolutionary analyses. A list of 123 differentially expressed leprosy-related genes, which were enriched in activation and regulation of immune response, was obtained in our analyses. Cross-disorder analysis showed that the list of leprosy susceptibility genes was largely shared by typical autoimmune diseases such as lupus erythematosus and arthritis, suggesting that similar pathways might be affected in leprosy and autoimmune diseases. Protein-protein interaction (PPI) and positive selection analyses revealed a co-evolution network of leprosy risk genes. Our analyses showed that leprosy-associated genes constituted a co-evolution network and might undergo positive selection driven by M. leprae. We suggest that leprosy may be a kind of autoimmune disease and that its development is a matter of defective or over-activated host immunity. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  13. Research on Computer-Based Education for Reading Teachers: A 1989 Update. Results of the First National Assessment of Computer Competence.

    Science.gov (United States)

    Balajthy, Ernest

    Results of the 1985-86 National Assessment of Educational Progress (NAEP) survey of American students' knowledge of computers suggest that American schools have a long way to go before computers can be said to have made a significant impact. The survey covered the 3rd, 7th, and 11th grade levels and assessed competence in knowledge of computers,…

  14. Automated water analyser computer supported system (AWACSS) Part I: Project objectives, basic technology, immunoassay development, software design and networking.

    Science.gov (United States)

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system AWACSS (automated water analyser computer-supported system) based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram per litre level in a single few-minutes analysis without any prior sample pre-concentration or pre-treatment steps. Having in mind the actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and the Water Framework Directive (WFD) (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories has been utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first part article gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second part article reports on the system performance, first real sample measurements, and an international collaborative trial (inter-laboratory tests) to compare the biosensor with conventional analytical methods.

  15. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements, therefore biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Computational thinking in life science education.

    Science.gov (United States)

    Rubinstein, Amir; Chor, Benny

    2014-11-01

    We join the increasing call to take computational education of life science students a step further, beyond teaching mere programming and employing existing software tools. We describe a new course, focusing on enriching the curriculum of life science students with abstract, algorithmic, and logical thinking, and exposing them to the computational "culture." The design, structure, and content of our course are influenced by recent efforts in this area, collaborations with life scientists, and our own instructional experience. Specifically, we suggest that an effective course of this nature should: (1) devote time to explicitly reflect upon computational thinking processes, resisting the temptation to drift to purely practical instruction, (2) focus on discrete notions, rather than on continuous ones, and (3) have basic programming as a prerequisite, so students need not be preoccupied with elementary programming issues. We strongly recommend that the mere use of existing bioinformatics tools and packages should not replace hands-on programming. Yet, we suggest that programming will mostly serve as a means to practice computational thinking processes. This paper deals with the challenges and considerations of such computational education for life science students. It also describes a concrete implementation of the course and encourages its use by others.

  17. Computational thinking in life science education.

    Directory of Open Access Journals (Sweden)

    Amir Rubinstein

    2014-11-01

    Full Text Available We join the increasing call to take computational education of life science students a step further, beyond teaching mere programming and employing existing software tools. We describe a new course, focusing on enriching the curriculum of life science students with abstract, algorithmic, and logical thinking, and exposing them to the computational "culture." The design, structure, and content of our course are influenced by recent efforts in this area, collaborations with life scientists, and our own instructional experience. Specifically, we suggest that an effective course of this nature should: (1) devote time to explicitly reflect upon computational thinking processes, resisting the temptation to drift to purely practical instruction, (2) focus on discrete notions, rather than on continuous ones, and (3) have basic programming as a prerequisite, so students need not be preoccupied with elementary programming issues. We strongly recommend that the mere use of existing bioinformatics tools and packages should not replace hands-on programming. Yet, we suggest that programming will mostly serve as a means to practice computational thinking processes. This paper deals with the challenges and considerations of such computational education for life science students. It also describes a concrete implementation of the course and encourages its use by others.

  18. Study of blast wave overpressures using the computational fluid dynamics

    Directory of Open Access Journals (Sweden)

    M. L. COSTA NETO

    Full Text Available ABSTRACT The threats of bomb attacks by criminal organizations and accidental events involving chemical explosives are a danger to people and buildings. Due to the severity of these issues and the need for data for safe design, more research is required on explosions and shock waves. This paper presents an assessment of blast wave overpressures using computational fluid dynamics software. Analyses of phenomena such as the reflection of shock waves and channeling effects were carried out, and a comparison between numerical results and analytical predictions was made, based on simulations of several models. The results suggest that the common analytical predictions are not accurate enough for an overpressure analysis at small stand-off distances and that poorly designed buildings may increase the shock wave overpressures due to multiple blast wave reflections, increasing the destructive potential of the explosions.
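
    For context on the analytical predictions mentioned above, free-field overpressure estimates are usually based on Hopkinson-Cranz scaling. The sketch below computes the scaled distance and applies one commonly quoted empirical fit for incident peak overpressure (attributed to Mills); both the correlation and the example numbers are assumptions here, not the predictions used in the paper:

```python
def scaled_distance(standoff_m, charge_kg_tnt):
    """Hopkinson-Cranz scaled distance Z = R / W^(1/3) in m/kg^(1/3)."""
    return standoff_m / charge_kg_tnt ** (1.0 / 3.0)

def peak_overpressure_kpa(standoff_m, charge_kg_tnt):
    """Incident peak overpressure (kPa) from a Mills-type fit (approximate, free field)."""
    z = scaled_distance(standoff_m, charge_kg_tnt)
    return 1772.0 / z**3 - 114.0 / z**2 + 108.0 / z

# 100 kg TNT-equivalent charge at 10 m stand-off (illustrative numbers)
print(round(peak_overpressure_kpa(10.0, 100.0), 1), "kPa")
```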

  19. Early Versus Delayed Motion After Rotator Cuff Repair: A Systematic Review of Overlapping Meta-analyses.

    Science.gov (United States)

    Houck, Darby A; Kraeutler, Matthew J; Schuette, Hayden B; McCarty, Eric C; Bravman, Jonathan T

    2017-10-01

    Previous meta-analyses have been conducted to compare outcomes of early versus delayed motion after rotator cuff repair. To conduct a systematic review of overlapping meta-analyses comparing early versus delayed motion rehabilitation protocols after rotator cuff repair to determine which meta-analyses provide the best available evidence. Systematic review. A systematic review was performed by searching PubMed and Cochrane Library databases. Search terms included "rotator cuff repair," "early passive motion," "immobilization," "rehabilitation protocol," and "meta-analysis." Results were reviewed to determine study eligibility. Patient outcomes and structural healing were extracted from these meta-analyses. Meta-analysis quality was assessed using the Oxman-Guyatt and Quality of Reporting of Meta-analyses (QUOROM) systems. The Jadad decision algorithm was then used to determine which meta-analyses provided the best level of evidence. Seven meta-analyses containing a total of 5896 patients met the eligibility criteria (1 Level I evidence, 4 Level II evidence, 2 Level III evidence). None of these meta-analyses found immobilization to be superior to early motion; however, most studies suggested that early motion would increase range of motion (ROM), thereby reducing time of recovery. Three of these studies suggested that tear size contributed to the choice of rehabilitation to ensure proper healing of the shoulder. A study by Chan et al in 2014 received the highest QUOROM and Oxman-Guyatt scores, and therefore this meta-analysis appeared to have the highest level of evidence. Additionally, a study by Riboh and Garrigues in 2014 was selected as the highest quality study in this systematic review according to the Jadad decision algorithm. The current, best available evidence suggests that early motion improves ROM after rotator cuff repair but increases the risk of rotator cuff retear. Lower quality meta-analyses indicate that tear size may provide a better strategy in

  20. Secure Multiparty Computation for Cooperative Cyber Risk Assessment

    Science.gov (United States)

    2016-11-01

    ...that the organizations can compute relevant statistics and analyses on the global infrastructure while still keeping the details of their local...

  1. Computational Fluid Dynamic Analysis of the Left Atrial Appendage to Predict Thrombosis Risk

    Directory of Open Access Journals (Sweden)

    Giorgia Maria Bosi

    2018-04-01

    Full Text Available During Atrial Fibrillation (AF) more than 90% of the left atrial thrombi responsible for thromboembolic events originate in the left atrial appendage (LAA), a complex small sac protruding from the left atrium (LA). Current available treatments to prevent thromboembolic events are oral anticoagulation, surgical LAA exclusion, or percutaneous LAA occlusion. However, the mechanism behind thrombus formation in the LAA is poorly understood. The aim of this work is to analyse the hemodynamic behaviour in four typical LAA morphologies - “Chicken wing”, “Cactus”, “Windsock” and “Cauliflower” - to identify potential relationships between the different shapes and the risk of thrombotic events. Computerised tomography (CT) images from four patients with no LA pathology were segmented to derive the 3D anatomical shape of LAA and LA. Computational Fluid Dynamic (CFD) analyses based on the patient-specific anatomies were carried out imposing both healthy and AF flow conditions. Velocity and shear strain rate (SSR) were analysed for all cases. Residence time in the different LAA regions was estimated with a virtual contrast agent washing out. CFD results indicate that both velocity and SSR decrease along the LAA, from the ostium to the tip, at each instant in the cardiac cycle, thus making the LAA tip more prone to fluid stagnation, and therefore to thrombus formation. Velocity and SSR also decrease from normal to AF conditions. After four cardiac cycles, the lowest washout of contrast agent was observed for the Cauliflower morphology (3.27% of residual contrast in AF), and the highest for the Windsock (0.56% of residual contrast in AF). This suggests that the former is expected to be associated with a higher risk of thrombosis, in agreement with clinical reports in the literature. The presented computational models highlight the major role played by the LAA morphology on the hemodynamics, both in normal and AF conditions, revealing the potential

  2. Analysing power for neutron-proton scattering at 14.1 MeV

    International Nuclear Information System (INIS)

    Brock, J.E.; Chisholm, A.; Duder, J.C.; Garrett, R.; Poletti, J.L.

    1981-01-01

    The analysing power Asub(y)(theta) for neutron-proton scattering has been measured at 14.1 MeV for c.m. angles between 50° and 157°. A polarized neutron beam was produced by the reaction 3H(d,n)4He at 110 keV, using polarized deuterons from an atomic beam polarized ion source. Liquid and plastic scintillators were used for proton targets and the scattered particles were detected in an array of plastic scintillators. Use of the associated alpha technique, multi-parameter recording of events and off-line computer treatment led to very low backgrounds. The results differ significantly from the predictions of the phase-shift analyses of Yale IV, Livermore X and Arndt et al. We find, however, excellent agreement with the predictions of the Paris potential of Lacombe et al. Existing n-p analysing power results up to 30 MeV are surveyed and found to be consistent. An attempt was made to look for an isospin splitting of the triplet P-wave phase shifts. (orig.)

  3. CRUSH1: a simplified computer program for impact analysis of radioactive material transport casks

    Energy Technology Data Exchange (ETDEWEB)

    Ikushima, Takeshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1996-07-01

    In drop impact analyses for radioactive material transport casks, it has become possible to perform detailed calculations using interaction-evaluation computer programs such as DYNA2D, DYNA3D, PISCES and HONDO. However, considerable cost and computer time are required to perform analyses with these programs. To meet the above requirements, a simplified computer program, CRUSH1, has been developed. CRUSH1 is a static calculation computer program capable of evaluating the maximum acceleration of cask bodies and the maximum deformation of shock absorbers using a Uniaxial Displacement Method (UDM). CRUSH1 is a revised version of CRUSH. The main revisions of the computer program are as follows: (1) not only mainframe computers but also workstations (OS UNIX) and personal computers (OS Windows 3.1 or Windows NT) can be used to run CRUSH1, and (2) the input data set has been revised. In the paper, a brief illustration of the calculation method using UDM is presented. The second section presents comparisons between UDM and the detailed method. The third section provides a user's guide for CRUSH1. (author)

  4. CRUSH1: a simplified computer program for impact analysis of radioactive material transport casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1996-07-01

    In drop impact analyses for radioactive material transport casks, it has become possible to perform detailed calculations using interaction-evaluation computer programs such as DYNA2D, DYNA3D, PISCES and HONDO. However, considerable cost and computer time are required to perform analyses with these programs. To meet the above requirements, a simplified computer program, CRUSH1, has been developed. CRUSH1 is a static calculation computer program capable of evaluating the maximum acceleration of cask bodies and the maximum deformation of shock absorbers using a Uniaxial Displacement Method (UDM). CRUSH1 is a revised version of CRUSH. The main revisions of the computer program are as follows: (1) not only mainframe computers but also workstations (OS UNIX) and personal computers (OS Windows 3.1 or Windows NT) can be used to run CRUSH1, and (2) the input data set has been revised. In the paper, a brief illustration of the calculation method using UDM is presented. The second section presents comparisons between UDM and the detailed method. The third section provides a user's guide for CRUSH1. (author)

  5. Cue reactivity and its inhibition in pathological computer game players.

    Science.gov (United States)

    Lorenz, Robert C; Krüger, Jenny-Kathinka; Neumann, Britta; Schott, Björn H; Kaufmann, Christian; Heinz, Andreas; Wüstenberg, Torsten

    2013-01-01

    Despite a rising social relevance of pathological computer game playing, it remains unclear whether the neurobiological basis of this addiction-like behavioral disorder and substance-related addiction are comparable. In substance-related addiction, attentional bias and cue reactivity are often observed. We conducted a functional magnetic resonance study using a dot probe paradigm with short-presentation (attentional bias) and long-presentation (cue reactivity) trials in eight male pathological computer game players (PCGPs) and nine healthy controls (HCs). Computer game-related and neutral computer-generated pictures, as well as pictures from the International Affective Picture System with positive and neutral valence, served as stimuli. PCGPs showed an attentional bias toward both game-related and affective stimuli with positive valence. In contrast, HCs showed no attentional bias effect at all. PCGPs showed stronger brain responses than HCs in short-presentation trials in medial prefrontal cortex (MPFC) and anterior cingulate gyrus, and in long-presentation trials in lingual gyrus. In an exploratory post hoc functional connectivity analysis of long-presentation trials, connectivity strength was higher between the right inferior frontal gyrus, which was associated with inhibition processing in previous studies, and cue reactivity-related regions (left orbitofrontal cortex and ventral striatum) in PCGPs. We observed behavioral and neural effects in PCGPs which are comparable with those found in substance-related addiction. However, cue-related brain responses depended on the duration of cue presentation. Together with the connectivity result, these findings suggest that top-down inhibitory processes might suppress cue reactivity-related neural activity in long-presentation trials. © 2012 The Authors, Addiction Biology © 2012 Society for the Study of Addiction.

  6. Essential results of analyses accompanying the leak rate experiments E22 at HDR

    International Nuclear Information System (INIS)

    Grebner, H.; Hoefler, A.; Hunger, H.

    1994-01-01

    Under the E22 test group of phase III of the HDR safety programme, experiments were performed on the crack opening and leak rate behaviour of pipe components of smaller nominal bores. The experiments were complemented by computations, in particular verifications, to qualify the computation models, one of the main aims of the HDR safety programme. Most of the analyses to determine crack openings were performed by means of the finite-element method, including elastic-plastic material behaviour, and complemented by assessments with engineering methods. The leak rate was calculated by means of separate two-phase computation models. Altogether, it may be concluded from the structural and fracture mechanical experiments with pipes, elbows and branch pieces that crack openings and incipient cracks under loading with internal pressure or bending moment can be described with good accuracy by means of the finite-element programme ADINA and the developed FE models. (orig.) [de

  7. A weighted U statistic for association analyses considering genetic heterogeneity.

    Science.gov (United States)

    Wei, Changshuai; Elston, Robert C; Lu, Qing

    2016-07-20

    Converging evidence suggests that common complex diseases with the same or similar clinical manifestations could have different underlying genetic etiologies. While current research interests have shifted toward uncovering rare variants and structural variations predisposing to human diseases, the impact of heterogeneity in genetic studies of complex diseases has been largely overlooked. Most of the existing statistical methods assume the disease under investigation has a homogeneous genetic effect and could, therefore, have low power if the disease undergoes heterogeneous pathophysiological and etiological processes. In this paper, we propose a heterogeneity-weighted U (HWU) method for association analyses considering genetic heterogeneity. HWU can be applied to various types of phenotypes (e.g., binary and continuous) and is computationally efficient for high-dimensional genetic data. Through simulations, we showed the advantage of HWU when the underlying genetic etiology of a disease was heterogeneous, as well as the robustness of HWU against different model assumptions (e.g., phenotype distributions). Using HWU, we conducted a genome-wide analysis of nicotine dependence from the Study of Addiction: Genetics and Environments dataset. The genome-wide analysis of nearly one million genetic markers took 7 h, identifying heterogeneous effects of two new genes (i.e., CYP3A5 and IKBKB) on nicotine dependence. Copyright © 2016 John Wiley & Sons, Ltd.
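
    The exact weighting used by HWU is not reproduced in the abstract, so the following is only a hedged sketch of a generic weighted U statistic for a single marker, in which a phenotype-similarity weight scales each pairwise genotype-similarity term; the kernels, bandwidth and data are illustrative assumptions, not the published estimator.

      # Toy weighted U statistic; not the HWU of Wei, Elston and Lu.
      import numpy as np

      def weighted_u(genotype, phenotype, bandwidth=1.0):
          """Pairwise-kernel association score for one marker (toy version)."""
          g = np.asarray(genotype, dtype=float).reshape(-1, 1)   # 0/1/2 allele counts
          p = np.asarray(phenotype, dtype=float).reshape(-1, 1)
          n = len(p)
          geno_sim = 1.0 - np.abs(g - g.T) / 2.0                 # genotype similarity kernel
          # phenotype-based weight: similar traits get larger weight, one crude way to
          # let distinct sub-etiologies contribute separately rather than cancel out
          pheno_w = np.exp(-((p - p.T) ** 2) / (2.0 * bandwidth ** 2))
          mask = ~np.eye(n, dtype=bool)                          # drop the i == j terms
          return float(np.sum(pheno_w[mask] * geno_sim[mask]) / (n * (n - 1)))

      rng = np.random.default_rng(0)
      geno = rng.integers(0, 3, size=200)
      trait = 0.3 * geno + rng.normal(size=200)
      print(weighted_u(geno, trait))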

  8. Comparison of two three-dimensional cephalometric analysis computer software

    OpenAIRE

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-01-01

    Background: Three-dimensional cephalometric analyses are gaining more attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Materials and Methods: Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (Unive...

  9. Office Computers: Ergonomic Considerations.

    Science.gov (United States)

    Ganus, Susannah

    1984-01-01

    Each new report of the office automation market indicates technology is overrunning the office. The impacts of this technology are described and some ways to manage and physically "soften" the change to a computer-based office environment are suggested. (Author/MLW)

  10. Computational methods for high-energy source shielding

    International Nuclear Information System (INIS)

    Armstrong, T.W.; Cloth, P.; Filges, D.

    1983-01-01

    The computational methods for high-energy radiation transport related to shielding of the SNQ spallation source are outlined. The basic approach is to couple radiation-transport computer codes which use Monte Carlo methods and discrete ordinates methods. A code system is suggested that incorporates state-of-the-art radiation-transport techniques. The stepwise verification of that system is briefly summarized. The complexity of the resulting code system suggests a more straightforward code specially tailored for thick shield calculations. A short guideline for the future development of such a Monte Carlo code is given

  11. Computer applications in thermochemistry

    International Nuclear Information System (INIS)

    Vana Varamban, S.

    1996-01-01

    Knowledge of equilibrium is needed in many practical situations. Simple stoichiometric calculations can be performed with hand calculators, but multi-component, multi-phase gas-solid chemical equilibrium calculations are far beyond conventional devices and methods, and iterative techniques have to be resorted to. Such problems are most elegantly handled by the use of modern computers. This report demonstrates the possible use of computers for chemical equilibrium calculations in the field of thermochemistry and chemical metallurgy. Four modules are explained: fitting the experimental Cp data and generating the thermal functions, performing equilibrium calculations for the defined conditions, preparing the elaborate input to the equilibrium calculation, and analysing the calculated results graphically. The principles of thermochemical calculations are briefly described. An extensive input guide is given. Several illustrations are included to help the understanding and usage. (author)
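
    As a minimal illustration of the first module described above (fitting Cp data and generating thermal functions), the sketch below fits a quadratic Cp(T) polynomial by least squares and integrates it for enthalpy and entropy increments; the polynomial form and the sample data are assumptions, not the report's actual algorithm.

      # Hedged sketch: quadratic Cp(T) fit and derived thermal functions. Data are invented.
      import numpy as np

      T = np.array([300., 400., 500., 600., 800., 1000.])        # K (hypothetical measurements)
      Cp = np.array([28.8, 29.2, 29.6, 30.1, 31.0, 32.0])        # J / (mol K)

      # Cp(T) ~ a + b*T + c*T**2, fitted by least squares
      c, b, a = np.polyfit(T, Cp, 2)

      def delta_H(T1, T2):
          """Enthalpy increment H(T2) - H(T1) from the fitted Cp polynomial."""
          f = lambda t: a * t + b * t**2 / 2 + c * t**3 / 3
          return f(T2) - f(T1)

      def delta_S(T1, T2):
          """Entropy increment S(T2) - S(T1) from the fitted Cp polynomial."""
          f = lambda t: a * np.log(t) + b * t + c * t**2 / 2
          return f(T2) - f(T1)

      print(delta_H(298.15, 1000.0), delta_S(298.15, 1000.0))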

  12. Post-facta Analyses of Fukushima Accident and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Tanabe, Fumiya [Sociotechnical Systems Safety Research Institute, Ichige (Japan)

    2014-08-15

    Independent analyses have been performed of the core melt behavior of the Unit 1, Unit 2 and Unit 3 reactors of Fukushima Daiichi Nuclear Power Station on 11-15 March 2011. The analyses are based on a phenomenological methodology with measured data investigation and a simple physical model calculation. The time variations of core water level, core material temperature and hydrogen generation rate are estimated. The analyses have revealed characteristics of the accident process of each reactor. In the case of the Unit 2 reactor, the calculated result suggests little hydrogen generation because of no steam generation in the core for the zirconium-steam reaction during the fuel damage process. This could be the reason for the absence of a hydrogen explosion in the Unit 2 reactor building. Analyses have also been performed on the core material behavior in another chaotic period, 19-31 March 2011, and resulted in a re-melt hypothesis: core material in each reactor should have melted again due to a shortage of cooling water. The hypothesis is consistent with many observed features of radioactive material dispersion into the environment.

  13. Mechanisms of Neurofeedback: A Computation-theoretic Approach.

    Science.gov (United States)

    Davelaar, Eddy J

    2018-05-15

    Neurofeedback training is a form of brain training in which information about a neural measure is fed back to the trainee who is instructed to increase or decrease the value of that particular measure. This paper focuses on electroencephalography (EEG) neurofeedback in which the neural measures of interest are the brain oscillations. To date, the neural mechanisms that underlie successful neurofeedback training are still unexplained. Such an understanding would benefit researchers, funding agencies, clinicians, regulatory bodies, and insurance firms. Based on recent empirical work, an emerging theory couched firmly within computational neuroscience is proposed that advocates a critical role of the striatum in modulating EEG frequencies. The theory is implemented as a computer simulation of peak alpha upregulation, but in principle any frequency band at one or more electrode sites could be addressed. The simulation successfully learns to increase its peak alpha frequency and demonstrates the influence of threshold setting - the threshold that determines whether positive or negative feedback is provided. Analyses of the model suggest that neurofeedback can be likened to a search process that uses importance sampling to estimate the posterior probability distribution over striatal representational space, with each representation being associated with a distribution of values of the target EEG band. The model provides an important proof of concept to address pertinent methodological questions about how to understand and improve EEG neurofeedback success. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.
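
    To unpack the term used above, the following is a generic self-normalised importance-sampling sketch for estimating posterior expectations by re-weighting samples drawn from a prior; it is a toy one-dimensional illustration, not the authors' striatal model.

      # Generic self-normalised importance sampling; included only to illustrate the idea.
      import numpy as np

      rng = np.random.default_rng(1)

      def importance_posterior(prior_sampler, log_likelihood, n=5000):
          """Estimate posterior expectations by re-weighting prior samples."""
          samples = prior_sampler(n)                       # draw from the proposal (the prior)
          logw = log_likelihood(samples)                   # unnormalised log weights
          w = np.exp(logw - logw.max())
          w /= w.sum()                                     # self-normalise the weights
          return samples, w

      # Toy example: prior N(0, 1) over a 1-D "representation", likelihood favouring 1.5
      samples, w = importance_posterior(
          lambda n: rng.normal(0.0, 1.0, size=n),
          lambda x: -0.5 * (x - 1.5) ** 2 / 0.25,
      )
      print("posterior mean estimate:", np.sum(w * samples))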

  14. Will the digital computer transform classical mathematics?

    Science.gov (United States)

    Rotman, Brian

    2003-08-15

    Mathematics and machines have influenced each other for millennia. The advent of the digital computer introduced a powerfully new element that promises to transform the relation between them. This paper outlines the thesis that the effect of the digital computer on mathematics, already widespread, is likely to be radical and far-reaching. To articulate this claim, an abstract model of doing mathematics is introduced based on a triad of actors of which one, the 'agent', corresponds to the function performed by the computer. The model is used to frame two sorts of transformation. The first is pragmatic and involves the alterations and progressive colonization of the content and methods of enquiry of various mathematical fields brought about by digital methods. The second is conceptual and concerns a fundamental antagonism between the infinity enshrined in classical mathematics and physics (continuity, real numbers, asymptotic definitions) and the inherently real and material limit of processes associated with digital computation. An example which lies in the intersection of classical mathematics and computer science, the P=NP problem, is analysed in the light of this latter issue.

  15. Extending Landauer's bound from bit erasure to arbitrary computation

    Science.gov (United States)

    Wolpert, David

    The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of the thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical "general purpose computer" considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the "algorithmic probability" of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can either be infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No
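
    A compact way to see why the bound in result (i) is finite and independent of Y: writing m_U(Y) for the algorithmic probability of Y and reading the statement as the complexity minus the (negative) log-probability, one has, up to units of k_B T ln 2 per bit (an assumption here, since the abstract does not state the constants),

      % Hedged restatement of result (i); prefactor and units are assumed, not quoted.
      W_{\min}(X \to Y) \;\propto\; K_U(Y) - \bigl(-\log_2 m_U(Y)\bigr),
      \qquad
      m_U(Y) \;=\; \sum_{p \,:\, U(p) = Y} 2^{-\ell(p)} .

    Levin's coding theorem gives K_U(Y) = -log2 m_U(Y) + O(1), so the difference above is bounded by a constant that depends only on the UTM U and not on the output Y, which is the Y-independent upper bound asserted in (i).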

  16. Analyses of computer programs for the probabilistic estimation of design earthquake and seismological characteristics of the Korean Peninsula

    International Nuclear Information System (INIS)

    Lee, Gi Hwa

    1997-11-01

    The purpose of the present study is to develop predictive equations from simulated motions which are adequate for the Korean Peninsula and to analyze and utilize the computer programs for the probabilistic estimation of design earthquakes. In part I of the report, computer programs for the probabilistic estimation of design earthquakes are analyzed and applied to the seismic hazard characterization of the Korean Peninsula. In part II of the report, available instrumental earthquake records are analyzed to estimate earthquake source characteristics and medium properties, which are incorporated into the simulation process. Earthquake records are then simulated using the estimated parameters. Finally, predictive equations constructed from the simulation are given in terms of magnitude and hypocentral distances

  17. Computer applications in water conservancy and hydropower engineering

    Energy Technology Data Exchange (ETDEWEB)

    Chen, J

    1984-09-20

    The use of computers in China's water conservancy and hydropower construction began in the 1960s for exploration surveys, planning, design, construction, operation, and scientific research. Despite the positive results, and the formation of a 1000-person computer computation contingent, computer development among different professions is not balanced. The weaknesses and disparities in computer applications include an overall low level of application relative to the rest of the world, which is partly due to inadequate hardware and programs. The report suggests five ways to improve applications and popularize microcomputers which emphasize leadership and planning.

  18. Preventing smoking relapse via Web-based computer-tailored feedback: a randomized controlled trial.

    Science.gov (United States)

    Elfeddali, Iman; Bolman, Catherine; Candel, Math J J M; Wiers, Reinout W; de Vries, Hein

    2012-08-20

    Web-based computer-tailored approaches have the potential to be successful in supporting smoking cessation. However, the potential effects of such approaches for relapse prevention and the value of incorporating action planning strategies to effectively prevent smoking relapse have not been fully explored. The Stay Quit for You (SQ4U) study compared two Web-based computer-tailored smoking relapse prevention programs with different types of planning strategies versus a control group. To assess the efficacy of two Web-based computer-tailored programs in preventing smoking relapse compared with a control group. The action planning (AP) program provided tailored feedback at baseline and invited respondents to do 6 preparatory and coping planning assignments (the first 3 assignments prior to quit date and the final 3 assignments after quit date). The action planning plus (AP+) program was an extended version of the AP program that also provided tailored feedback at 11 time points after the quit attempt. Respondents in the control group only filled out questionnaires. The study also assessed possible dose-response relationships between abstinence and adherence to the programs. The study was a randomized controlled trial with three conditions: the control group, the AP program, and the AP+ program. Respondents were daily smokers (N = 2031), aged 18 to 65 years, who were motivated and willing to quit smoking within 1 month. The primary outcome was self-reported continued abstinence 12 months after baseline. Logistic regression analyses were conducted using three samples: (1) all respondents as randomly assigned, (2) a modified sample that excluded respondents who did not make a quit attempt in conformance with the program protocol, and (3) a minimum dose sample that also excluded respondents who did not adhere to at least one of the intervention elements. Observed case analyses and conservative analyses were conducted. In the observed case analysis of the randomized sample

  19. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    Science.gov (United States)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is two fold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16 week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent T-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in

  20. The Impact of Secondary History Teachers' Teaching Conceptions on the Classroom Use of Computers

    Science.gov (United States)

    Arancibia Herrera, Marcelo; Badia Garganté, Antoni; Soto Caro, Carmen Paz; Sigerson, Andrew Lee

    2018-01-01

    During the past 15 years, various studies have described factors affecting the use of computers in the classroom. In analysing factors of influence, many studies have focused on technology-related variables such as computer experience or attitudes toward computers, and others have considered teachers' beliefs as well; most of them have studied…

  1. Reproduction of the Yucca Mountain Project TSPA-LA Uncertainty and Sensitivity Analyses and Preliminary Upgrade of Models

    Energy Technology Data Exchange (ETDEWEB)

    Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Waste Disposal Research and Analysis; Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Waste Disposal Research and Analysis

    2016-09-01

    Sandia National Laboratories (SNL) continued evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and knowledge capability for total system performance assessment (TSPA) type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014) and Hadgu et al. (2015). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim with Versions 9.60.300, 10.5 and 11.1.6 was installed on the cluster head node, and its distributed processing capability was mapped on the cluster processors. Other supporting software was tested and installed to support the TSPA-type analysis on the server cluster. The current tasks included verification of the TSPA-LA uncertainty and sensitivity analyses, and preliminary upgrade of the TSPA-LA from Version 9.60.300 to the latest Version 11.1. All the TSPA-LA uncertainty and sensitivity analyses modeling cases were successfully tested and verified for model reproducibility on the upgraded 2014 server cluster (CL2014). The uncertainty and sensitivity analyses used TSPA-LA modeling case output generated in FY15 based on GoldSim Version 9.60.300 documented in Hadgu et al. (2015). The model upgrade task successfully converted the Nominal Modeling case to GoldSim Version 11.1. Upgrade of the remaining modeling cases and distributed processing tasks will continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.

  2. The use of bootstrap methods for analysing health-related quality of life outcomes (particularly the SF-36)

    Directory of Open Access Journals (Sweden)

    Campbell Michael J

    2004-12-01

    Full Text Available Abstract Health-Related Quality of Life (HRQoL) measures are becoming increasingly used in clinical trials as primary outcome measures. Investigators are now asking statisticians for advice on how to analyse studies that have used HRQoL outcomes. HRQoL outcomes, like the SF-36, are usually measured on an ordinal scale. However, most investigators assume that there exists an underlying continuous latent variable that measures HRQoL, and that the actual measured outcomes (the ordered categories) reflect contiguous intervals along this continuum. The ordinal scaling of HRQoL measures means they tend to generate data that have discrete, bounded and skewed distributions. Thus, standard methods of analysis such as the t-test and linear regression that assume Normality and constant variance may not be appropriate. For this reason, conventional statistical advice would suggest that non-parametric methods be used to analyse HRQoL data. The bootstrap is one such computer intensive non-parametric method for analysing data. We used the bootstrap for hypothesis testing and the estimation of standard errors and confidence intervals for parameters, in four datasets (which illustrate the different aspects of study design). We then compared and contrasted the bootstrap with standard methods of analysing HRQoL outcomes. The standard methods included t-tests, linear regression, summary measures and General Linear Models. Overall, in the datasets we studied, using the SF-36 outcome, bootstrap methods produce results similar to conventional statistical methods. This is likely because the t-test and linear regression are robust to the violations of assumptions that HRQoL data are likely to cause (i.e. non-Normality). While particular to our datasets, these findings are likely to generalise to other HRQoL outcomes, which have discrete, bounded and skewed distributions. Future research with other HRQoL outcome measures, interventions and populations, is required to
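
    As a hedged illustration of the bootstrap procedure described above, the sketch below computes a percentile bootstrap confidence interval for the difference in mean SF-36-style scores between two arms; the data are simulated to mimic a discrete, bounded, skewed outcome and are not from the paper's datasets.

      # Percentile bootstrap CI for a difference in means; data are simulated, not the SF-36 trials.
      import numpy as np

      rng = np.random.default_rng(42)
      # SF-36 dimension scores are bounded 0-100 and discrete; these samples only mimic that
      arm_a = rng.choice(np.arange(0, 101, 5), size=120)
      arm_b = np.clip(arm_a[:100] + rng.integers(-10, 25, size=100), 0, 100)

      def bootstrap_diff_ci(x, y, n_boot=5000, alpha=0.05):
          """Percentile bootstrap confidence interval for mean(y) - mean(x)."""
          diffs = np.empty(n_boot)
          for b in range(n_boot):
              xb = rng.choice(x, size=len(x), replace=True)   # resample each arm with replacement
              yb = rng.choice(y, size=len(y), replace=True)
              diffs[b] = yb.mean() - xb.mean()
          lo, hi = np.quantile(diffs, [alpha / 2, 1 - alpha / 2])
          return y.mean() - x.mean(), (lo, hi)

      print(bootstrap_diff_ci(arm_a, arm_b))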

  3. Computational Biomechanics: Theoretical Background and Biological/Biomedical Problems

    CERN Document Server

    Tanaka, Masao; Nakamura, Masanori

    2012-01-01

    Rapid developments have taken place in biological/biomedical measurement and imaging technologies as well as in computer analysis and information technologies. The increase in data obtained with such technologies invites the reader into a virtual world that represents realistic biological tissue or organ structures in digital form and allows for simulation and what is called “in silico medicine.” This volume is the third in a textbook series and covers both the basics of continuum mechanics of biosolids and biofluids and the theoretical core of computational methods for continuum mechanics analyses. Several biomechanics problems are provided for better understanding of computational modeling and analysis. Topics include the mechanics of solid and fluid bodies, fundamental characteristics of biosolids and biofluids, computational methods in biomechanics analysis/simulation, practical problems in orthopedic biomechanics, dental biomechanics, ophthalmic biomechanics, cardiovascular biomechanics, hemodynamics...

  4. Computational compliance criteria in water hammer modelling

    Science.gov (United States)

    Urbanowicz, Kamil

    2017-10-01

    Among the many numerical methods (finite difference, finite element, finite volume, etc.) used to solve the system of partial differential equations describing unsteady pipe flow, the method of characteristics (MOC) is the most appreciated. With its help, it is possible to examine the effect of the numerical discretisation carried out over the pipe length. It was noticed, based on the tests performed in this study, that convergence of the calculation results occurred on a rectangular grid with the division of each pipe of the analysed system into at least 10 elements. Therefore, it is advisable to introduce computational compliance criteria (CCC), which will be responsible for optimal discretisation of the examined system. The results of this study, based on the assumption of various values of the Courant-Friedrichs-Lewy (CFL) number, indicate also that the CFL number should be equal to one for optimum computational results. Application of the CCC criterion to in-house and commercial computer programmes based on the method of characteristics will guarantee fast simulations and the necessary computational coherence.
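
    The two rules suggested above (at least 10 reaches per pipe and a Courant number of one) translate directly into a space step and time step for the MOC grid; the helper below is an illustrative assumption about how one might encode them, not code from the cited study.

      # Illustrative helper for the discretisation rules described above.
      def moc_grid(pipe_length_m, wave_speed_ms, reaches=10):
          """Return (dx, dt) for one pipe split into `reaches` elements with CFL = 1."""
          dx = pipe_length_m / reaches          # the cited tests converged from about 10 reaches
          dt = dx / wave_speed_ms               # Courant number a*dt/dx equal to one
          return dx, dt

      # Example: 100 m pipe, pressure-wave speed 1200 m/s
      print(moc_grid(100.0, 1200.0))            # -> (10.0, 0.008333...)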

  5. Indication for dental computed tomography. Case reports

    International Nuclear Information System (INIS)

    Schom, C.; Engelke, W.; Kopka, L.; Fischer, U.; Grabbe, E.

    1996-01-01

    Based on case reports, common indications for dental computed tomography are demonstrated and typical findings are analysed. From a group of 110 patients who had a reformatted computed tomography of the maxilla and mandibula, 10 typical cases were chosen as examples and are presented with a detailed description of the findings. The most important indication was the analysis of the morphology of the alveolar ridge needed in presurgical planning for osseointegrated implants as well as in special cases of postsurgical control. Apart from implantology, the method could be used in cases of mandibular cysts and bony destructions. In conclusion, dental computed tomography has become established mainly in implantology. It can provide valuable results in cases where a demonstration of the bone in all dimensions and free of overlappings and distortions is needed. (orig.) [de

  6. Combining computational analyses and interactive visualization for document exploration and sensemaking in jigsaw.

    Science.gov (United States)

    Görg, Carsten; Liu, Zhicheng; Kihm, Jaeyeon; Choo, Jaegul; Park, Haesun; Stasko, John

    2013-10-01

    Investigators across many disciplines and organizations must sift through large collections of text documents to understand and piece together information. Whether they are fighting crime, curing diseases, deciding what car to buy, or researching a new field, inevitably investigators will encounter text documents. Taking a visual analytics approach, we integrate multiple text analysis algorithms with a suite of interactive visualizations to provide a flexible and powerful environment that allows analysts to explore collections of documents while sensemaking. Our particular focus is on the process of integrating automated analyses with interactive visualizations in a smooth and fluid manner. We illustrate this integration through two example scenarios: an academic researcher examining InfoVis and VAST conference papers and a consumer exploring car reviews while pondering a purchase decision. Finally, we provide lessons learned toward the design and implementation of visual analytics systems for document exploration and understanding.

  7. Behavior of underclad cracks in reactor pressure vessels - evaluation of mechanical analyses with tests on cladded mock-ups

    International Nuclear Information System (INIS)

    Moinereau, D.; Rousselier, G.; Bethmont, M.

    1993-01-01

    Innocuity of underclad flaws in the reactor pressure vessels must be demonstrated in the French safety analyses, particularly in the case of a severe transient at the end of the pressure vessel lifetime, because of the radiation embrittlement of the vessel material. Safety analyses are usually performed with elastic and elasto-plastic analyses taking into account the effect of the stainless steel cladding. EDF has started a program including experiments on large size cladded specimens and their interpretations. The purpose of this program is to evaluate the different methods of fracture analysis used in safety studies. Several specimens made of ferritic steel A508 C1 3 with stainless steel cladding, containing small artificial defects, are loaded in four-point bending. Experiments are performed at very low temperature to simulate radiation embrittlement and to obtain crack instability by cleavage fracture. Three tests have been performed on mock-ups containing a small underclad crack (with depth about 5 mn) and a fourth test has been performed on one mock-up with a larger crack (depth about 13 mn). In each case, crack instability occurred by cleavage fracture in the base metal, without crack arrest, at a temperature of about - 170 deg C. Each test is interpreted using linear elastic analysis and elastic-plastic analysis by two-dimensional finite element computations. The fracture are conservatively predicted: the stress intensity factors deduced from the computations (K cp or K j ) are always greater than the base metal toughness. The comparison between the elastic analyses (including two plasticity corrections) and the elastic-plastic analyses shows that the elastic analyses are often conservative. The beneficial effect of the cladding in the analyses is also shown : the analyses are too conservative if the cladding effects is not taken into account. (authors). 9 figs., 6 tabs., 10 refs

  8. Architecture, systems research and computational sciences

    CERN Document Server

    2012-01-01

    The Winter 2012 (vol. 14 no. 1) issue of the Nexus Network Journal is dedicated to the theme “Architecture, Systems Research and Computational Sciences”. This is an outgrowth of the session by the same name which took place during the eighth international, interdisciplinary conference “Nexus 2010: Relationships between Architecture and Mathematics, held in Porto, Portugal, in June 2010. Today computer science is an integral part of even strictly historical investigations, such as those concerning the construction of vaults, where the computer is used to survey the existing building, analyse the data and draw the ideal solution. What the papers in this issue make especially evident is that information technology has had an impact at a much deeper level as well: architecture itself can now be considered as a manifestation of information and as a complex system. The issue is completed with other research papers, conference reports and book reviews.

  9. Proteomic analyses of male contributions to honey bee sperm storage and mating

    OpenAIRE

    Collins, A M; Caperna, T J; Williams, V; Garrett, W M; Evans, J D

    2006-01-01

    Honey bee (Apis mellifera L.) queens mate early in life and store sperm for years. Male bees likely contribute significantly to sperm survival. Proteins were extracted from seminal vesicles and semen of mature drones, separated by electrophoresis, and analysed by peptide mass fingerprinting. Computer searches against three databases, general species, honey bees and fruit flies, were performed. Spectra were used to query the recently generated honey bee genome protein list as well as general s...

  10. Application of energy dispersive X-ray spectrometers with semiconductor detectors in radiometric analyses

    International Nuclear Information System (INIS)

    Jugelt, P.; Schieckel, M.

    1983-01-01

    Problems and possibilities of applying semiconductor detector spectrometers in radiometric analyses are described. A summary of the state of the art and tendencies of device engineering and spectra evaluation is given. Liquid-nitrogen cooled Li-drifted Si-detectors and high-purity Ge-detectors are compared. Semiconductor detectors working at room temperature are under development. In this connection CdTe and HgI2 semiconductor detectors are compared. The use of small efficient computers in the spectrometer systems stimulates the development of algorithms for spectra analyses and for determining the concentration. Fields of application of energy dispersive X-ray spectrometers are X-ray diffraction and X-ray macroanalysis in investigating the structure of extensive surface regions

  11. Sensitivity analyses of biodiesel thermo-physical properties under diesel engine conditions

    DEFF Research Database (Denmark)

    Cheng, Xinwei; Ng, Hoon Kiat; Gan, Suyin

    2016-01-01

    This work investigates the sensitivities of spray and soot developments to the change of thermo-physical properties for coconut and soybean methyl esters, using two-dimensional computational fluid dynamics fuel spray modelling. The choice of test fuels was due to their contrasting saturation-unsaturation compositions. The sensitivity analyses for non-reacting and reacting sprays were carried out against a total of 12 thermo-physical properties, at an ambient temperature of 900 K and density of 22.8 kg/m3. For the sensitivity analyses, all the thermo-physical properties were set as the baseline case and each property was individually replaced by that of diesel. The significance of each individual thermo-physical property was determined based on the deviations found in predictions such as liquid penetration, ignition delay period and peak soot concentration when compared to those of the baseline case.

  12. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    Science.gov (United States)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
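
    As a minimal example of one reviewed technique, the sketch below fits a second-order polynomial response surface to a handful of samples of a stand-in "expensive" analysis and then queries the cheap surrogate in its place; the test function and sampling plan are hypothetical.

      # Quadratic response-surface metamodel over a stand-in expensive analysis.
      import numpy as np

      def expensive_analysis(x1, x2):
          """Stand-in for a costly simulation code."""
          return (x1 - 0.3) ** 2 + 2.0 * x2 ** 2 + 0.5 * x1 * x2

      # Design of experiments: a small full-factorial sample
      levels = np.linspace(-1.0, 1.0, 4)
      X1, X2 = np.meshgrid(levels, levels)
      x1, x2 = X1.ravel(), X2.ravel()
      y = expensive_analysis(x1, x2)

      # Quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2, fitted by least squares
      A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)

      def metamodel(p1, p2):
          return coef @ np.array([1.0, p1, p2, p1**2, p2**2, p1 * p2])

      # The cheap surrogate can now replace the expensive code in optimisation loops
      print(metamodel(0.25, -0.4), expensive_analysis(0.25, -0.4))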

  13. German contributions to the CMS computing infrastructure

    International Nuclear Information System (INIS)

    Scheurer, A

    2010-01-01

    The CMS computing model anticipates various hierarchically linked tier centres to counter the challenges provided by the enormous amounts of data which will be collected by the CMS detector at the Large Hadron Collider, LHC, at CERN. During the past years, various computing exercises were performed to test the readiness of the computing infrastructure, the Grid middleware and the experiment's software for the startup of the LHC which took place in September 2008. In Germany, several tier sites are set up to allow for an efficient and reliable way to simulate possible physics processes as well as to reprocess, analyse and interpret the numerous stored collision events of the experiment. It will be shown that the German computing sites played an important role during the experiment's preparation phase and during data-taking of CMS and, therefore, scientific groups in Germany will be ready to compete for discoveries in this new era of particle physics. This presentation focuses on the German Tier-1 centre GridKa, located at Forschungszentrum Karlsruhe, the German CMS Tier-2 federation DESY/RWTH with installations at the University of Aachen and the research centre DESY. In addition, various local computing resources in Aachen, Hamburg and Karlsruhe are briefly introduced as well. It will be shown that an excellent cooperation between the different German institutions and physicists led to well established computing sites which cover all parts of the CMS computing model. Therefore, the following topics are discussed and the achieved goals and the gained knowledge are depicted: data management and distribution among the different tier sites, Grid-based Monte Carlo production at the Tier-2 as well as Grid-based and locally submitted inhomogeneous user analyses at the Tier-3s. Another important task is to ensure a proper and reliable operation 24 hours a day, especially during the time of data-taking. For this purpose, the meta-monitoring tool 'HappyFace', which was

  14. Computed tomography angiography and perfusion to assess coronary artery stenosis causing perfusion defects by single photon emission computed tomography

    DEFF Research Database (Denmark)

    Rochitte, Carlos E; George, Richard T; Chen, Marcus Y

    2014-01-01

    AIMS: To evaluate the diagnostic power of integrating the results of computed tomography angiography (CTA) and CT myocardial perfusion (CTP) to identify coronary artery disease (CAD) defined as a flow limiting coronary artery stenosis causing a perfusion defect by single photon emission computed tomography (SPECT). METHODS AND RESULTS: We conducted a multicentre study to evaluate the accuracy of integrated CTA-CTP for the identification of patients with flow-limiting CAD defined by ≥50% stenosis by invasive coronary angiography (ICA) with a corresponding perfusion deficit on stress single photon emission computed tomography (SPECT/MPI). Sixteen centres enrolled 381 patients who underwent combined CTA-CTP and SPECT/MPI prior to conventional coronary angiography. All four image modalities were analysed in blinded independent core laboratories. The prevalence of obstructive CAD defined by combined ICA...

  15. Computational Modeling of Oxygen Transport in the Microcirculation: From an Experiment-Based Model to Theoretical Analyses

    OpenAIRE

    Lücker, Adrien

    2017-01-01

    Oxygen supply to cells by the cardiovascular system involves multiple physical and chemical processes that aim to satisfy fluctuating metabolic demand. Regulation mechanisms range from increased heart rate to minute adaptations in the microvasculature. The challenges and limitations of experimental studies in vivo make computational models an invaluable complement. In this thesis, oxygen transport from capillaries to tissue is investigated using a new numerical model that is tailored for vali...

  16. Computing for Heavy Ion Physics

    International Nuclear Information System (INIS)

    Martinez, G.; Schiff, D.; Hristov, P.; Menaud, J.M.; Hrivnacova, I.; Poizat, P.; Chabratova, G.; Albin-Amiot, H.; Carminati, F.; Peters, A.; Schutz, Y.; Safarik, K.; Ollitrault, J.Y.; Hrivnacova, I.; Morsch, A.; Gheata, A.; Morsch, A.; Vande Vyvre, P.; Lauret, J.; Nief, J.Y.; Pereira, H.; Kaczmarek, O.; Conesa Del Valle, Z.; Guernane, R.; Stocco, D.; Gruwe, M.; Betev, L.; Baldisseri, A.; Vilakazi, Z.; Rapp, B.; Masoni, A.; Stoicea, G.; Brun, R.

    2005-01-01

    This workshop was devoted to the computational technologies needed for the heavy quarkonia and open flavor production study at LHC (large hadron collider) experiments. These requirements are huge: peta-bytes of data will be generated each year. Analysing this will require the equivalent of a few thousand of today's fastest PC processors. The new developments in terms of dedicated software have been addressed. This document gathers the transparencies that were presented at the workshop

  17. Computing for Heavy Ion Physics

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, G; Schiff, D; Hristov, P; Menaud, J M; Hrivnacova, I; Poizat, P; Chabratova, G; Albin-Amiot, H; Carminati, F; Peters, A; Schutz, Y; Safarik, K; Ollitrault, J Y; Hrivnacova, I; Morsch, A; Gheata, A; Morsch, A; Vande Vyvre, P; Lauret, J; Nief, J Y; Pereira, H; Kaczmarek, O; Conesa Del Valle, Z; Guernane, R; Stocco, D; Gruwe, M; Betev, L; Baldisseri, A; Vilakazi, Z; Rapp, B; Masoni, A; Stoicea, G; Brun, R

    2005-07-01

    This workshop was devoted to the computational technologies needed for the heavy quarkonia and open flavor production study at LHC (large hadron collider) experiments. These requirements are huge: peta-bytes of data will be generated each year. Analysing this will require the equivalent of a few thousand of today's fastest PC processors. The new developments in terms of dedicated software have been addressed. This document gathers the transparencies that were presented at the workshop.

  18. Distributed management of scientific projects - An analysis of two computer-conferencing experiments at NASA

    Science.gov (United States)

    Vallee, J.; Gibbs, B.

    1976-01-01

    Between August 1975 and March 1976, two NASA projects with geographically separated participants used a computer-conferencing system developed by the Institute for the Future for portions of their work. Monthly usage statistics for the system were collected in order to examine the group and individual participation figures for all conferences. The conference transcripts were analysed to derive observations about the use of the medium. In addition to the results of these analyses, the attitudes of users and the major components of the costs of computer conferencing are discussed.

  19. Parallel computing in genomic research: advances and applications

    Directory of Open Access Journals (Sweden)

    Ocaña K

    2015-11-01

    Full Text Available Abstract: Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied for reducing the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article brings a systematic review of literature that surveys the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. Keywords: high-performance computing, genomic research, cloud computing, grid computing, cluster computing, parallel computing

  20. Project W-320 SAR and process control thermal analyses

    International Nuclear Information System (INIS)

    Sathyanarayana, K.

    1997-01-01

    This report summarizes the results of thermal hydraulic computer modeling supporting Project W-320 for process control and SAR documentation. Parametric analyses were performed for the maximum steady state waste temperature. The parameters included heat load distribution, tank heat load, fluffing factor and thermal conductivity. Uncertainties in the fluffing factor and heat load distribution had the largest effect on maximum waste temperature. Safety analyses were performed for off-normal events including loss of ventilation, loss of evaporation and loss of the secondary chiller. The loss of both the primary and secondary ventilation was found to be the most limiting event, with the bottom waste reaching saturation temperature in just over 30 days. An evaluation was performed for the potential lowering of the supernatant level in tank 241-AY-102. The evaluation included a loss of ventilation and steam bump analysis. The reduced supernatant level decreased the time to reach saturation temperature in the waste for the loss of ventilation by about one week. However, the consequences of a steam bump were dramatically reduced

  1. Computational analyses of spectral trees from electrospray multi-stage mass spectrometry to aid metabolite identification.

    Science.gov (United States)

    Cao, Mingshu; Fraser, Karl; Rasmussen, Susanne

    2013-10-31

    Mass spectrometry coupled with chromatography has become the major technical platform in metabolomics. Aided by peak detection algorithms, the detected signals are characterized by mass-to-charge ratio (m/z) and retention time. Chemical identities often remain elusive for the majority of the signals. Multi-stage mass spectrometry based on electrospray ionization (ESI) allows collision-induced dissociation (CID) fragmentation of selected precursor ions. These fragment ions can assist in structural inference for metabolites of low molecular weight. Computational investigations of fragmentation spectra have increasingly received attention in metabolomics and various public databases house such data. We have developed an R package "iontree" that can capture, store and analyze MS2 and MS3 mass spectral data from high throughput metabolomics experiments. The package includes functions for ion tree construction, an algorithm (distMS2) for MS2 spectral comparison, and tools for building platform-independent ion tree (MS2/MS3) libraries. We have demonstrated the utilization of the package for the systematic analysis and annotation of fragmentation spectra collected in various metabolomics platforms, including direct infusion mass spectrometry, and liquid chromatography coupled with either low resolution or high resolution mass spectrometry. Assisted by the developed computational tools, we have demonstrated that spectral trees can provide informative evidence complementary to retention time and accurate mass to aid with annotating unknown peaks. These experimental spectral trees, once subjected to a quality control process, can be used for querying public MS2 databases or de novo interpretation. The putatively annotated spectral trees can be readily incorporated into reference libraries for routine identification of metabolites.
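
    The iontree package itself is written in R and its distMS2 algorithm is not reproduced in the abstract; purely as an illustration of the general idea of comparing two MS2 spectra, the sketch below bins m/z values and takes a cosine similarity. The bin width, mass range and peak lists are assumptions.

      # Generic MS2 spectral similarity by binning and cosine; not the distMS2 algorithm.
      import numpy as np

      def binned_cosine(spec_a, spec_b, bin_width=1.0, mz_max=2000.0):
          """spec_* : list of (m/z, intensity) pairs for one MS2 spectrum."""
          n_bins = int(mz_max / bin_width)
          def to_vector(spec):
              v = np.zeros(n_bins)
              for mz, inten in spec:
                  v[min(int(mz / bin_width), n_bins - 1)] += inten   # accumulate into m/z bins
              norm = np.linalg.norm(v)
              return v / norm if norm > 0 else v
          return float(to_vector(spec_a) @ to_vector(spec_b))

      # Hypothetical peak lists for two fragmentation spectra
      a = [(144.1, 30.0), (158.2, 100.0), (188.3, 55.0)]
      b = [(144.2, 25.0), (158.1, 90.0), (212.0, 10.0)]
      print(binned_cosine(a, b))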

  2. Computer simulation of spacecraft/environment interaction

    International Nuclear Information System (INIS)

    Krupnikov, K.K.; Makletsov, A.A.; Mileev, V.N.; Novikov, L.S.; Sinolits, V.V.

    1999-01-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by solar-cell panel on spacecraft charging was investigated. A simple method was developed for an estimation of spacecraft potentials in LEO. Effects of various particle flux impact and spacecraft orientation are discussed. A computer engineering model for a calculation of space radiation is presented. This model is used as a client/server model with WWW interface, including spacecraft model description and results representation based on the virtual reality markup language

  3. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by solar-cell panel on spacecraft charging was investigated. A simple method was developed for an estimation of spacecraft potentials in LEO. Effects of various particle flux impact and spacecraft orientation are discussed. A computer engineering model for a calculation of space radiation is presented. This model is used as a client/server model with WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.

  4. The effect of posthypnotic suggestion, hypnotic suggestibility, and goal intentions on adherence to medical instructions.

    Science.gov (United States)

    Carvalho, Claudia; Mazzoni, Giuliana; Kirsch, Irving; Meo, Maria; Santandrea, Maura

    2008-04-01

    The effects of implementation intentions and posthypnotic suggestion were investigated in 2 studies. In Experiment 1, participants with high levels of hypnotic suggestibility were instructed to take placebo pills as part of an investigation of how to best enhance compliance with medical instruction. In Experiment 2, participants with high, medium, and low levels of hypnotic suggestibility were asked to run in place, take their pulse rate before, and send an e-mail report to the experimenter each day. Experiment 1 revealed enhanced adherence as a function of both implementation intentions and posthypnotic suggestion. Experiment 2 failed to find any significant main effects but found a significant interaction between suggestibility and the effects of posthypnotic suggestion. Posthypnotic suggestion enhanced adherence among high suggestible participants but lowered it among low suggestibles.

  5. Computational Models of Rock Failure

    Science.gov (United States)

    May, Dave A.; Spiegelman, Marc

    2017-04-01

    Practitioners in computational geodynamics, as per many other branches of applied science, typically do not analyse the underlying PDE's being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or simply have not been proven. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDE's, we first attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers / risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of

  6. Computer use changes generalization of movement learning.

    Science.gov (United States)

    Wei, Kunlin; Yan, Xiang; Kong, Gaiqing; Yin, Cong; Zhang, Fan; Wang, Qining; Kording, Konrad Paul

    2014-01-06

    Over the past few decades, one of the most salient lifestyle changes for us has been the use of computers. For many of us, manual interaction with a computer occupies a large portion of our working time. Through neural plasticity, this extensive movement training should change our representation of movements (e.g., [1-3]), just like search engines affect memory [4]. However, how computer use affects motor learning is largely understudied. Additionally, as virtually all participants in studies of perception and actions are computer users, a legitimate question is whether insights from these studies bear the signature of computer-use experience. We compared non-computer users with age- and education-matched computer users in standard motor learning experiments. We found that people learned equally fast but that non-computer users generalized significantly less across space, a difference negated by two weeks of intensive computer training. Our findings suggest that computer-use experience shaped our basic sensorimotor behaviors, and this influence should be considered whenever computer users are recruited as study participants. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Development and verification of an efficient spatial neutron kinetics method for reactivity-initiated event analyses

    International Nuclear Information System (INIS)

    Ikeda, Hideaki; Takeda, Toshikazu

    2001-01-01

    A space/time nodal diffusion code based on the nodal expansion method (NEM), EPISODE, was developed in order to evaluate transient neutron behavior in light water reactor cores. The present code employs the improved quasistatic (IQS) method for spatial neutron kinetics, and the neutron flux distribution is numerically obtained by solving the neutron diffusion equation with the nonlinear iteration scheme to achieve fast computation. A predictor-corrector (PC) method developed in the present study enabled a coarser time mesh to be applied to the transient spatial neutron calculation than is applicable in the conventional IQS model, which further improved computational efficiency. Its computational advantage was demonstrated by applying it to numerical benchmark problems that simulate reactivity-initiated events, showing a reduction of computational time by up to a factor of three compared with the conventional IQS. A thermohydraulics model was also incorporated in EPISODE, and the capability of realistic reactivity event analyses was verified using the SPERT-III/E-Core experimental data. (author)

  8. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    Science.gov (United States)

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work is reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels and systems that employ external heat exchangers (double pipe, shell and tube and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system/operating parameters such as mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate to steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on the system performance has also been analysed. The multiparameter computer aided design (CAD) software packages prepared are thus highly versatile in nature and they permit the optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, thereby ascertaining the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of associated heating/cooling equipment has been performed utilising the most updated design correlations and computer software.
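
    The del factor referred to above is conventionally the log reduction ln(N0/N) obtained by integrating first-order thermal death kinetics over the temperature-time history; the sketch below evaluates it with an Arrhenius rate constant, where the kinetic parameters and the temperature profile are illustrative values only, not taken from the paper.

      # del factor as the time integral of an Arrhenius death-rate constant; parameters are illustrative.
      import numpy as np

      def del_factor(times_s, temps_K, A=1.0e36, Ea=2.83e5, R=8.314):
          """del = ln(N0/N) = integral of k(T(t)) dt, with k = A*exp(-Ea/(R*T))."""
          k = A * np.exp(-Ea / (R * np.asarray(temps_K, dtype=float)))
          t = np.asarray(times_s, dtype=float)
          return float(np.sum(0.5 * (k[1:] + k[:-1]) * np.diff(t)))   # trapezoidal rule

      # Example: linear heat-up over 15 min, then hold at 394 K (about 121 deg C) for 15 min
      t = np.linspace(0.0, 1800.0, 200)
      T = np.where(t < 900.0, 303.0 + (394.0 - 303.0) * t / 900.0, 394.0)
      print(del_factor(t, T))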

  9. A two-channel wave analyser for sounding rockets and satellites

    International Nuclear Information System (INIS)

    Brondz, E.

    1989-04-01

    Studies of low frequency electromagnetic waves, produced originally by lightning discharges penetrating the ionosphere, provide an important source of valuable information about the plasma surrounding the Earth. The use of rockets and satellites, supported by ground-based observations, offers a unique opportunity to measure a number of parameters in situ and simultaneously, so that data from the various measurements can be correlated. However, every rocket experiment has to be designed bearing in mind telemetry limitations and/or short flight duration. Typical flight duration for Norwegian rockets launched from Andoeya Rocket Range is 500 to 600 s. Therefore, the most desirable way to use a rocket or satellite is to carry out data analyses on board in real time. Recent achievements in Digital Signal Processing (DSP) technology have made it possible to undertake very complex on-board data manipulation. As part of the rocket instrumentation, a DSP-based unit able to carry out on-board analyses of low frequency electromagnetic waves in the ionosphere has been designed. The unit can be seen as a general-purpose computer built on the basis of a fixed-point 16-bit signal processor. The unit is supplied with program code to perform wave analyses on two independent channels simultaneously. The analyser is able to perform 256-point complex fast Fourier transforms, and it produces a spectral power density estimate on both channels every 85 ms. The design and construction of the DSP-based unit are described and results from the tests are presented.
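
    To make the on-board processing concrete, the sketch below computes windowed 256-point FFT power spectra for two channels in floating point with NumPy; the flight unit described above performs the equivalent computation in 16-bit fixed-point arithmetic, and the function and test-signal names here are purely illustrative.

        import numpy as np

        def two_channel_psd(ch1, ch2, fs, nfft=256):
            """Periodogram-type spectral power density estimates for two channels,
            analogous to the 256-point FFT scheme described above.  Returns
            one-sided spectra (without the factor-of-two correction for the
            discarded negative frequencies)."""
            win = np.hanning(nfft)
            scale = fs * np.sum(win ** 2)
            spectra = []
            for ch in (ch1, ch2):
                seg = np.asarray(ch[:nfft]) * win
                x = np.fft.fft(seg, nfft)
                spectra.append(np.abs(x[:nfft // 2]) ** 2 / scale)
            freqs = np.fft.fftfreq(nfft, d=1.0 / fs)[:nfft // 2]
            return freqs, spectra[0], spectra[1]

        # hypothetical test: 3 kHz sampling, one VLF tone per channel
        fs = 3000.0
        t = np.arange(256) / fs
        f, p1, p2 = two_channel_psd(np.sin(2 * np.pi * 500 * t),
                                    np.sin(2 * np.pi * 900 * t), fs)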

  10. Analyses, algorithms, and computations for models of high-temperature superconductivity. Final technical report

    International Nuclear Information System (INIS)

    Gunzburger, M.D.; Peterson, J.S.

    1998-01-01

    Under the sponsorship of the Department of Energy, the authors have achieved significant progress in the modeling, analysis, and computation of superconducting phenomena. Their work has focused on mesoscale models as typified by the celebrated Ginzburg-Landau equations; these models are intermediate between the microscopic models (which can be used to understand the basic structure of superconductors and the atomic and sub-atomic behavior of these materials) and the macroscale, or homogenized, models (which can be of use for the design of devices). The models the authors have considered include a time-dependent Ginzburg-Landau model, a variable-thickness thin film model, models for high values of the Ginzburg-Landau parameter, models that account for normal inclusions and fluctuations and Josephson effects, and the anisotropic Ginzburg-Landau and Lawrence-Doniach models for layered superconductors, including those with high critical temperatures. In each case, they have developed or refined the models, derived rigorous mathematical results that enhance the state of understanding of the models and their solutions, and developed, analyzed, and implemented finite element algorithms for the approximate solution of the model equations.
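
    For reference, the Ginzburg-Landau description referred to above is built on a free-energy functional of a complex order parameter \psi and the magnetic vector potential \mathbf{A}; in one common nondimensionalized form (conventions differ between the variants listed) it reads

        F[\psi,\mathbf{A}] = \int_\Omega \left( -|\psi|^2 + \tfrac{1}{2}|\psi|^4
            + \left| \left( \tfrac{i}{\kappa}\nabla + \mathbf{A} \right) \psi \right|^2
            + \left| \nabla \times \mathbf{A} - \mathbf{H} \right|^2 \right) d\Omega,

    where \kappa is the Ginzburg-Landau parameter and \mathbf{H} the applied field. The time-dependent model mentioned in the abstract is then obtained, schematically, as a gauge-invariant gradient flow of F, and the finite element algorithms discretize \psi and \mathbf{A} over the domain \Omega.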

  11. Regulatory considerations for computational requirements for nuclear criticality safety

    International Nuclear Information System (INIS)

    Bidinger, G.H.

    1995-01-01

    As part of its safety mission, the U.S. Nuclear Regulatory Commission (NRC) approves the use of computational methods as part of the demonstration of nuclear criticality safety. While each NRC office has different criteria for accepting computational methods for nuclear criticality safety results, the Office of Nuclear Materials Safety and Safeguards (NMSS) approves the use of specific computational methods and methodologies for nuclear criticality safety analyses by specific companies (licensees or consultants). By contrast, the Office of Nuclear Reactor Regulation approves codes for general use. Historically, computational methods progressed from empirical methods to one-dimensional diffusion and discrete ordinates transport calculations and then to three-dimensional Monte Carlo transport calculations. With the advent of faster computational ability, three-dimensional diffusion and discrete ordinates transport calculations are gaining favor. With the proper user controls, NMSS has accepted any and all of these methods for demonstrations of nuclear criticality safety

  12. Primary graft dysfunction; possible evaluation by high resolution computed tomography, and suggestions for a scoring system

    DEFF Research Database (Denmark)

    Belmaati, Esther; Jensen, Claus; Kofoed, Klaus F

    2009-01-01

    ...inclusion/exclusion criteria of patients, pilot testing, and training of investigators through review of disagreements were suggested as possibilities for decreasing inter/intra-observer variability. Factors affecting image attenuation (Hounsfield numbers), and thus the reproducibility of CT densitometric measurements, were... ...of parenchymal change in the lung. HRCT is considered relevant and superior in evaluating disease severity and disease progression, and in evaluating the effects of therapy regimes in the lung. It is, however, not clear to what extent these scoring methods may be implemented for grading PGD. Further efforts could...

  13. Suggestibility under Pressure: Theory of Mind, Executive Function, and Suggestibility in Preschoolers

    Science.gov (United States)

    Karpinski, Aryn C.; Scullin, Matthew H.

    2009-01-01

    Eighty preschoolers, ages 3 to 5 years old, completed a 4-phase study in which they experienced a live event and received a pressured, suggestive interview about the event a week later. Children were also administered batteries of theory of mind and executive function tasks, as well as the Video Suggestibility Scale for Children (VSSC), which…

  14. CMS on the GRID: Toward a fully distributed computing architecture

    International Nuclear Information System (INIS)

    Innocente, Vincenzo

    2003-01-01

    The computing systems required to collect, analyse and store the physics data at LHC would need to be distributed and global in scope. CMS is actively involved in several grid-related projects to develop and deploy a fully distributed computing architecture. We present here recent developments of tools for automating job submission and for serving data to remote analysis stations. Plans for further test and deployment of a production grid are also described

  15. Computer-supported quality control in X-ray diagnosis

    International Nuclear Information System (INIS)

    Maier, W.; Klotz, E.

    1989-01-01

    Quality control of X-ray facilities in radiological departments of large hospitals is possible only if the instrumentation used for measurements is interfaced to a computer. The central computer helps to organize the measurements as well as analyse and record the results. It can also be connected to a densitometer and camera for evaluating radiographs of test devices. Other quality control tests are supported by a mobile station with equipment for non-invasive dosimetry measurements. Experience with a computer-supported system in quality control of film and film processing is described and the evaluation methods of ANSI and the German industrial standard DIN are compared. The disadvantage of these methods is the exclusion of film quality parameters, which can make processing control almost worthless. (author)

  16. Computing on the grid and in the cloud

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    "The results today are only possible because of the extraordinary performance of the accelerators, including the infrastructure, the experiments, and the Grid computing." These were the words of the CERN Director General Rolf Heuer when the observation of a new particle consistent with a Higgs Boson was revealed to the world on the 4th July 2012. The end result of the all investments made to build and operate the LHC is the data that are recorded and the knowledge that can be extracted. It is the role of the global computing infrastructure to unlock the value that is encapsulated in the data. This lecture provides a detailed overview of the Worldwide LHC Computing Grid, an international collaboration to distribute and analyse the LHC data.

  17. Computing on the grid and in the cloud

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    "The results today are only possible because of the extraordinary performance of the accelerators, including the infrastructure, the experiments, and the Grid computing." These were the words of the CERN Director General Rolf Heuer when the observation of a new particle consistent with a Higgs Boson was revealed to the world on the 4th July 2012. The end result of the all investments made to build and operate the LHC is the data that are recorded and the knowledge that can be extracted. It is the role of the global computing infrastructure to unlock the value that is encapsulated in the data. This lecture provides a detailed overview of the Worldwide LHC Computing Grid, an international collaboration to distribute and analyse the LHC data.

  18. The troposphere-to-stratosphere transition in kinetic energy spectra and nonlinear spectral fluxes as seen in ECMWF analyses

    Science.gov (United States)

    Burgess, A. B. H.; Erler, A. R.; Shepherd, T. G.

    2012-04-01

    We present spectra, nonlinear interaction terms, and fluxes computed for horizontal wind fields from high-resolution meteorological analyses made available by ECMWF for the International Polar Year. Total kinetic energy spectra clearly show two spectral regimes: a steep spectrum at large scales and a shallow spectrum in the mesoscale. The spectral shallowing appears at ~200 hPa, and is due to decreasing rotational power with height, which results in the shallower divergent spectrum dominating in the mesoscale. The spectra we find are steeper than those observed in aircraft data and GCM simulations. Though the analyses resolve total spherical harmonic wavenumbers up to n = 721, effects of dissipation on the fluxes and spectra are visible starting at about n = 200. We find a weak forward energy cascade and a downscale enstrophy cascade in the mesoscale. Eddy-eddy nonlinear kinetic energy transfers reach maximum amplitudes at the tropopause, and decrease with height thereafter; zonal mean-eddy transfers dominate in the stratosphere. In addition, zonal anisotropy reaches a minimum at the tropopause. Combined with strong eddy-eddy interactions, this suggests flow in the tropopause region is very active and bears the greatest resemblance to isotropic turbulence. We find constant enstrophy flux over a broad range of wavenumbers around the tropopause and in the upper stratosphere. A relatively constant spectral enstrophy flux at the tropopause suggests a turbulent inertial range, and that the enstrophy flux is resolved. A main result of our work is its implications for explaining the shallow mesoscale spectrum observed in aircraft wind measurements, GCM studies, and now meteorological analyses. The strong divergent component in the shallow mesoscale spectrum indicates unbalanced flow, and nonlinear transfers decreasing quickly with height are characteristic of waves, not turbulence. Together with the downscale flux of energy through the shallow spectral range, these
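
    For readers unfamiliar with the diagnostics, the rotational/divergent decomposition used above follows the standard practice of building the kinetic energy spectrum from the spherical-harmonic coefficients of vorticity \zeta and divergence \delta; in one common convention,

        E(n) = \frac{a^2}{4\,n(n+1)} \sum_{m=-n}^{n} \left( |\zeta_n^m|^2 + |\delta_n^m|^2 \right),

    where a is the Earth's radius and n the total wavenumber. The first term gives the rotational and the second the divergent contribution, whose relative magnitudes with height control the spectral shallowing discussed in the abstract.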

  19. Computed tomographic analyses of water distribution in three porous foam media

    International Nuclear Information System (INIS)

    Brown, J.M.; Fonteno, W.C.; Cassel, D.K.; Johnson, G.A.

    1987-01-01

    The purpose of this paper is to review some of the details of CAT scanning that are of importance to the application of CAT scanning porous media and to evaluate the use of the CAT scanner to measure the spatial distribution of water in three different porous media. The scanner's response to changes in the spatial distribution of water in three different porous phenolic foam materials after draining for 16 h was investigated. Water content distributions were successfully detected with good resolution on the x-ray image. Comparisons of CAT vs. gravimetrically determined water content indicated a significant linear relationship between the methods. Results from these experiments indicate that the CAT scanner can nondestructively measure volume wetness in the phenolic foam media. The clarity of the CAT images suggests that CAT scanning has great potential for studies where small and rapid changes in water content within small volumes of media are desired
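
    As a pointer to how the CAT-versus-gravimetric comparison is typically quantified, the sketch below fits a least-squares calibration line relating mean CT numbers to gravimetrically determined volumetric water contents; the numbers are hypothetical and are not the authors' data.

        import numpy as np

        # hypothetical calibration: mean CT numbers (Hounsfield units) of foam
        # samples versus gravimetric volumetric water content (cm^3 cm^-3)
        hu = np.array([-650.0, -520.0, -390.0, -260.0, -130.0, 0.0])
        theta_grav = np.array([0.05, 0.18, 0.31, 0.44, 0.57, 0.70])

        slope, intercept = np.polyfit(hu, theta_grav, 1)   # least-squares line
        r = np.corrcoef(hu, theta_grav)[0, 1]
        theta_ct = slope * hu + intercept                  # CT-predicted wetness
        print(f"theta = {slope:.4e}*HU + {intercept:.3f},  r^2 = {r**2:.3f}")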

  20. Analyses of the eustachian tube and its surrounding tissues with cross sectional images by high-resolution computed tomography (HR-CT)

    International Nuclear Information System (INIS)

    Yoshida, Haruo; Kobayashi, Toshimitsu; Takasaki, Kenji; Kanda, Yukihiko; Nakao, Yoshiaki; Morikawa, Minoru; Ishimaru, Hideki; Hayashi, Kuniaki

    2000-01-01

    We attempted to image the eustachian tube (ET) and its surrounding tissues by high-resolution computed tomography (HR-CT). Twenty-two normal subjects (44 ears) without middle ear problems were studied, and a patient with severe patulous ET was also studied as an abnormal case. Using a multiplanar reconstruction technique, we were able to obtain clear reconstructed images of the ET lumen as well as of its surrounding tissues (bone, ET cartilage, tensor veli palatini muscle, levator veli palatini muscle, Ostmann's fat tissue, tensor tympani muscle, internal carotid artery) at any desired portion, either parallel or perpendicular to the long axis of the ET. However, the exact borders between the ET cartilage and the muscles, Ostmann's fat tissue and the tubal gland were not clearly identified. In the severe case of patulous ET, the ET lumen was widely open in every cross-sectional image from the pharyngeal orifice to the tympanic orifice, in contrast to its being closed at the cartilaginous portion in the normal cases. In addition, the fat tissue and glands around the ET lumen were not clearly identified in this case. We suggest that this method will lead to a better understanding of ET-related diseases such as patulous ET. (author)